How Training & Professional Development will save us from AI

Artificial Intelligence is not the end of work

I'm using an intentionally inflammatory title because I feel the Luddite hype around artificial intelligence is driven by fear, ignorance, and a lack of exposure to sci-fi as a body of literature. I am a proud nerd; I am a voracious sci-fi reader (listener if I'm honest, I love audiobooks); I even play Dungeons & Dragons once a week with a gaming group. I used to take flak for this as a teenager, but as it turns out, the nerds inherited the Earth, if the prominence of the tech sector is any indication. As the sci-fi genre amply demonstrates, the nerd-herd has been in love with AI for a long time, and it's coming whether or not people are emotionally ready to face it. However, we are a long way away from seeing Data from Star Trek sitting down at a team meeting; that kind of AI is still centuries off.

Fear is the mind killer

Despite being a long way away from Skynet and the other AI bogeymen of sci-fi, there are some adults --who are otherwise reasonable and intelligent individuals-- who are losing their minds thinking that we are mere decades away from being used as a power source by the robots that will take over the Earth. Having grown up with empty fear-mongering about Latino immigrants taking jobs away from Americans, the fear-mongering that robots are going to do the same thing sounds equally hollow to me. The robots are not coming to take your job, so please take a deep breath and relax. Breathe in relaxation and breathe out the negativity. Take a moment to visit your happy place.

It's essential not to allow our fears to govern our thinking in this space because people are prone to exaggerating those fears when doing so confirms a personal bias. To adapt to the changes we know are coming, we have to be grounded in facts, not fiction. Sci-fi author Frank Herbert made the following observation in one of his most celebrated works.

Fear is the mind-killer.
— Dune, 1965

Now that we’ve taken that deep breath and made that trip to the happy place, let's talk about what AI is much more likely to do to our world of work.

What AI will probably look like

I want you to imagine one of the most prominent figures in the Marvel movie universe: Iron Man. Iron Man isn't just Tony Stark in a mechanical suit equipped with weapons and heavy armor; that's a gross oversimplification. The suit is just as much its own entity as Tony Stark is. Jarvis, the AI inside the Iron Man suit, is just as important as any other piece of onboard technology in the armor. Jarvis's job is to support Tony as he fights the bad guys, supplementing Tony's abilities when Jarvis would be faster or when Tony is recovering from a particularly hard blow. The suit does not supersede Tony, control him, or otherwise displace him. Jarvis responds to Tony's commands, makes real-time adjustments, and anticipates his needs because Jarvis's computational speed is much higher than that of the human mind. Now, this is still very much a sci-fi depiction of AI, but I feel it is more accurate and plausible than The Matrix or The Terminator.

If you need another example, think about the computer on Star Trek: Voyager. The computer didn't run the ship; it helped Captain Janeway and her crew run the ship more effectively. Note that even the AI behind the Emergency Medical Hologram on Voyager was intended as a stop-gap measure, and the crew was continually dealing with the limitations of The Doctor's technology.

It's worth pointing out that the resources required to create artificial people, even virtual ones, are cost-prohibitive relative to the cost of educating and training people who already exist. This is likely to remain the case for generations to come, since psychologists can't even model consciousness mathematically, which means we can't develop computer technology capable of running the math in the first place. We would have to engineer a whole new generation of computer technology to replicate what the human brain does naturally, but we don't know what to engineer the computers to do to demonstrate self-awareness, initiative, creativity, and forethought. We are in "chicken-and-egg" territory for the foreseeable future when it comes to developing the kind of AI we see as a competitor for human beings.

We all need training on how to leverage AI when it starts to creep into our work

The lesson we need to take away from these sci-fi tales is not so much that humans will become obsolete, but that we all need to level up to leverage these new tools in new and interesting ways. One such skill will be the ability to identify work worth automating in the first place. I've noticed that a lot of folks confuse activity with productivity and outputs with accomplishments. Hand a group of business professionals an AI that can automate some of the more repetitive administrative tasks in their business, and some will automate work that just eats up processor time, whereas others will automate work that increases the company's profit margin. I anticipate that one of the first places we will see this happen is in the automation of parts of the customer experience, especially low-level technical support. Indeed, we are already starting to see AI automate the retrieval of critical how-to product knowledge for both self-service and human-based customer service.

The days when you could collect a paycheck for doing a set of repetitive tasks will soon be a thing of the past. Instead, you will need to take that next step and start solving the problems related to the customer experience that touch your position, regardless of how far removed from the customer you think you are. For example, if you are an analyst whose job was to collate data, program the Excel functions, and produce the reports management needs, you're going to need to level up. AI will do the first part of your job, but not the part where you can add value. Providing actionable insights to the management team is not something the computer will do, or do well, unless there is a pre-scripted playbook built into the reporting feature.

While grasping the opportunity to grow might seem like a no-brainer, I've encountered a lot of professionals whose reaction to this kind of change could best be described as pathological. "That's not my job," is one phrase I've heard a lot of employees say when presented with changes of this magnitude. Our hypothetical analyst has an opportunity to gain more influence, prestige, and job security by adding more value to his position. Instead, he's digging in his heels, refusing to change, projecting skepticism that's uninformed and grounded in fear, and otherwise behaving in a self-destructive manner that leaves us scratching our heads. Why would this seasoned professional choose this particular hill to die on?

I always have to bite my tongue when I hear people say, "That's not my job," because my knee-jerk response would be, "Well, job descriptions can be changed even if the mind of the person sitting in that job can't," but such statements are not helpful. Threatening people has never earned true loyalty, and there is no way our analyst would see such a statement as anything other than a threat to his livelihood. Instead, there's a better way to introduce this change: one that encourages compliance but doesn't make people feel threatened.

Update job descriptions to include tool usage standards

Updating job descriptions is one of the first steps organizations should take as they start to introduce AI into the workplace. But this shouldn't be done without employee buy-in. Change management is often an afterthought when new tech is deployed, and we all know it. We're all guilty of skipping change management, but we always have an opportunity to learn from the past, and I hope we do on this one. Giving employees a voice in how AI could benefit their specific job function will reduce resistance and change panic. Encouraging employees to participate in shaping their future gives them more emotional and psychological equity in the process. The need for change management becomes more understandable if we don't dismiss our analyst as delusional or impractical when this person has, historically, been a solid performer. We should try to understand this out-of-character reaction that leaves us feeling shocked and frustrated.

A message to leadership

From this point on, I'm addressing this post to the managers, supervisors, and leaders reading my words. Fear does weird things to people, and I'm of the uncharitable opinion that American society does a poor job of cultivating emotional intelligence (EQ) in our population. This lack of EQ makes us particularly easy to manipulate and lie to, and it makes us a pain in the neck to deal with when we are stressed out and afraid of the future. Even the professional do-gooders, advocates, and social justice warriors I know are shockingly bereft of EQ despite their claims to the contrary; the call-out culture of Facebook is a perfect example of this.

The same can be said for our hypothetical analyst. Being "the smartest guy in the room" often means we are unaware of our emotions and cannot regulate the more extreme emotions we experience under stress. The analyst's reaction is more understandable when we consider his experience: he's hearing that AI is going to eliminate half of his day-to-day work, and he doesn't have accurate information on what that means for him. All he knows is data processing and reporting; more in-depth analysis and executive advising are not things he's had much practice at over the last several years. His fear isn't resistance to change; his concern is born of a lack of vision for what his position could become. Should we blame our loyal analyst, who's rendered several years of faithful and reliable service? No, because driving the vision for what his job could or should be is the responsibility of his leadership team. In other words, dear reader, it's your job.

Having a vision for what AI will do for your organization is essential

However, dear reader, you may also lack that vision. I bet you struggle to understand the rapidly evolving landscape of your organization; your digital infrastructure often escapes your ken, a tangled web of systems that grew organically in response to needs but lacks any strategic forethought. Innovations are coming at you faster than you can figure out how to leverage them, if you can leverage them at all, and you're dealing with a million other issues besides; this is just another one. You're probably operating on a sleep deficit, you sacrifice time with your family, and you aren't getting half the physical activity your doctor is hounding you to get. You are just as stressed, fearful, and harried as your analyst. When we take a big step back and try to get a broader perspective, you and your employees are in the same boat, and always were. Each of you is a person faced with uncertainty in an ever-evolving world, and that's scary. So let's pause for a moment of empathy and ask the most pressing business question in this whole scenario:

What the heck can we do about this?

Learning and development has always been our guiding light in the darkness

Learning and training have been our guardian against barbarism since the fall of the Roman Empire. They drive scientific research and social progress, and they will ultimately be the very thing that helps future generations deal with the challenges they inherit from past generations. As the current generation brings preliminary AI technology into the workplace, training and development can help us tackle this particular challenge, at least for those willing to learn. Let's return to our hypothetical analyst as a thought exercise in how to manage the change AI represents to a corporate workforce.

Leverage your training department

If you have a learning and development staff --whose members, I assure you, are eager to preserve their jobs-- offer them the challenge of helping you roll out these changes. This is a very self-serving answer because I know full well that a lot of training teams lack the contemporary perspectives on training that would let them tie their work to the bottom line. They often lack the tech skills to leverage their learning ecosystem to its full potential and the program evaluation skills to collect and analyze meaningful data, and they don't know how to support learning and development beyond offering eLearning or classroom training. However, with the right support, they can help you drive change and innovation across the organization.

Set a vision and focus on that vision

Before we engage the training team, we need a vision to act as our guiding star. The onus for forming that vision is on the leadership of the organization. I'm speaking to you, dear reader. It's your job to lead the organization; own it, because as you well know, the buck stops with you. This burden of leadership means you need to take the time to understand how your organization uses technology to produce value. You must take the time to understand the technical details of the tools used across your span of control. You cannot abdicate the need to be an expert in the tools and technology your people use to get the job done; otherwise, you're better off just hiring a GM and checking out.

Let's assume that if you are reading this blog post, you take your job as a leader seriously and you have a vision for what your analysts will do; you need to communicate that vision carefully in a few key ways. You need to lay out the end state, list the steps you see folks taking to get there, and state in clear and accessible terms what the organization is doing to support your people as you convince them to join you on this path. Once you lay out the vision, you have to deliver. Your plan has to be aligned with your business goals, with clear indicators you can monitor to ensure the plan is on track. Most importantly, it must show your employees how they will grow with the organization and what's expected of them.

Get the training team involved early and charge them with doing some solid change management work

Your training and development team must be engaged in helping you deliver on this vision, and their involvement should start while you're working on your change management plan. They should help you evaluate your job descriptions and identify the gaps between the vision and the current reality. They can help you redevelop those job descriptions with the support of your HR team and catalog the skills gaps your current workforce is facing. They can help you align training efforts with your key business metrics, identify the most cost-effective solutions, and find content libraries that can reduce the cost of developing training. If your training team isn't up to the task, that's okay; Populouz can help them rise to the challenge.

This example only touches on one person, our hypothetical analyst. But what if you have a team of 60 analysts? What if this team performs a vital business function that, if disrupted, will create no end of havoc in the C-suite? Now imagine the employee engagement issues you will face if your leadership team announces that AI will transform the organization in ways that are truly exciting to the executives but frighteningly unclear to those analysts with families, mortgages, and a mountain of student loan debt. If Gallup's 2019 meta-analysis of the American workforce is any indication of the state of your own, your team of 60 analysts will probably suffer disengagement issues. Now expand this to the 200-member customer success team, your 80-member salesforce, your 25-person production team, your eight-person marketing team. These are the people who helped you build your business, and they are the folks who will panic if they don't see how they fit into an organization they imagine will be run by HAL from 2001: A Space Odyssey before they retire. Remember, HAL didn't turn out to be the best shipmate for the crew.

AI is coming, dear reader, and I hope you're prepared to introduce that change. It will be a big one, but it doesn't have to cause you headaches. Look at it this way: if you're going to drop $100,000 on a new piece of technology, isn't it worth an additional 15% to ensure your people embrace the change and you reap the rewards you're trying to achieve? I think that's a cheap price tag for improving the profitability of such a significant investment, but then again, what do I know? I'm just a sci-fi nerd with a decade of training experience.