Interview with Christopher Lind: The Unseen Tradeoffs of Leveraging AI
What are some of your biggest concerns with AI contributing to cognitive decay? And what does it mean for L&D specifically?
We, as a species, like shortcuts, the easy path; we like comfort and complacency. And AI offers that. You won’t notice at first that your cognitive skills are decaying, because it’ll come slowly, and you won’t see it happening. But some people who have reached out for coaching have already caught it in themselves: all of a sudden, they went to do a really basic task and struggled. One of the more nefarious concerns for me is how these models continue to advance, and the risks associated with them as they become more powerful. They’re built on human data, they’re trained by humans, and they’re reinforced by humans.
More and more junior-level tasks are being automated through AI. What does this mean for entry-level jobs?
This is probably one of the biggest threats that I see with AI. AI promises more, and faster. If we continue to use AI this way, it becomes exponentially more problematic. Seasoned employees are going to retire or leave at some point, and will anyone know what they did, or how to do it? We like to think AI is actually capturing everything and that it knows everything, but it doesn’t. It just sees a lot of data. This is really problematic for junior people, because if you don’t break that cycle, it won’t be noticeable until all of a sudden those people are gone and you’ve actually lost a lot of that IP. We need to make sure our more seasoned people are actually committing time to working side by side with our more junior people, and that can be alongside AI.
Right now, all the buzz is about personalized and adaptive learning paths and experiences. What are some of the unseen tradeoffs of doing that?
We waste people’s time on information they don’t need, so I think adaptive learning can be super helpful in streamlining that. One unseen tradeoff is that struggle is part of learning. Failure is part of learning: having to wrestle through something and not just being given the answer. That’s part of how you grow. We have to tweak our adaptive learning models to pick up on not just what people are looking to learn, but what they could benefit from learning.
What are some of the biggest issues that we need to solve regarding leadership development?
One of the biggest gaps is that most organizations have not really defined what good leadership looks like. Additionally, the gaps we already have in leadership capability are only going to be widened by AI. I wrote about this: the leadership trust crisis is here. For the first time in documented history, people trust their managers less than they trust the C-suite.
That’s a big problem. People need their boss to be a coach, a confidant, and somebody with the emotional intelligence to talk through issues with them.