
Charles Jennings on the Change in Focus From Training to Performance


In the last decade, many learning and development leaders have shifted from viewing themselves as people who simply deliver training to facilitators who work to improve employee performance.

In the last ten years, more organizations have begun to follow the 70:20:10 model, in which workplace learning—whether on-the-job problem-solving or assistance from colleagues—is acknowledged as a larger driver of learning than traditional instructor-led training.

Charles Jennings, founder of the 70:20:10 Forum, pioneered the implementation of the model as part of a learning organization while he was the chief learning officer at Reuters in the early 2000s.

With tightened budgets and increased pressure to show ROI, L&D leaders are also being asked to quantify the training that has occurred. But Jennings cautions that many of the methods we have to measure learning comprehension do not truly measure learning, which he defines as "a change in behavior."

"We see it a lot in e-learning, where people undertake a pre-test and a post-test and the assumption is learning has occurred," Jennings said. "And that's not the case. What's occurred there is short-term memory retention." In order to really measure a change in employee behavior, L&D leaders need to become strong observers.

View our full video interview with Charles Jennings above, or check out the text version of our Q&A below.

When you implemented the 70:20:10 model as the CLO of Reuters in the early 2000s, most companies at the time thought of training as something that needed to occur in an instructor-led classroom. Can you talk about how those conventions have changed among L&D leaders in the last decade or so?

I think there's been a huge change. First of all, I think we understand that learning is not necessarily training—training is one aspect of learning, one aspect of development.

In fact, when I was chief learning officer I always said that, despite my title, I'm not really terribly focused on learning. Learning matters, but it doesn't really light my fire; what I'm passionate about is performance. I think that's one of the things that has changed over the last 10 years.

We've moved the focus from thinking that learning is training or schooling to seeing learning as linked to performance. So we're shifting from process to outcomes. Certainly, when I took over my role at Reuters and we decided to implement 70:20:10, it helped us take a step back and think about how people learn and about not just the efficiency but also the effectiveness of learning.

That's one of the major changes that I've seen in the last 10-15 years. There's not always this knee-jerk training reaction, although unfortunately it still often happens: managers will say, I've got a training problem, to which I've always said, training is not a problem. Training is one of the potential solutions that might be brought to address a problem.

The rise of informal learning has coincided with the popularization of the 70:20:10 model. I know learning leaders often talk about how informal learning is really hard to measure and quantify. Given the greater emphasis on measuring learning outcomes because of tighter budgets, coupled with more recognition of the role that informal learning plays, how should organizations go about quantifying the effects of these softer forms of learning?

That's a question I often get asked, and I think it's actually a misunderstanding. First of all, I don't like the term informal learning. It implies that it's just happenstance and that there's no way we can be much more focused around it. Instead, I like to use the terms workplace learning, learning in the workflow, or social and workplace learning.

To your point about measuring it, I would argue that it's just as easy, in fact often easier, to measure the real learning that occurs in workplace, social and informal learning than it is to measure structured learning, because what's measured in structured teaching, training and learning often has nothing to do with learning.

If we look at the standard ways in which we measure structured learning, we do the happy sheets—we measure the reaction, we use the first levels of Don Kirkpatrick's model.

So they measure Kirkpatrick 1: the reaction. That tells you whether the experience was good. One could be cynical and say it tells you whether you had a nice lunch and whether the person standing at the front was a great actor. But it may be a little more than that.

The second level, which is taken as learning, is often assessed by some sort of post-class test. We see it a lot in e-learning, where people undertake a pre-test and a post-test and the assumption is that learning has occurred. And that's not the case. What's occurred there is short-term memory retention. So, when I talk about assessing real learning, by learning I mean behavior change—that's how I would describe it.

If you don't change your behavior, if you don't do things differently than you did before, you haven't learned. So, if we're going to measure real learning (i.e., behavior change), the only way to do it is to observe people in action—measuring how they're doing their jobs differently, whether they're able to carry out tasks, whether they're able to do things they couldn't do before.

I always say that adults learn in four essential ways: we learn through experience, practice, conversations and reflections. It doesn't really matter whether those things are happening in a structured way or an unstructured way.

I would actually counter the argument that it's more difficult to measure informal learning by saying: you need to measure the right stuff. If you're measuring the right stuff, it's not terribly important how people got to where they need to get to.

Staying on this ROI and measurement theme: I know at Corporate Learning Week Europe, you'll be leading a workshop titled "Analyzing Your L&D Expenditure: Eliminating Waste and Allocating Resources for What Really Counts." As we've mentioned, there's been a tightening of budgets in our post-global-recession world, especially for learning and development. What are some of the steps companies should take to analyze their learning and development spend and eliminate those wasteful practices?

One of the things I'll address in that workshop is how we commit our resources, how we as learning and development professionals address the problems we have, and how we can assess both efficiency and effectiveness. Efficiency is extremely important, and it has become even more important in recent years.

I think there's a lot of what I'd call busy work that occurs in learning and development departments. We don't sit back and reflect on how efficient our processes are, or on whether we're doing the right things to get where we need to go in the most efficient ways.

Working with so many organizations using the 70:20:10 framework, I've not found one where it hasn't reduced the cost base. In fact, I've seen organizations where the cost base of learning and development has been reduced by more than 50 percent.

In terms of eliminating waste, it's a matter of thinking about the most efficient ways we can build and support capabilities in our organizations, aligning that with the most effective ways we can do it, and balancing those things out.

You'll also be delivering a keynote titled "70:20:10 and Its Implications for the Future of Learning and Your Organization." We've been discussing different things L&D leaders can do in implementing this strategy and model. From the perspective of L&D leaders, how does their role change when they implement the 70:20:10 model?

70:20:10 makes the learning leader's role larger. It also makes demands in terms of capability and skills. It demands that you step out of the model of a learning professional being a designer, developer and deliverer of learning content. It steps away from carrying out, for example, training needs analysis to carrying out performance analysis, understanding problems and using processes to solve them. It actually extends the role significantly. It makes it more strategic. It makes it potentially more impactful, but it also means we require different sets of skills.

We still require those skills that we've always had in terms of designing, developing and delivering really good, tight and impactful training and structured learning. But, for example, I can't see how any learning professional now can really do their job properly without understanding social media: not to the extent of being an expert in it, but understanding its potential and when it can be applied.

I can't imagine any learning professional doing their job properly without really understanding both performance consulting and their stakeholders.

In my experience, whether you work in a commercial organization, a government department or a third-sector organization, your stakeholders will be driven by numbers. You'll be driven by your P&L. You'll be driven by the financials. I often say that if a learning and development professional cannot read a balance sheet or a profit and loss statement, they should really go away and learn that pretty quickly.

I'm often asked whether 70:20:10 actually reduces the value of learning professionals, because people say, hey, you're telling us our only value is that small bit, the 10. I usually don't like to talk about the numbers, because it's not about the numbers.

The answer is, well, sure, the evidence seems to show that when we look at high performers, structured development is important in terms of impact: very important when you start out, very important when you're new to the organization or new to a role, and less important once you're a veteran in the organization or the role. But that piece of the L&D role is relatively small.

The opportunity is to look over the wall into that other 90 and all the other things that can be done. So I would turn it around and say that for smart learning and development leaders, for chief learning officers and their teams, there are actually huge opportunities here.

A forward-looking question for you: I know it's hard to predict the future, but as you've mentioned, in the last decade there has been a dramatic shift in how L&D leaders see their roles—from someone who simply delivers training to someone who really improves individuals' performance. Looking to the next decade, how do you see the L&D field continuing to change and evolve?

As you say, making predictions is fraught with problems.

First of all, we know it's going to change. There is no doubt that there will be change. The question is what that change will be.

We've already seen the move from training and schooling to learning. We're going to see the role moving from learning to performance. We're going to see the role becoming much more strategic: not just supporting performance, but supporting productivity and key objectives, with much closer linkage to organizational objectives.

It's going to be much richer, not just in terms of the technology we've been talking about, but richer in terms of the solutions we can help shape and bring to support our stakeholders. I'm very positive about it.

I've seen huge change. I've been working in learning and development for almost 40 years now, and I've seen lots and lots of changes occur. I think in the next five years we'll probably see as many changes as I've seen in the last 40.
