Learning Analytics in Action
An Interview with Trevor Walker, VP of Organizational L&D at Ascension
Show me the money.
It’s not just a line from the 1996 movie Jerry Maguire. It’s a question business leaders have been asking Learning and Development (L&D) professionals for years about the return on investment (ROI) of leadership development programs.
Traditionally, rigorous measurement of the business impact and ROI of leadership development efforts has been time-consuming and often viewed as too little too late to be useful to business leaders.
So how can learning professionals balance the need for data to use for continuous improvement while simultaneously demonstrating how learning contributes to organizational success?
I recently sat down with Trevor Walker, VP of Organizational Development & Learning at Ascension, to talk about how training teams can take their data from volume to value to show the impact of leadership development on business outcomes.
Q: Tell us a little about Ascension.
A: Ascension is the largest healthcare system that you’ve never heard of. We have 128 hospitals in 24 states and the District of Columbia, with 160,000 associates.
We’re a 20-year-old company that started with several congregations coming together as a holding company, then became an operating company six years ago.
The decentralized functions and services of the different congregations were not having the desired impact, nor were they facilitating the generation and spread of best practices across all our congregations.
So, in the past few years, we came together and asked: What are the services we offer? How do we accelerate the performance of the workforce going forward?
As a Catholic health system, part of our identity is tied to the value of the “whole person.”
So as we looked at how to develop people, we didn’t just look through a business lens but in an emotional/mental/spiritual way as integral humans. This allowed us to say that additional metrics were important to the business.
Q: Things like spirituality are hard to measure. How does Ascension approach metrics and how have those changed?
A: Historically we’ve had inconsistent practices in tracking data, creating inconsistent results.
We’ve done smile sheets, SurveyMonkey polls and post-session surveys (i.e. Did you enjoy it? Were the facilitators good? Was the content meaningful?), but we never extended the assessment and evaluation into leading metrics that really drove business outcomes.
So we sat down with our executive leadership and told them that we had a learning evaluation strategy and that we wanted to validate our new assessment methodology, which we believed would create new business measures–helping with everything from retaining and engaging employees in meaningful work to equipping them to make better decisions.
We took a look at our current position and asked where we wanted to go–so we could tell a story with our metrics. We used the Kirkpatrick model as an opportunity to educate both the learner and the associates, to see:
- If the quality of our programs supports our organization’s priorities and if employees knew what those priorities were.
- What the perceived value/their reaction was regarding instructor/environment/course/etc.
- The on-the-job impact: Could employees take what they learned and apply it?
- If there was anything prohibiting employees from applying what they learned.
- If the content was applicable in driving business results.
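The questions above map roughly onto the four Kirkpatrick levels. As a minimal sketch of that mapping–the wording below is paraphrased for illustration, not Ascension’s actual survey instrument:

```python
# Rough mapping of the evaluation questions above onto the four Kirkpatrick
# levels (Reaction, Learning, Behavior, Results). Question wording is
# paraphrased from the interview, not an actual survey.
kirkpatrick_questions = {
    1: ["How did you rate the instructor, environment, and course?"],       # Reaction
    2: ["Do you know the organizational priorities the program supports?"],  # Learning
    3: ["Could you apply what you learned on the job?",
        "Did anything prohibit you from applying it?"],                      # Behavior
    4: ["Did the content drive business results?"],                          # Results
}

for level, questions in kirkpatrick_questions.items():
    for question in questions:
        print(f"Level {level}: {question}")
```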
We also partnered with Metrics That Matter for their perception-based evaluation tool, which measured the effectiveness and impact of key leadership development programs on business outcomes, and set up a heat map to show all components across markets.
Q: What are some of the results you’ve seen?
A: We’ve come a long way in a short period of time and have had a significant journey in the last year.
We built a business case to show that although this comprehensive L&D overhaul was a significant investment, it also had a significant impact.
We’ve seen great improvements in using standardized reporting at the local and national level. This approach helps identify areas for continuous improvement.
We also partner with our workforce analytics team, who created a massive data lake, to look for correlations of business outcomes against our data sets.
For example, for individuals that had been in a program and rated it highly, could they apply what they learned? How had their quality measures been? Was there turnover?
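The kind of analysis described above can be sketched in a few lines: join learning evaluations with HR outcomes, then correlate program ratings against a business measure such as retention. The records, field names, and values below are invented for illustration, not Ascension’s actual schema:

```python
from statistics import mean

# Hypothetical joined records: learning evaluations merged with HR outcomes
# from a shared data lake. All field names and values are illustrative.
records = [
    {"associate": "a1", "program_rating": 4.8, "left_within_year": False},
    {"associate": "a2", "program_rating": 2.1, "left_within_year": True},
    {"associate": "a3", "program_rating": 4.2, "left_within_year": False},
    {"associate": "a4", "program_rating": 3.0, "left_within_year": True},
    {"associate": "a5", "program_rating": 4.5, "left_within_year": False},
]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

ratings = [r["program_rating"] for r in records]
stayed = [0.0 if r["left_within_year"] else 1.0 for r in records]
print(f"rating vs. retention correlation: {pearson(ratings, stayed):.2f}")
```

In practice the join and correlation would run over the analytics team’s data lake rather than an in-memory list, but the logic–pair each learner’s program rating with a downstream business outcome and measure the relationship–is the same.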
We can now better evaluate our data and make strategic decisions based on those metrics. We’re encouraging accountability and increasing transparency–if someone isn’t hitting a mark we know it and can make adjustments in real-time.
In another example, from our heat map we saw that one state was doing incredibly well across the board while another was struggling–so we were able to intervene post-learning. We knew our benchmarks and could break them down by each market.
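A market-by-component heat map like the one described can be as simple as flagging every cell that falls below a benchmark. The markets, components, scores, and benchmark below are all invented placeholders:

```python
# Minimal sketch of a market-by-component heat map: each cell is a program
# score, and "hot" cells fall below a benchmark and flag a market that may
# need post-learning intervention. All names and numbers are invented.
scores = {
    ("Texas",    "relevance"): 92, ("Texas",    "application"): 88, ("Texas",    "support"): 90,
    ("Michigan", "relevance"): 71, ("Michigan", "application"): 58, ("Michigan", "support"): 64,
}
BENCHMARK = 75  # hypothetical national benchmark

def flag(score):
    """Mark cells below the benchmark for follow-up."""
    return "BELOW" if score < BENCHMARK else "ok"

markets = sorted({market for market, _ in scores})
components = sorted({component for _, component in scores})
for market in markets:
    row = "  ".join(
        f"{c}={scores[(market, c)]:>3} ({flag(scores[(market, c)])})" for c in components
    )
    print(f"{market:<10} {row}")
```

A real dashboard would color the cells instead of printing labels, but the benchmark comparison per market-and-component cell is the core of the technique.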
Our data and analytics were ripe, and presenting them through a maturity continuum was incredibly meaningful to bring back to our leaders.
These new measures proved that our content was relevant and impactful, showed where there were barriers (whether there was a lack of support from leaders/environment) and really opened up a different type of conversation.
We were now promoting a learning culture, taking comprehensive methods and strategically employing them.
The measure of any good business case is if the business is willing to renew the contract and we just received the “OK.”
Q: With Ascension having multiple locations and different issues in different locations, how do you approach scaling?
A: This is my biggest concern that keeps me up at night.
What we’ve found through creating a centralized L&D function with centralized content is that some are utilizing it as it was intended/developed and others are choosing to modify or adjust it.
And what we’ve found in the results/metrics is that since everyone gets measured with the same tool, we can start to see where that variation is impacting the effectiveness of the content and the impact on the business.
Metrics That Matter helped to standardize the process and scale it. We created a common language so everyone reported on the same things and understood what we were evaluating and why.
Q: What’s the best piece of advice you have for other CLOs trying to get buy-in for learning and who want to take on metrics in a different way?
A: The most important thing learning leaders need to do to get the most value and impact is to use the business’ language to visualize the data. It’s key to present data in the forms, methods and common vocabulary that the business is already looking at.
Find out what your business is using in terms of metrics and reporting methods, and mirror it as closely as possible so it’s just another metric that they’re tracking as opposed to something you have to educate them on.
Being able to put data side by side with their business data–similarly to how it’s always been displayed–is critical. And our senior leaders actually got excited.
Also, using heat maps was really helpful. This was something our company has always used on a regular basis to track quality, safety, patient satisfaction and financial performance, so leaders already understood what they were seeing, which made a world of difference in communication.