
Designing Virtual Learning to Deliver Application and Impact

The 20 Most Powerful Techniques


With most learning and development converted to a virtual format, a critical question is being raised: Is virtual learning working? Executives are asking this question, and when they do, they aren't concerned with whether participants are learning. Rather, they're concerned with whether participants are using what they've learned and whether that use is having an impact on the organization. Executives want virtual learning to drive impact on specific business measures.

Almost a decade ago, Chad Udell suggested that virtual learning should drive important business results such as those shown in Figure 1.[1] Unfortunately, most virtual learning is not connected to them.

“Virtual learning will be a very significant part of L&D after the pandemic ... but only if it works.”

Figure 1. High-Level Business Benefits from Virtual Learning

  • Decreased Product Returns
  • Increased Productivity
  • Increased Accuracy
  • Fewer Mistakes
  • Reduced Risk
  • Increased Sales
  • Less Waste
  • Fewer Accidents
  • Fewer Compliance Discrepancies
  • Reduced Incidents
  • Decreased Defects
  • Increased Shipments
  • On-Time Shipments
  • Decreased Cycle Time
  • Less Downtime
  • Reduced Operating Cost
  • Fewer Customer Complaints
  • Reduced Response Time to Customers


The Chain of Value Is Always There

It's important to understand how success is achieved with virtual learning. As presented in Figure 2, success follows a chain of value, a classic logic model which forms the basis of most evaluation models, including the ROI Methodology®.[2]

Figure 2. The Value Chain for Virtual Programs

Level 0 - Input
Measurement focus: Input into programs, including indicators representing scope, volumes, times, costs and efficiencies
Typical measures:
  • Types of programs
  • Number of programs
  • Number of people involved
  • Hours of involvement
  • Costs

Level 1 - Reaction and Planned Action
Measurement focus: Reaction to the programs, including participants' perceived value and planned action to make them successful
Typical measures:
  • Relevance
  • Importance
  • Usefulness
  • Appropriateness
  • Intent to use
  • Motivation
  • Recommendation to others

Level 2 - Learning
Measurement focus: Knowledge and skills gained, including learning how to develop concepts and how to use skills and competencies to drive program success
Typical measures:
  • Skills
  • Learning
  • Knowledge
  • Capacity
  • Competencies
  • Confidence
  • Contacts

Level 3 - Application and Implementation
Measurement focus: Application and use of knowledge, skills, and competencies, including progress made and implementation success
Typical measures:
  • Behaviors
  • Extent of use
  • Task completion
  • Frequency of use
  • Actions completed
  • Success with use
  • Barriers to use
  • Enablers to use
  • Engagement

Level 4 - Impact
Measurement focus: The impact of the programs and processes, expressed as business impact measures
Typical measures:
  • Productivity
  • Revenue
  • Quality
  • Innovation
  • Graduation rates
  • Crime rates
  • Jobs created
  • Efficiency
  • Incidents of disease
  • Retention
  • Customer satisfaction

Level 5 - ROI
Measurement focus: Comparison of the program's monetary benefits to the program's costs
Typical measures:
  • Benefit-Cost Ratio (BCR)
  • ROI (%)
  • Payback Period
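To make the Level 5 measures concrete, here is a minimal worked example using the standard ROI Methodology formulas; the $240,000 in monetary benefits and $150,000 in fully loaded program costs are hypothetical figures used only for illustration:

\[
\text{BCR} = \frac{\text{Program Benefits}}{\text{Program Costs}} = \frac{\$240{,}000}{\$150{,}000} = 1.6
\]

\[
\text{ROI (\%)} = \frac{\text{Net Program Benefits}}{\text{Program Costs}} \times 100 = \frac{\$240{,}000 - \$150{,}000}{\$150{,}000} \times 100 = 60\%
\]

The payback period is the time required for the accumulated monetary benefits to equal the program costs; in this hypothetical example, if benefits accrue evenly over a year, costs would be recovered in roughly 7.5 months.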


Why Virtual Learning Fails: A Quick Review

We've had the opportunity to evaluate virtual learning for many years, and have compared instructor-led learning with eLearning, sometimes in parallel, to see the differences. We've found that virtual learning typically breaks down at Level 3 (Application) and Level 4 (Impact). Why does this happen? Here are four key reasons:


1. Multitasking inhibits learning.

The research shows that multitasking reduces a participant’s ability to learn.[3]

There is a myth that a person can multitask and still absorb in-depth knowledge and information. That doesn’t happen.

With instructor-led learning, multitasking can be controlled in many ways. With virtual learning, as we've all witnessed in virtual conferences, meetings, and online sessions, multitasking runs rampant. The reality is that if learning is diminished, application will be diminished, and so will impact. This decreases the ROI significantly.

This is a serious problem that is being addressed by instructional designers and technology companies alike.

2. Manager's support is usually missing.

Participants typically leave their work areas to attend instructor-led learning programs. Their managers know they're involved and are likely to have been involved in the decision for them to participate in the session. The manager usually creates expectations for the program, and the manager's follow-up helps ensure the learning translates into something useful and worthwhile for the department.

For virtual learning, the manager is usually not involved. The manager may not even know that participants are involved. Without the manager’s presence, the most significant influencer for transferring learning to the job is removed.

This is a serious issue that receives too little attention. Instructional designers need to build in activities that will draw the manager into the process.

3. Virtual programs are designed for learning, not application and impact.

Instructional systems designers design virtual programs to deliver learning. Most current virtual design books focus on designing for learning—not application and impact.

Designers think their work is complete when the participant has learned the knowledge or skills. However, executives who provide the program's budget want to see the business connection that comes from participants actually using the learning.

For example, very few virtual learning programs have impact objectives. Without impact objectives, key stakeholders may not fully understand why the program is being implemented. When impact objectives are in place, the team can design the program for application and the desired business impact.

Designing for impact is more likely to occur in an instructor-led program. This is our best opportunity for improvement: instructional systems designers need to take steps to design for application and impact before, during, and after the program is implemented.

4. Technology creates challenges.

Technology failures and connection problems appear in the best of organizations and in the best programs, as we all have experienced. Add to this the proliferation of inexperienced users who create havoc and make it challenging to deliver the program seamlessly. These technology failures rarely occur (or can easily be managed) in an in-person program.

The good news is that instructional designers are working on this along with technology providers. Still, there is much to be done in this area.


Designing Virtual Learning for Impact and ROI

Issues two and three above can be tackled with a serious approach to designing virtual learning for impact and ROI. In the last two decades, we've been involved in the evaluation of many virtual learning programs, including online learning, eLearning, simulations, and mobile learning. We've had the opportunity to see what works and what doesn't.[4]

In every evaluation, we collect data on barriers and enablers. The barriers are the factors that get in the way of participants having success with application. The enablers are the factors that help them achieve success with the program. As you can imagine, these data present tremendous opportunities to see what works to enhance application and impact.

The barriers tell us what we need to remove or minimize to have success, creating an opportunity to design for success in the future. The enablers provide proven techniques that have already worked, along with proof that they made a difference in application and impact.

Figure 3 presents 20 powerful design techniques that can enhance the application and impact of virtual learning programs. Remember, from an organizational perspective, if application doesn't occur, the virtual program is a waste. Some people refer to this as scrap learning (learning that is not used).

Figure 3. Twenty Techniques to Design Virtual Learning to Deliver Application and Impact

Actions Before the Program
1. Have the manager create expectations before the program.
2. Develop application objectives for participants and other stakeholders.
3. Have participants develop customized impact objectives and share with other stakeholders.
4. Use performance contracts, agreements for application and impact among the live facilitator, participant, and manager.
5. Create an application guide to enable and support application.
6. Create a job aid to assist in application.

Actions During the Program
7. Use action plans to detail application steps.
8. Teach to application and impact objectives.
9. In the last learning module, review data collection needed for application and impact.

Technology-Enabled Actions
10. Create coaching videos to use at appropriate times to support use.
11. Use apps or software support to encourage and enable use of content.
12. Use WhatsApp (or other networking platforms) to provide encouragement, support and enablement.
13. Post recorded content reviews as a reminder just before use.
14. Use automated reminders for application and data collection.

Actions After the Program
15. Have managers provide support and encouragement for application and impact success.
16. Organize a coaching session after the program to focus on application and impact.
17. Collect completed action plans in a follow-up period and send results to the entire group.
18. Organize follow-up sessions to share results and enablers, and tackle barriers.
19. Host a lessons-learned meeting after participants have used the content.
20. Share early successes with other participants to nudge them to use the content.


The definition of learning success has shifted: Success is not when learning has occurred, but when learning is used and has an impact. This new definition represents a mindset shift for many L&D stakeholders.

When it comes to delivering results from virtual learning, hope is not a strategy; luck is not a factor; doing nothing is not an option. The accountability for virtual learning has shifted. Change is inevitable; progress is optional. It’s up to each of us to make sure virtual learning delivers the desired results. For more details on the 20 techniques, please contact us.


References

1. Udell, Chad. Learning Everywhere: How Mobile Content Strategies Are Transforming Training. Nashville, TN: Rockbench Publishing; Alexandria, VA: ASTD Press, 2012.

2. Elkeles, Tamar, Patricia Pulliam Phillips, and Jack J. Phillips. Measuring the Success of Learning Through Technology: A Step-by-Step Guide for Measuring the Impact and ROI on eLearning, Blended Learning, and Mobile Learning. Alexandria, VA: ATD Press, 2014.

3. Zack, Devora. Singletasking: Getting More Done—One Thing at a Time. Oakland, CA: Berrett-Koehler Publishers, 2015.

4. Elkeles, Tamar, Patricia Pulliam Phillips, and Jack J. Phillips. Measuring the Success of Learning Through Technology: A Step-by-Step Guide for Measuring the Impact and ROI on eLearning, Blended Learning, and Mobile Learning. Alexandria, VA: ATD Press, 2014.

This article was originally published on ChiefLearningOfficer.com and is currently published on The ROI Institute's website.

