I talked last week about the first approach to measuring ROI: Soft ROI. We focused on three key areas:
- Measures (Evaluation questions focused on impact and effectiveness; Activity data)
- Telling the Story (Provide context; Focus on impact; Don't fear the red)
- Challenges (Critical to start early; Lack of business data)
Inferred ROI leverages both evaluation data and some business data. This won't be as extensive as a full study, but it should contain enough data to infer impact. We know there are a number of variables that affect any business outcome, so we're not telling a story of causation, but rather one of inferred impact. Let's look at the biggest thing to tackle with an Inferred ROI - Measures.
First and foremost, the measures that apply to Soft ROI also apply here, with one exception: in cases where you have business data for a measure, you may not need a proxy question on the evaluation. You will still need to show that the training solution was effective; otherwise, it can diminish the perceived impact that L&D may have had on the outcomes. From a business data standpoint, think about how you want to infer impact.
Trending is a common way to show differences in results around a specific training intervention, and it's ideal for programs with a short deployment window. Essentially, the trend should show improvements starting a reasonable time after training.
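To make the trending idea concrete, here's a minimal sketch of a before/after comparison. All names, dates, and numbers are hypothetical; in practice you'd pull the metric from your own business data.

```python
from datetime import date
from statistics import mean

# Hypothetical monthly results for a business metric (values are illustrative).
monthly_results = {
    date(2024, 1, 1): 82, date(2024, 2, 1): 80, date(2024, 3, 1): 84,
    date(2024, 4, 1): 83,  # training deployed this month
    date(2024, 5, 1): 88, date(2024, 6, 1): 91, date(2024, 7, 1): 93,
}
training_date = date(2024, 4, 1)

# Split the trend at the training intervention and compare averages.
before = [v for d, v in monthly_results.items() if d < training_date]
after = [v for d, v in monthly_results.items() if d > training_date]

print(f"Avg before training: {mean(before):.1f}")
print(f"Avg after training:  {mean(after):.1f}")
```

A line chart of the same data, with the deployment date marked, tells the story even faster for leaders.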
Comparison Groups (also called control groups) can be a very effective way to infer impact, and they're easy to build. Take your business data and divide it into two groups: those who were trained in the program, and everyone else in similar roles who was not. It can be as simple as two bar graphs comparing the results.
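The split described above can be sketched in a few lines. The names, trained roster, and scores here are all made up for illustration.

```python
from statistics import mean

# Hypothetical per-person results on the same metric (values are illustrative).
results = {"ana": 95, "ben": 88, "cara": 91, "dev": 79, "eli": 82, "fay": 77}
trained = {"ana", "ben", "cara"}  # who completed the program

# Divide the business data into trained vs. comparison groups.
trained_scores = [v for name, v in results.items() if name in trained]
comparison_scores = [v for name, v in results.items() if name not in trained]

print(f"Trained group avg:    {mean(trained_scores):.1f}")
print(f"Comparison group avg: {mean(comparison_scores):.1f}")
```

Those two averages are exactly what the two bar graphs would show.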
Prior Period or Prior Results is also a feasible option, but a little harder to use for inference. Think through what might be possible based on the data available. For example, if you don't have an easy way to accomplish either of the options above, are results available from prior years, cohorts, etc. to compare against? Again, this doesn't need to be a hard correlation; we're just focusing on the connection between the program and the results.
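A prior-period comparison can be just as lightweight. The years and cohort averages below are hypothetical placeholders for whatever historical results you can get your hands on.

```python
# Hypothetical cohort averages by year; the 2024 cohort received the new program.
cohort_results = {2022: 78.5, 2023: 79.1, 2024: 85.4}
current_year = 2024

# Average the prior periods and compare the current cohort against them.
prior = [v for year, v in cohort_results.items() if year < current_year]
prior_avg = sum(prior) / len(prior)

print(f"Prior-period avg: {prior_avg:.1f}")
print(f"Current cohort:   {cohort_results[current_year]:.1f}")
print(f"Difference:       {cohort_results[current_year] - prior_avg:+.1f}")
```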
Best Practices. There are some general considerations and recommendations when reporting on this data.
Assumptions are OK, just document them. Don't be afraid to make assumptions as part of the process; it happens throughout your business all the time. Own your story and have those assumptions ready. If leaders ask about them, great - you have them available. What if they ask you to change them? Even better! It means they are engaged in the process, and now you have an assumption to work with that the leader is also bought into.
Keep it simple! It sounds obvious, but I've seen slides and visuals that would make your head spin. I'll reinforce this point from a prior post: focus on the story. Your best friend is an Appendix! If there is additional information you think may be important to know, don't try to force it onto the main slides. Have additional views or supplemental information ready to go, and you can even use the notes to reference the slide number in the Appendix.
Questions are good. The first time you try this, it won't be perfect. Use each opportunity to learn and improve what you do the next time. Remember, even if you're getting grilled by a leader, that's a good thing! It means they are engaged in the topic, they are thinking about what you're presenting, and they care enough to ask those questions.
Next week will be our last post on this topic, where we'll discuss a Full ROI Study. Until then, everyone be well.

Happy Measuring!

Chris LeBrun
Director, Professional Services
christopher.firstname.lastname@example.org