September Meeting Recap - Measuring Learning Programs

Wed, September 25, 2013 9:15 PM | Suzanne Choma

By Christine Herrman, MS, PHR

“Measuring Strategic, Visible and Costly Programs” was the focus of GV ASTD’s September 12 meeting, which took place in Bad Fish Consulting’s conference space at the Village Gate. TJ Kupchick from Chicago-based KnowledgeAdvisors delivered the presentation via webinar, while Bob Peters handled the technical setup and facilitated questions and discussion with the live audience in the conference room.

KnowledgeAdvisors is a leader in learning and talent analytics; its Metrics that Matter learning analytics system helps organizations measure and improve talent development.

At the beginning of the webinar, TJ shared staggering figures showing how much money is wasted each year on “scrap learning” – learning that is never applied on the job.

KnowledgeAdvisors suggests asking two questions when evaluating learning to ensure it is not “scrap”:

  1.  Does the training lead to increased performance?
  2.  Can we improve the training experience?

By collecting and analyzing data that answers those questions, organizations can cut or improve programs that aren’t delivering an impact.

TJ discussed the learning evaluation framework familiar to most learning and development professionals:

  • Level 0 – Activity, Cost
  • Level 1 – Satisfaction with Learning
  • Level 2 – Learning Effectiveness
  • Level 3 – Job Impact
  • Level 4 – Business Results
  • Level 5 – Return On Investment (ROI)
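Level 5 is the only purely quantitative level in the framework. It is commonly calculated (for example, in the Phillips ROI methodology) as net program benefits divided by program costs. A minimal sketch of that arithmetic, using hypothetical figures that are not from the presentation:

```python
def training_roi(program_benefits, program_costs):
    """Return training ROI as a percentage.

    Standard Level 5 formula: ROI (%) = ((benefits - costs) / costs) * 100.
    """
    return (program_benefits - program_costs) / program_costs * 100

# Hypothetical program: $150,000 in measured benefits against $100,000 in costs.
print(training_roi(150_000, 100_000))  # 50.0
```

In other words, a program that returns $1.50 for every dollar spent shows a 50% ROI at Level 5.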

Research shows that senior management places little value on Levels 0, 1, and 2; the data that matter most to them are Levels 3 through 5. Yet most organizations do not have the right systems in place to evaluate that data. Surveys can only go so far – we have to find ways to measure results on the job and in the overall business.

We can begin moving in the right direction by asking Levels 3 through 5 questions on our surveys to turn “smile sheets” into “smart sheets”. Some examples are:

  • Level 3: What percentage of new knowledge and skills learned from this training do you estimate will directly apply to your job?
  • Level 4: This training will have a significant impact on which of the following: increasing quality, increasing productivity, increasing employee satisfaction, decreasing costs, increasing sales, increasing customer satisfaction?
  • Level 5: What about this class was most useful to you? Provide a tangible example of how you will apply it.

As a group we discussed the need for follow-up two, three, or four months down the road. A good follow-up question would be: “Were you able to apply your learning within the first six weeks – why or why not?” One of our colleagues suggested following her model of including a “learning log” in each module, asking learners to identify the most memorable and valuable items they had gleaned from that session.

We talked about the lack of management support for applying learning and following up, and how important the manager is in this process. One case study TJ shared addressed this with a “Prep Up, Step Up, Follow Up” model: managers build understanding of expectations and create goals with learners before the learning event, then follow up on those goals with learners to achieve tangible results. Creating accountability around learning impact is paramount. The tools from this case study were:

  • a pre-training impact plan completed jointly by manager and learner;
  • a discussion forum to encourage ongoing manager/learner interaction; and
  • an application opportunity identification worksheet completed by manager and learner as follow-up.

One of my key takeaways from this discussion was that measurement doesn’t begin after training – there has to be an evaluation plan, just as much as there has to be an implementation plan for the learning. If you don’t know how you’re going to measure what you’re trying to improve, how will you know whether improvement has been made? What’s the point?

Thank you to Christine Herrman for writing up this program summary. Christine is a Learning Specialist at ABVI and Goodwill of the Finger Lakes.

Thank you to Bad Fish Consulting for allowing us to use its very cool meeting space at the Village Gate.

© Rochester Chapter, Association for Talent Development