A practical guide to implementing the 5As Framework for making any learning and change intervention in organizations successful.
5As Framework for Increasing Impact of Training: Sean Murray interviews Steve Gill about the 5As Framework for achieving business impact from training. This is a 45-minute webinar with several audience polls and responses to chat room questions from the audience.
It’s that time of year again, when we pay special attention to our personal and career goals. However, how likely is it that we will actually achieve those goals? I belong to a fitness center at the local community college and I’m always fascinated by the January upsurge in activity and then the fall off around March and April each year. Hopefully, that means some people have achieved their fitness and weight loss goals, but I’m afraid that for many it means they have given up. These folks probably haven’t established the processes and relationships that will help them achieve their goals.
Jesse Lyn Stoner makes some excellent suggestions for what we can do in terms of processes and relationships to ensure that we achieve our New Year’s resolutions. You can apply these same principles to organizational learning goals. For individuals, teams, and whole organizations to achieve their learning goals, they need to develop supportive processes and relationships. It’s not enough to identify the knowledge and skills that you want to develop. And it’s not enough to select a method for learning. To apply and sustain that learning, you must also establish processes and relationships that support learning and application of that learning.
Listed below are Stoner’s six suggestions and my explanation of how each one helps in the attainment of learning goals.
1. Start with your current goals.
What new knowledge and skills are you trying to acquire? How has that been going, and what can you learn about your own process of learning? Decide whether you want to continue acquiring that knowledge and those skills or whether you are ready to move on to something else. For example, maybe you want to learn how to give negative feedback to your direct reports. Take stock of your ability to do that and whether you need to continue working on that skill, and then identify additional learning goals that you need to achieve.
2. Connect your goals to a larger purpose.
Align your goals with the strategic goals of the organization. Be clear with yourself, with co-workers, and with your manager about how acquiring certain knowledge and skills will contribute to the organization’s success. Have a “line of sight” from the learning goal to the performance of the organization. For example, be clear about how learning Lean/Six Sigma will help the organization be successful.
3. Goal setting is not always a logical process.
Don’t get frustrated by the lack of a straight-line process. You might set some goals and then, in talking with your boss and after some experience, decide that those goals need to be modified. And given the pace of change, a learning goal that you set today could be irrelevant tomorrow. However, whatever the goal, have some notion of how you and your boss will know that it has been achieved.
4. Write your goals down and put them somewhere visible.
Writing them down will help you commit to achieving your goals. Keeping them visible will remind you that this is your task and also allow you to modify the goals as needed. The adage “out of sight, out of mind” applies here.
5. Don’t keep your goals a secret.
Discuss your goals with your boss and co-workers. You need their support. For example, if you are learning how to run a more effective and efficient team meeting, you need the cooperation of your team members and their feedback. Your boss should be able to provide you with opportunities to practice these team management skills and advise you on what you need to learn and how best to learn it.
6. Set up processes and practices that support your goals.
You’re more likely to successfully achieve your learning goals if you hold yourself accountable and if others hold you accountable. Discuss the indicators of successful learning with your boss and co-workers. Arrange times to regularly check in with them to take stock of your progress.
Organizational learning is not something you can do in isolation. You can identify learning goals, but you will need the support and involvement of bosses and co-workers to achieve those goals. As Stoner recommends, establish those processes and relationships at the outset and you will be more likely to follow through and be successful.
The phrase, "what gets measured gets done," has become a rallying cry for trainers and evaluators. We use this to justify our work and convince CEOs that they should invest in performance measurement. However, as I've argued in previous posts, the saying is not always true and, in fact, is misleading.
One implication of the phrase is that if you measure something (customer service, productivity, sales, revenue, etc.), people will pay attention to what is being measured and do what they can to improve those outcomes. Many examples refute this logic. GM measures the quality of every part and every car, yet it has recalled 29 million vehicles so far this year. The Veterans Health Administration measures patient waiting time, yet it is under congressional scrutiny for wait times that were much too long. Lehman Brothers, once one of the largest investment banks in the U.S., constantly measured the performance of the securities it owned and managed, yet it still had to declare bankruptcy in 2008.
In each of those cases, it appears that key stakeholders had the data but did not use the data to make their decisions. It’s as if they were trying to fulfill a compliance requirement without a commitment to improvement. Or they didn’t want to know because that would mean they would have to change something. Measurement alone is not sufficient; it’s the application of those results to decision-making that gets things done.
A variation on “what gets measured gets done” is “what you measure is what you get.” To me, this saying has a slightly different meaning. It is more about the importance of choosing the right measure for the situation, so that you reinforce the intended behavior rather than behavior you don’t want. I once consulted with a state Blue Cross Blue Shield office that proclaimed its commitment to customer service but evaluated customer service reps on the basis of how many calls they handled each hour. The number of calls handled went up; customer service went down.
Jane Bozarth advises beginning with the end in mind: who is the target audience, what do they need to do, how do we measure whether they are the ones accessing the program, and how do we measure their performance?
So: When looking for measures, try to find things that are meaningful, that give you real information to help real people do their jobs and to help organizations perform more efficiently. Beware of easy measures and vanity metrics.
Good advice! The tendency so often is to look for the lost key under the streetlamp because that’s where the light is. Measures are chosen because “we’ve always done it that way” or because “that’s what we know how to measure” or “that’s what everyone else does.” As Bozarth suggests, decide on what behavior you want and then decide on the best way to measure that behavior. In that way, you’re more likely to get the data and results that you need.
However, here too, the phrase has limits. What you measure is not always what you get. Many organizational factors can intervene. Maybe you are measuring the right things in the best way, but managers don’t value those outcomes, or the findings are not communicated to the stakeholders, or intervening events and unintended consequences are not factored into the results. Again, it’s not measurement per se, but what is done with those measures that makes the difference.
Transfer of learning to the workplace continues to be a vexing problem for organizations. Twenty-two years ago Mary Broad and John Newstrom published their landmark book, Transfer of Training, in which they argued that not much training gets applied in the workplace.
Sadly, not much has changed since then. This is confirmed by a LinkedIn discussion started by Charles Henderson in the summer of 2013, which shows that transfer of learning (e.g., from the classroom, from elearning) is still a huge challenge for trainers and other learning professionals.
As I mentioned in a previous post, Henderson asked members of the LinkedIn “Learning, Education and Training Professionals Group” to answer this question:
IN 10 WORDS or LESS...Why do you think learners forget what they've learned so quickly?
It's probably safe to say that we've all experienced the letdown of discovering that one of our learners has forgotten some or most of what we helped them learn during training. IN 10 WORDS OR LESS...what's your take on the MAIN reason why this happens?
To date, Henderson has received over 900 comments on that question, which he has categorized into two types: 1) comments about the timing of the problem (intra-training, post-training, not related to training, or pre-training); and 2) comments about what or who is responsible for fixing the problem (curriculum, leadership/accountability, learners, facilitators). An analysis of these comments suggests that a large percentage of learning professionals believe that low retention of learning is primarily due to the relevance of course content and how that content is delivered.
A couple of things fascinate me about that LinkedIn discussion. One is that nobody questioned the assertion that “…learners forget what they’ve learned so quickly.” Apparently, this is widely experienced by trainers, even with all that has been written and said over the past 22 years about how to ensure the transfer of learning.
Another thing that fascinates me about the responses to Henderson’s question is the lack of a systems view of learning and change. I think the answer to Henderson’s question has to be “all of the above”. Retention of learning is affected by what happens before, during, and after training and everyone is responsible for learners learning. Every employee has a stake in whether another employee remembers what they learned and applies that learning to help the organization be successful. Managers of learners, in particular, have an important role to play in helping employees remember and apply learning.
Call me naïve, but I’m always surprised at how little progress we've made in ensuring the transfer of learning. While the quality of instructional technology and the quality of instruction has improved tremendously in the past two decades, application of that learning in the workplace still does not happen often enough.
The field of training and development lost one of the great ones on May 9th when Donald L. Kirkpatrick died. Over the past 50 years, he was one of the most influential thought-leaders in the area of evaluation of employee training. Through his speaking and writing he taught several generations of training managers and instructional designers how to assess the value of training.
It seems like every training, HRD, and HPI manager knows the Kirkpatrick Model even if they don’t know the name of the model or who invented the four levels. They know they can evaluate reaction to training, learning from training, behavior change in the workplace, and results for the organization.
I have been critical of the model in the past, primarily because I think it gives trainers permission to evaluate programs by collecting “reaction” data from participants when, in fact, that data doesn’t tell us anything about the value of the program to the organization. Even assessment of “learning” doesn’t tell us much. It’s not until we investigate how and why that learning is applied, what happens because of the application of that learning, and what other factors are affecting learning and its application, that we begin to understand the impact of any learning intervention.
However, this criticism doesn’t diminish my admiration for Kirkpatrick. I always had tremendous respect for the man who simply wanted to help individuals and organizations become more effective. He wrote:
Ultimately though, when people think about Kirkpatrick, I don’t want them to think about me; I want them to think about the model and the mission to ensure that training contributes to organizational results. I hope that my model helps to improve training and follow-up so that the lives of those to be impacted by organizations – citizens, customers, patients, clients, children and families – ultimately benefit in some way.
Donald Kirkpatrick’s son, Jim, and daughter-in-law Wendy, have taken up the mantle and added substantially to the Model, filling in some of the blanks that weren’t addressed in the original four levels. I applaud them for doing this and for keeping the importance of evaluation alive in the minds of training managers and leaders in organizations.
We all owe much to the contribution Donald Kirkpatrick made to employee learning and organizational improvement.
You don’t get as much value as you should out of your organization’s training and development programs. In fact, the number of trainees who apply new learning in their organizations is estimated to be only about 15% to 20%. That is a sad state of affairs. The 5As Framework is a solution to this problem.
The 5As Framework is an easy-to-remember aid for ensuring that any learning intervention, whether classroom training, elearning, coaching and mentoring, self-directed study, internships, etc., results in participants applying what they have learned in their organizations. Sean Murray and I created the Framework to help trainers and managers get more out of their investment in training and development programs.
Interest in the 5As Framework has increased recently. In part, this is because of an online course I am facilitating for ASTD titled, Developing an Organizational Learning Culture, and because of a presentation I gave in the ZingTrain Speaker Series. I want interested readers of this blog to easily find information about the 5As Framework. Therefore, I am summarizing information in this post.
According to the 2013 State of the Industry report from ASTD, instructor-led programs continue to be the primary method for training and developing employees. The researchers estimate that 70% of training in companies is instructor-led. The content of these programs runs the gamut from basic skills to executive development.
Although classroom-based training has declined over the past few years and some of the instructor-led training is done electronically (i.e., elearning), the predominant method is still sage-on-the-stage. Given that only 15% to 20% of the participants in these programs will end up applying their learning to achieving the strategic goals of their organizations and ASTD estimates $164.2 billion was spent on training in 2013, a rough estimate of training dollars wasted in 2013 is $92 billion (.80 X [.70 X $164.2 billion]).
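That back-of-the-envelope estimate can be checked in a few lines. The figures below are the ones cited above (ASTD's $164.2 billion spend and 70% instructor-led estimates, and the optimistic 20% application rate); this is a sketch of the arithmetic, not a precise accounting.

```python
# Rough estimate of instructor-led training dollars "wasted" in 2013,
# using the figures cited in the text above.
total_spend = 164.2e9        # total U.S. training spend, in dollars (ASTD estimate)
instructor_led_share = 0.70  # portion of training that is instructor-led
applied_share = 0.20         # optimistic share of learners who apply what they learn

instructor_led_spend = instructor_led_share * total_spend
wasted = (1 - applied_share) * instructor_led_spend

print(f"Instructor-led spend: ${instructor_led_spend / 1e9:.1f} billion")
print(f"Estimated waste:      ${wasted / 1e9:.1f} billion")  # about $92 billion
```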
EQMentor lists seven reasons for the failure of traditional, instructor-led training and why alternative methods, such as mentoring and coaching, are needed. To paraphrase EQMentor:
1. Emotional intelligence, one of the keys to personal and professional success, can’t be developed in a one-time program. It must be developed over time with the encouragement and assistance of others.
2. In order for training to be useful, it must occur close in time to when it is needed. Scheduled training programs can’t be timely.
3. Instructors, alone, can’t ensure that knowledge transfer occurs. Even if learning occurs during the course, that is no guarantee that learning will be applied on the job.
4. Today’s highly complex organizations, with their shifting customer demands and competitive pressures from around the globe, have rapidly changing learning needs that require agile learners and agile interventions.
5. The cost of learners traveling to events off-site, in terms of time, money, and effort, is not worth the investment if the goal is improved organizational performance.
6. Programs that are designed for large numbers of employees are not designed to fit the different learning styles and content needs of different learners in different situations. While intending to be efficient and cost-effective, these programs fail to meet everyone’s needs.
7. Because these programs are removed from the day-to-day activities of the workplace, they lack relevancy.
The way to address the failure of traditional, instructor-led training is to end our reliance on these programs and develop a culture in organizations that supports continuous and agile learning. This kind of culture integrates many different methods of learning (including mentoring and coaching) into the daily activities of the organization. In this kind of culture, learning from action is highly valued. People, in the course of their workday activities, are taking risks, reflecting on their experiences, and using that awareness to improve performance. In this kind of culture, formal training events only occur when it is determined that this is the best method of learning given the learning content and goals of the organization.
According to Robert W. Goldfarb, a columnist for the New York Times, companies should give opportunities to more college graduates who come to them without the needed technical skills. He writes:
…employers should consider accepting some responsibility for introducing young people into the work force. This could be the perfect time for companies to start pilot projects that enroll unskilled but promising people in corporate training programs.
Goldfarb is implying that companies know how to train employees and, therefore, they are equipped to turn liberal arts graduates into high-demand workers. He suggests that corporate training programs can effectively develop skill sets such as “…six-sigma analysis, supply-chain procedures, customer service, inventory control, quality assurance and Internet marketing.” While I appreciate the intent of Goldfarb’s message, I’m afraid that he is attributing much more capability to corporations than is warranted.
The reality is that employee training in most companies is not very effective. The best estimates are that only 15% to 20% of learners apply newly acquired knowledge and skills in the workplace. It’s not that instruction is poor; it’s that organizations put up barriers to learning and to the application of learning. Employee development is not encouraged, not supported, and not expected, whether from old or new hires. Professional trainers who are good at delivering instruction are not successful at ensuring application of that learning.
These organizations think of training and development of employees as something that happens in events (workshops, seminars, online courses, etc.) when, in fact, most learning occurs in other ways. Hallely Azulay uses the 70-20-10 rule: 70% of employee learning happens on the job; 20% happens through interaction with colleagues and friends; and only 10% comes from formal training programs. If companies want to develop liberal arts grads into productive technical workers, they must be intentional about the learning that occurs the other 90% of the time and ensure that learning is applied in the workplace.
With work changing so fast, the most important thing for employees to learn is how to learn. In the face of a steady stream of new knowledge and skills, employees can't rely on what they already know. They need to know how to constantly acquire new knowledge and skills. Liberal arts graduates tend to have this ability more so than graduates with technical degrees. If companies want to take advantage of this talent, they will need to learn how to do a better job of developing these employees.
So, while I agree with Goldfarb that liberal arts degree graduates have something to contribute to most organizations, I don’t think we can rely on companies to prepare them for the jobs that need to be filled.
Chief Learning Officers who belong to the LinkedIn Learning, Education and Training Professionals Group were asked by Jason Silberman to describe their three biggest “pain points”. Silberman wrote, “What makes you emotional - what makes you want to punch a…”

While not a scientific survey of Learning Officers, the 97 comments (to date) give us an indication of the kinds of issues that trouble learning leaders in organizations. I’m especially interested in knowing the challenges of these leaders because I’m co-founder of Learning to be Great™, an online marketplace designed to connect leaders with tools and experts who can help them be successful in their jobs.

After reading through the comments by members of the LinkedIn group, I identified eight major themes. CLOs worry about…

1. Lack of organization-wide understanding of the purpose and intended results of a program. Managers not buying in to the goals; learners not knowing why they were asked to participate; leaders not seeing the “line of sight” from learning interventions to performance outcomes.
2. Not knowing what results to expect from learning interventions, whether designed internally or purchased from vendors.
3. Not having the right training professionals who can provide learning interventions to help the organization be successful. Current training and development staff do not have the competencies needed in their organizations as they are today.
4. Managers and learners not committed to organizational learning and the learning interventions needed to improve performance. Managers not providing the attention and support that learners need.
5. Lack of accountability for what happens before and after training that supports learning. Managers not preparing learners and not following up after the program is over.
6. Top leadership not valuing employee learning. Their expectations are low, and this translates into little involvement in and support for learning interventions. They consider training to be a cost, not an investment.
7. Inadequate design and delivery of learning programs. Not using technologies that could facilitate learning. Not matching content with method.
8. Lack of employee commitment to their own learning and development. Employees not making optimal use of the learning resources that are offered to them.

One observation that is striking about this list is the absence of a need for more resources. It seems that the “pain” does not come from a lack of time and money but rather from how time and money are used. CLOs worry about wasting the resources they have, not about trying to acquire more resources.
In his blog post, “Stop Evaluating Training!”, Amit Garg summarizes a presentation that Robert O. Brinkerhoff gave at the Australian Institute of Training and Development conference in April. Garg relates Brinkerhoff’s comments to the challenges of measuring the effectiveness of elearning programs. Garg writes:

“So how do you evaluate the success of eLearning that you create?” As a learning solutions vendor, I’ve been asked this question countless times and have also encountered it in many an RFP. Proving the effectiveness of training and showing ROI is no walk in the park and still keeps L&D up at night.

Maybe evaluation of the impact of training is not a “walk in the park”, but it is not “rocket science” either. As Brinkerhoff explains in The Success Case Method, there is a simple logic to assessing impact. First we have to understand the alignment between a learning intervention, whether that’s classroom-based, elearning, coaching, on-the-job structured experiences, or something else, and the intended impact (e.g., customer retention, production, sales, revenue, market share). Then it’s a matter of identifying and interviewing learners who are contributing to that impact and those who are not contributing. The purpose of these interviews is to understand the nature of the impact and why it’s happening or not happening.
Three immutable laws drive this method of evaluation. One is the law of learning as a process, not an event. This law holds that learning is a process that starts before an instructional event and continues after the event. To study the event only is to fail to include key aspects of what is facilitating learning and what the barriers to learning are.

Another is the law of unintended consequences, which holds that when there is change there are always outcomes that weren’t anticipated. A useful evaluation of training will examine these unintended consequences (positive and negative) as well as the achievement of the objectives of the intervention.

And the third law holds that significant, lasting change in complex organizations is always the result of many factors, some of which are not in the control of trainers and learners. For example, the effectiveness of leaders, a change in strategic direction, and the economy could have a profound influence on the impact that learners have on their organizations.
It’s not complicated. Understand how the learning intervention is intended to help the organization achieve its goals. Identify employees who are and are not applying what they learned. Interview these employees to find out what they did or did not do, how that affected the organization, what other factors were intervening in that impact, and what can be done to increase learner impact in the future.
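The screening step of that logic can be sketched in a few lines. This is a hypothetical illustration, not Brinkerhoff's actual instrument: assume each learner gave a self-reported impact score on a short post-training survey, and we select the extremes for follow-up interviews. All names and scores below are invented.

```python
# Success Case Method screening sketch (hypothetical survey data):
# pick the strongest and weakest application cases for follow-up interviews.
survey = {
    "Ana": 9, "Ben": 2, "Cai": 7, "Dee": 1, "Eli": 8, "Fay": 3,
}  # self-reported impact scores, 1 (no application) to 10 (high impact)

# Rank learners from highest to lowest reported impact.
ranked = sorted(survey, key=survey.get, reverse=True)

n = 2  # number of interviews we can afford at each extreme
success_cases = ranked[:n]       # interview: what worked, and why?
non_success_cases = ranked[-n:]  # interview: what got in the way?

print("Success cases:", success_cases)          # ['Ana', 'Eli']
print("Non-success cases:", non_success_cases)  # ['Ben', 'Dee']
```

The point of the sketch is the design choice: rather than averaging everyone's scores, the method deliberately samples the extremes, because the interviews with those two groups are what reveal the facilitators of, and barriers to, impact.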