I wonder…if we started measuring how much of the knowledge we transfer during training is forgotten before our learners have the chance to apply it…would that metric influence the decision to keep on training the way we do? Seriously, would we keep spending upwards of 80% of our training resources on creating and delivering training events? What is it going to take to step back from traditional training strategy and have a conversation about something other than effectively transferring knowledge? In reality, all we’ve done effectively is create the promise of potential.
Promise of potential? Yup…sounds more positive than offering up a deliverable of “false competency,” doesn’t it? But that’s exactly what we are chartered and scoped to do as a training organization. False competency is that momentary glimmer of capability we all exhibit at the end of a training event when we’re called upon to “validate our potential” by demonstrating proficiency [demonstrating our competency] on some activity in a safe, controlled, structured learning environment or by completing a test.
All we really confirm from our training event is that knowledge was transferred to the extent we were smart enough to pass the test and earn our graduation tattoo. We know everyone leaving our class on “Digging the Perfect Hole” knows which end of a shovel to hold and the finer conceptual points of digging a very fine hole. With confidence we can say they have the potential to actually dig one. Heck, we have proof; they loved the course per our smile sheets, and they passed the test with flying colors. Houston, we have the promise of potential! We also have a high degree of false competency.
Now we hang onto the promise of potential in hopes that they can go back to their job and dig the perfect hole. If not, we see mistakes that spell missed revenue, lost accounts, material waste, increased business risk and liability, or [enter the screw-up of your choice here].
What I’ve just described might seem rather cold and extreme, but that’s what we do when we rely upon our existing training paradigm to create readiness in our workforce to perform. And we keep on doing it despite no one really coming out of training with the ability to dig the perfect hole on the job.
Going back to my initial question…if we were measuring how many perfect holes were being dug after training, we might just look back upstream at our training approach to see why it is coming up short. Isn’t that also part of tradition? Whine about crappy training, right? “My people can’t dig a freaking hole…therefore the training they just completed obviously sucks.” When you step back from the training effort and rethink what we’re after, it becomes very clear that the rules of engagement for training have changed. We’re measuring something that has nothing to do with the business impact we’re supposed to be supporting with a capable workforce – the digging of perfect holes.
We’re measuring demonstrable proof of effective knowledge transfer and bonzer level 1 evals, but that’s the extent of “validation” that our training event was effective. Hmmm…aren’t we after perfect holes as the outcome? Maybe we should be measuring holes. And if they are not perfect, find the root cause. And after we have the root cause, examine the nature of “digger” support that is needed when digging the hole. We might have to get dirty and observe the digging. We might have to get into the hole and see what it’s like at the point of digging. Maybe WE don’t go into the hole, but the product of our design and development efforts is accessible from inside the hole at the moment of need. Call it Digger Support [Performer Support].
AND…who knows…we just might find out we did not need to take 14 weeks to develop a level 3 web-based course with video and branching and interactive simulated hole digging in the first place. AND…that fact alone causes a lot of training purists to pucker and clutch storyboards to their chests like flotation devices in a water landing. It may not be a water landing, but we are going down in importance and relevance to the business mission. We’re still in the air, but we may only continue flying all the way to the scene of the crash.
That probably seems extreme too, but I’m seeing it with every down-sizing. Training is getting whacked because it is overhead. The only proof we have of value contribution is “activity”. We have no holes to point to and no “digger support” prowess to boast of. We spend 14 weeks to produce an hour of e-learning that produces the promise of potential. Our stakeholders want perfect holes, not potential.
The velocity of business has outstripped what training can do. We don’t have 14 weeks. We don’t have six. But produce a “digger support” job aid in less than one week, test it through iteration via technology access, launch it live for all hole diggers to access at the moment of need…from in the hole…with hand on the shovel…and you have real, tangible proof of impact to brag about. The rules have changed. We have to get into the work context because that’s where convergence is taking place. We have to provide learning at the moment of need…in the hole…and that ain’t training!
Our future survivability is based upon changing along with the rules of engagement that are being driven by the convergence of learning with real work. We should be building learning and performance solutions that facilitate that convergence, with the outcomes providing measurable evidence of business impact – things like time-to-first-perfect-hole. Sounds kind of like time-to-competency, doesn’t it? It is, but that is HR-speak. It may matter to us, but my stakeholder is looking for perfect holes, and the quicker I can get one of my performers to dig one, the better off I’ll be. When they dig the perfect hole, I can measure time-to-business-impact [business stakeholder speak].
If competency is what we seek, look to performers in the context of their work, and then measure evidence of it in terms of flawless execution at the point of the dig. Anything prior to that is only the well-intended promise of potential. If we don’t create the right support assets for those digging the holes, the next one they dig might be for the training budget.
I just spent the week at the Learning Solutions / Ecosystem 2014 conference in Orlando. My breakout session was on Building a Dynamic Learning and Performance Ecosystem. Yeah, I know…sounds like jargon to me too…but there is a truth inside “ecosystem speak” that we cannot ignore. Seriously, the eLearning Guild felt strongly enough to launch a whole conference for Learning Ecosystems. That decision was not based on a whim. David Kelly, Director of Programs, said there was no conference in existence to entertain this new paradigm, so the Guild stepped up to fill the void. This concept is not going away, though we may ultimately call it something else. For now, ecosystem seems to cover the attributes and characteristics of the pieces and parts found across the entire learning environment – an environment that spans from training events up to and including performance outcomes where holes are being dug.
This “ecosystem” approach is a new vision based upon a performance paradigm. Adopting a performance paradigm includes consideration of several critical components:
- New conversations internally and with stakeholder clients
- New discovery targets beyond traditional training needs assessments
- New performance consulting skills
- New agile design and development methods
- New Web 2.0 technologies to consider
No organization can just flip a switch and change to a performance paradigm like we might flip a switch to stand up a new LMS. This is a start-small-and-scale journey that builds in staff capability as competencies manifest with new skills and methods.
“Where to begin?” seemed to be a recurring question this week. There were plenty of folks eager to have the “what’s next?” conversation because in all the sessions I sat through before mine, there were no clear paths articulated. I don’t think that omission was accidental. I say that because any organization choosing to adopt a performance paradigm will have a unique path defined by the state of readiness it is in when the decision is made to pursue it.
In my session, I did offer a next step, and it was to establish a readiness snapshot – a Learning & Performance Readiness Assessment that clearly defines where you want to go within the boundaries of your own ecosystem, but more importantly, where you are now – your “AS IS” state of readiness.
I welcome any thoughts or dialogue I may have stimulated with this post. You can reach me on my mobile at (317) 437-2555.
Gary G. Wise
Workforce Performance Advocate, Coach, Speaker
gdogwise@gmail.com
(317) 437-2555
Web: Living In Learning
LinkedIn
All true. However, I’d go beyond criticizing the individuals who hold onto their training toolsets like life vests to criticize the entire training organization within a business. Like most support organizations within a business, training teams often operate with an outside-in approach to performance improvement: let us come into your business unit and fix your problems… and, like everyone else, we’re going to operate in a silo, which prevents collaboration with other support teams (Engineering, IT, Purchasing, Quality, Safety, etc.), because we’re all competing for the same spoils: saviour of the business!!! This outside-in approach confuses the business units with support-team jargon and overloads them with unnecessary reporting, project management and other non-value-adding administrative work while never really addressing the root cause.
By contrast, inside-out approaches like Lean Six Sigma guide the business units through assessment of their own data and then involve the support teams only as necessary to either dig deeper into the data or to support the development of solutions. In this manner the business unit is leading the way and bringing in support teams to work together, and because the business unit maintains ownership of the problem AND the solution, it deflates the inflated support team egos and forces the support teams to provide services that are truly value-added and not simply check-the-box remedies.
After years of pushing for it, our company has finally switched to this approach – a collaborative approach between the business unit and all support teams – and the results have been outstanding. So, to all those holding on to your “tried-and-true” toolsets, thinking that they’re a life preserver, I urge you to let go before they become an anchor that drowns you.
You raise a good point, James! I have Six Sigma in my background as well, and it [or most any other performance improvement discipline, for that matter] is a great approach for getting the Training side of the org intimately involved. Training never likes having initiatives come “over the wall” with not enough time to react. Been there, done that! But if Training is a partner on the cross-discipline project team, a better and more cohesive partnership exists. Plus, being in early means there may be assets of embedded performer support that can be rolled out quickly and tested coincident with IT Sprint team activity. Totally agree that “Inside-Out” is the preferred approach. Thanks for reading and sharing your comment, James!
G.