Yikes! Now that’s a visual. Maybe avalanche would be appropriate too. Putting metrics and measures into the hands of those unable to answer the question “What does sustained workforce capability look like?” may very well unleash something we cannot outrun. Approaching data analytics with an eye toward extracting actionable information is critical; otherwise, those tasked with making informed decisions can quickly drown in data. Are we chasing after data as much as we are chasing Evidence of Sustained Capability (EOSC)? What data represent EOSC?
Over six years ago I posted “ROI vs. EOSC – Evidence of Sustained Capability” after surviving a data-driven exercise to produce ROI. What struck me wasn’t the reasons for chasing ROI…projecting it, actually…but what didn’t happen once we did. It turned out to be a ritual check-the-box exercise to release the funds. Methinks the covert thinking was to ensure there was somebody to hang if the promised ROI did not materialize. It did, so I’ll never be able to confirm the darkness of my underlying fears. I realize there may be a few bean-counters out there enduring a pucker moment right now, but hear the message I hope comes through in this post: “We don’t have to measure everything just because we can – have a plan tied to action.”
I introduced a couple of new acronyms in the post linked above, and with the current rage for data analytics and dynamic dashboards, I felt it was a perfect time to toss a consideration into the excitement. Here are the two acronyms:
ROEDT – Return on Every Damn Thing
ROWRM – Return on What Really Matters
I won’t rehash the earlier post here, so click the link if you want to dig deeper. My caution here is simply this: having the ability to measure “everything” can easily go to the heads of those at the controls. Don’t misunderstand me…we absolutely need the power of data to inform and drive actionable business information. What we also absolutely need is the ability to articulate answers to a few initial questions:
· What are we measuring?
· Why are we measuring it?
· What results are we looking for?
· What are we going to do with the results?
· And…what results should trigger actionable decisions?
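The questions above amount to a repeatable structure: a measure, a rationale, a target, a planned action, and a trigger. As a purely illustrative sketch (the class, names, and example metric below are my own hypothetical inventions, not part of any DRIVER tooling), capturing each measure as one record of an evidence plan might look like this:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class EvidencePlanEntry:
    """One measure in a repeatable evidence plan (hypothetical structure)."""
    measure: str                      # What are we measuring?
    rationale: str                    # Why are we measuring it?
    target: float                     # What result are we looking for?
    action: str                       # What will we do with the result?
    trigger: Callable[[float], bool]  # What result triggers the action?

    def evaluate(self, observed: float) -> str:
        """Return the planned action if the observed result trips the trigger."""
        return self.action if self.trigger(observed) else "no action required"

# Hypothetical example: track first-contact resolution rate at Point-of-Work.
entry = EvidencePlanEntry(
    measure="first-contact resolution rate",
    rationale="proxy for sustained workforce capability on the support desk",
    target=0.85,
    action="review workflow guidance for the affected task",
    trigger=lambda observed: observed < 0.85,
)

print(entry.evaluate(0.78))  # below target: planned action fires
print(entry.evaluate(0.91))  # at/above target: no action required
```

The point of the sketch is only that every measure carries its action with it before collection begins; anything that cannot fill in all five fields is a candidate for ROEDT rather than ROWRM.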
These few questions are by no means the only ones we need to answer, but they form the basis for a thoughtful evaluation plan specific to the business impact of Learning Performance Solutions. In fact, the “E” Evidence in the DRIVER Discipline represents the executable component of what the “D” Discovery reveals about “What really matters” at Point-of-Work. The discovery I reference includes identifying all systems utilized at Point-of-Work and all the potential data sources that are accessible. Throw xAPI and Learning Record Stores (LRSs) into the mix as well. And now we have cloud-based Point-of-Work Performance Guidance Systems enabling the ability to track most (if not all) performance data relative to actual workflows that drive business impact. Is anyone else hearing tsunami warning bells clanging?
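For readers who haven’t met xAPI, each tracked event is a small JSON “statement” in actor–verb–object form sent to an LRS. A minimal sketch follows; the statement shape follows the xAPI specification, but the person, activity ID, and LRS endpoint named in it are hypothetical placeholders:

```python
import json

# A minimal xAPI statement (actor-verb-object).
# The actor, activity ID, and task name are hypothetical examples.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Pat Example",
        "mbox": "mailto:pat.example@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "https://example.com/activities/reset-customer-password",
        "definition": {
            "name": {"en-US": "Reset customer password (Point-of-Work task)"},
        },
    },
}

# In practice this JSON would be POSTed to an LRS statements endpoint,
# e.g. https://lrs.example.com/xAPI/statements (hypothetical URL).
print(json.dumps(statement, indent=2))
```

Notice how granular this is: one statement per action, per worker, per workflow step. Multiply that across every system at Point-of-Work and the tsunami metaphor starts to feel less like hyperbole.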
Pandora’s box-o-data has just been opened, and there is no closing it. Now we need to focus on dealing with the tsunami of data flowing in our direction. There should be a repeatable Evidence Plan: a strategic re-think and discipline, backed by proven tactics, to plan in advance what to measure and why before we begin collecting it. Going forward, I’d like to know that no one gets flooded out and that shouts of “Surf’s Up!” frame the reaction to the impending flood instead.
That’s my contribution for this fine Friday the 13th.
As always, I welcome comments and thoughts for or against my ramblings at any time.
Take good care!