As we progress through the Point-of-Work Assessment, previous snippets have considered the performance attribute clusters of Environment & Culture, People & Capability, and Workflows & Processes. Hopefully, you’ve seen a degree of overlap and interdependency among these three clusters. This snippet describes the importance of the Content & Resources the Knowledge Worker relies upon to optimize individual performance at the task-centric and role-specific levels. Assessing attributes across critical Content & Resources overlaps directly with Workflows & Processes, although it is more specific to efficient accessibility and effective application at the Moment of Need at the Point-of-Work. Why? Would you stop at assessing the toolbox and not include the tools inside?
The first consideration of significance deals with the form Content & Resources take. When we consider how Knowledge Workers accomplish their jobs, we must determine how readily they can get their hands on the right assets (tools) at the right time…going back to a couple of the 7-Right Things mentioned earlier. Not everything on that list of tools is content-based. Consider:
- Job aids and checklists
- Contextually-delivered step-by-step instructions – Pulled at Moment of Need
- Contextually-delivered step-by-step instructions – Pushed when things have changed, or limits have been exceeded or overlooked
- Policy documents, Methods & Procedures (M&Ps), or Standard Operating Procedures (SOPs)
- Compliance Guidelines, Rules of Engagement
- Direct contact to Subject-Matter Experts and/or Business-Matter Experts
- Collaborations with peers, project team members
- Active Supervision and Feedback Loops
- Confidence in asset currency and accuracy
You may be able to think of others, but regardless of what may be missing from the list above, several commonalities apply to all of them, as shown in Figure 1.
Legacy Content & Resources
A current-state Content & Resources inventory establishes a benchmark of what is in place and in use today. Several questions come to mind:
- Are these the right assets?
- What’s missing?
- What should be added?
- What’s not useful?
Of the 7-Right Things, you may recall that Accessibility is at the top of the list. What good are the right assets if they are not readily accessible? What other factors restrain productivity, or waste it outright, when Knowledge Workers try to access the assets?
- By whom should they be accessible?
- What are the work conditions and degree of urgency at the Moment of Need?
- How difficult is searching for and finding the right assets?
- How efficient is interfacing/collaborating with the right human assets?
- Does the right technology exist to enable access at the Moment of Need?
- What is the cost to the business when access is restrained or nonexistent?
At the Moment of Need, there is typically an implied urgency to APPLY the asset to resolve the Moment. What are the business implications when that urgency is not factored into the design and delivery of the assets? Do we need instructions for building a watch simply to tell what time it is? Relevance to telling time is one thing, but what about relevance to telling time under the physical, geographic, or network-connectivity constraints of the workflow?
Task-level centricity is at the core of Intentional Design, where intentionality is framed by the task(s) to be accomplished and the conditions under which they must be accomplished. The content design of the resource must be formatted for, relevant to, and in lockstep with the task, and that cannot happen without first determining the nature and complexity of the task at the Point-of-Work.
At first glance, Effectiveness sounds much like Relevance, and in some respects they are related; however, Effectiveness is borne out by something else – Results. Effectiveness may also be measured by active feedback loops between the Knowledge Workers attempting to APPLY the assets at the Moment of Need and the Asset Owner(s). Does that feedback loop exist?
Here we see the interdependency with the Impact & Analytics and Systems & Technology attribute clusters. Do we have visibility into asset utilization that is correlated with actual performance at the Point-of-Work? Is that level of visibility even possible in the current state?
Sustainability is a big one. We do a yeoman’s job of creating content and providing resources, but do we have effective Sustainability protocols in place?
- What is the current process of updating content? How long does that take?
- How many generations (versions) of the original content exist and where are they?
- How much of the same content is embedded in PDFs, PPTs, training content, etc.?
- How are changes communicated to the Knowledge Workers?
- Are notifications pushed directly into contextual workflows to protect time on-task?
- Do existing resources need to be pulled out of service to perform updates?
- How many approval/review gates do updates have to clear before redeployment?
- What’s the cost to the business when updates to outdated content are delayed?
- Does technology exist that enables rapid development and updating of these assets?
- Are we able to measure if our sustainability protocols are even sustainable?
Pretty scary, huh? Such are the dynamics of Learning Performance Ecosystems. Hopefully, this shines a light on the urgency to move beyond the limits of our Training Paradigm, given that most, if not all, of these moving parts related to Point-of-Work are downstream, post-training, and out of scope. And hopefully, it becomes more apparent that adopting a Learning Performance Paradigm is a viable way to broaden our scope to include the dynamics of the Point-of-Work.
Thanks again for reading. As always, I welcome thoughts, comments, ideas, and/or push-back as you are so moved.
Take good care!