POINT-of-WORK: Strategic Transformational Change vs. Tactical Redeployment


For the last ten years of conference presentations, I’ve walked away from each one buoyed by positive reinforcement…live comments and strong post-session evaluation scores. Receiving verbatim comments on evaluations like “Best session of this conference!” makes a body feel good…especially after being plugged into a last-day time slot. That kind of feedback tells me the message promoting the pursuit of adopting a Point-of-Work Discipline is the right one. However, I’ve decided the right message is being shared with the wrong audience. Not “wrong” in the sense of the session participants’ tactical ability or their capacity to understand the message, but wrong from the perspective of the audience NOT being empowered to drive strategic transformational change.

Transformational Change, based upon a strategic shift of this magnitude, impacts the entire enterprise, not just L&D, and requires active, engaged sponsorship that enables empowerment beyond internal ideation. The Point-of-Work Discipline involves five phases (see Figure 1):

  • Discovery – holistic discovery across the learning performance ecosystem to enable effective solution design
  • Deploy – incremental deployment of tested and validated solutions into the workflow
  • Implement – implementation through supporting applications in the workflow
  • Adopt – adoption of changes where routine usage integrates with daily work
  • Sustain – protocols and processes in place to maintain changes critical to promoting currency

Figure 1: Point-of-Work Dynamics

In an attempt to target a higher-level audience where strategic transformational change is routinely sponsored and then promoted as part of a blended solution strategy mix, I submitted a query to the CLO Symposium series. Mike Prokopeak responded with a message confirming my query topic was “on target for his audience and timely,” but because I was considered a vendor, my query was respectfully declined. I respect CLO Magazine’s position not to include vendors in the mix of speakers, and while Mike’s response was disappointing, it also validated the topic of Point-of-Work as being on the mark. With the right ears, the right scope of authority, and vision, the Point-of-Work message will be better positioned to plant seeds of Change critical to the evolution of strategy.

Are there tactical implications? Absolutely, but tactical redeployment of skill sets, discovery/assessment methods, and potential/eventual inclusion of productivity acceleration technology to accommodate scale is, without strategic backing, a recipe for a difficult and/or failed initiative. Strategic commitment must open the door to Change and empower the tactical implications in a phased adoption initiative.

From a strategic point of view, Point-of-Work should not be viewed as a tactical destination for learning solutions; rather, Point-of-Work should be embraced as a tipping point for Value Generation…generation of sustainable workforce performance…generation of business value…or scrutinized as a place where value generation is compromised by deficient performance…or restrained productivity…or tangible loss…or business liability…all of which have the potential to directly impact the bottom line. The good, bad, and ugly only manifest at Point-of-Work, and a sound strategy should be an enabling strategy with evolved tactics and technology as defined by the unique attributes of the ecosystem.

Is it as simple as evolving strategy alone? I think not. There are cultural and environmental engagement interdependencies interwoven through any transformational change. These interdependencies exist in every dynamic learning performance ecosystem…and yes…every organization owns a dynamic learning performance ecosystem. The question to consider is “How optimized is ours?”

We (L&D) have made enormous strides with learning by shrinking and speeding deployment with micro-learning, integrating adaptive learning, improving the learning experience…chasing learning in the workflow (which is still struggling to be properly implemented by some)…so there’s no question that L&D has the learning part of the ecosystem nailed.

However, we are not scoped, chartered, or equipped to optimize a dynamic learning performance ecosystem. Why? Because emphasis and tactics remain fixed on learning solutions. We create “potential” with learning…but we generate “business results” from performance at Point-of-Work. And we enable performance by addressing moments of need that accelerate productivity…by reducing errors…and eliminating redundancies…and more…to ensure delivery of measurable performance outcomes.

Does learning take place while working? Certainly, it does, but it’s not the end-game – performance results are the end-game – and if we do not achieve the results what value is there in perfectly architected learning? Are the assets and support designed as learning? Not always. That’s precisely why I’ve coined the phrase learning performance ecosystem…not learning AND performance ecosystem. I’m suggesting we use learning as a VERB not a NOUN. We are “learning performance” by performing…at Point-of-Work and during moments of need. We should be enabling Performance first and foremost. We cannot split the two…we must enable convergence with the emphasis being squarely anchored to and measured by performance outcomes. Productivity Acceleration Technology does this efficiently and effectively when adoption scales.

Some might say I’m splitting hairs here, and maybe I am, but what I’m NOT splitting is learning from performance. If learning turns out to be a by-product of success in the workflow, great. Methinks this is about perspective and intent…and that, to me at least, means being zeroed in on Performance results, and it’s a mindset we should strive to adopt.

Here’s a perfect example of the true source of productivity restraint falling outside the scope of a requested learning solution. In a recent workshop, the performance support specialist I worked with received a task to improve training for the technical support function. Her assessment revealed a number of recommendations…some training-related and some not. One of the biggest obstacles restraining productivity had nothing to do with deficient knowledge or skills; it turned out to be an organizational design issue compromising a workflow with unnecessary delays…which in turn impacted client response time…which in turn led to degraded customer satisfaction scores…which led to job frustration of very capable knowledge workers…which led to turnover…which led to a recruiting scramble to back-fill highly-talented and skilled roles with newbies lacking the critical IP knowledge that just walked out the door. Talk about a domino effect…interdependencies do that all day long, and too often they are perpetuated by a well-intended request for more training.

Did the assessment reveal the need for additional learning? Surely it did, but in the scope of this Specialist’s new role, she prevented failure of a requested learning solution whose success would have rested upon the expectation that training would fix poor performance in the target group. Yes, there would likely have been some incremental improvements, and the Learning Box would get a checkmark, but the Performance Box would likely not fare so well…certainly not be optimized. And what are the chances of that restraint being overlooked if strategy and tactics were focused only on the learning part of the solution? That eventuality is what kills us if we have not enabled holistic assessment of the entire ecosystem’s performance attributes.

This example clearly shows how the interdependencies in a dynamic ecosystem set up a potential trap if a performance mindset fails to engage assessment to learn why performance was restrained in the first place. Would the OD intervention in the example even have been uncovered? Does L&D own OD redesign? Nope…but the Point-of-Work Assessment (PWA) revealed a trigger to bring OD specialists into the solution mix. This points to another facet of interdependency…including other resources and specialties who collaborate to optimize an ecosystem solution…not just a learning solution.

I’ve experienced this phenomenon myself, where a training request changed dramatically AFTER the PWA revealed that part of the performance restraint required a Six Sigma team to pursue process improvement interventions…and…a 3rd-party attitudes & values survey that flagged the presence of cultural challenges…and…development of a Leadership Academy to embed a repeatable Change Leadership methodology…and yes, there turned out to be additional training, but even the majority of that training was born specifically out of what the Six Sigma process redesign required.

Closing Thoughts

Right message. Wrong audience. That’s not a statement of L&D’s failure…it’s mine…I own it…for not seeing it sooner. I’m toying with changing the title/label/role of Workforce Performance Strategist that I’ve carried since my Xerox Global Learning Strategy Group days. I don’t want to be seen as a Vendor because I have no product to sell…though productivity acceleration technology may be a future requirement. I have no service or software to sell…beyond a PWA workshop and coaching. I’m flirting with Change Agent as a better fit…maybe Advocate for Knowledge Worker Success. That’s accurate but likely too many words…who knows, just don’t call me a vendor…

Whatever the role/label becomes, the mindset will not change…I’ll stick with Learning Performance Impact at Point-of-Work as the ultimate ground zero for my obsession.

Thanks for reading, and as always, if you have comments or ideas, please share. If questions or clarifications are needed, just ping me.

Take good care!


Gary G. Wise
Workforce Performance Advocate, Coach, Speaker
(317) 437-2555
Web: Living In Learning