PERFORMANCE SUPPORT: Questions, Questions & More Questions

Momentum toward accepting the validity of performance support has been a long time coming, as evidenced by a dramatic increase in articles advocating more L&D attention to performance outcomes. Unfortunately, that attention is still largely skewed toward innovative learning solutions in the training environment. That’s not altogether a bad thing; however, it’s not enough when the measurable results we seek are found at Point-of-Work, where the key driver is effective application of performance support. Methinks part of that dilemma is a result not of the discovery questions asked, but of those unasked.

If the assessments we conduct are limited to the knowledge and skills required to do the job, that’s all we will ever be able to confirm. What we often overlook are questions about why the job is not being accomplished, and to discover that information, different assessment questions must be asked.

I can’t give away all of my Point-of-Work Assessment (PoWA) secrets, since those are the questions I teach L&D teams to ask, but there are a few I’d like to share that frame what should be asked to confirm that performance support is not viewed by L&D simply as a post-training support mechanism.

Additionally, these questions can help determine whether the organization and L&D are merely “ready” to pursue performance support integration or are at a “state of readiness” to create a sustainable strategy and tactics. To that end, the questions that follow help confirm whether performance support should be integrated across the entire Learning Performance Ecosystem, from Point-of-Entry to Point-of-Work. See Figure 1.


Figure 1

Typically, I would not ask these questions until an executive-level presentation and discussion have taken place. They serve as discussion triggers to define and align interest in pursuing the PoWA within targeted work groups. Following are several qualifying questions to consider…

Environmental & Cultural Alignment

  • Across industry segments we see a shift toward a more learner- and performer-centric paradigm to create sustainable workforce capability. This shift requires moving beyond reliance on traditional training programs and methodologies as the sole means of equipping the workforce to perform effectively at the post-training Point-of-Work. For example – How would you assess [company name]’s willingness to consider a shift beyond the long-held training traditions we just discussed?
  • There must be an interest in adopting a performance support integration strategy, or this opportunity to meet would not have happened. For example – What business driver(s) prompted the interest in considering performance support integration?
  • What do you hope to accomplish by adopting a performance support strategy?

Obviously, there are more questions to quantify motivation; however, the intent here is to confirm that the interest is backed by measurable targets to serve as evidence of successful impact.


Role & Workforce Alignment

  • The intent in this phase of the PoWA is to identify the diverse role synergies across the Learning Performance Ecosystem, to target primary and secondary roles where performance sustainability challenges exist, and to prioritize a viable pilot program to validate larger implementation. As an example – In what operational areas do you feel training programs are not consistently enabling sustainable performance?
  • Keep in mind the individual workforce performer is not our only concern. For example – How are managers, coaches, mentors, and Help Desk staff equipped to support the workforce at the Point-of-Work?

Again, there are more questions to ask, but the intent is to shift the thinking to performance and what happens at Point-of-Work in a post-training environment. I don’t dwell on required skills or competencies at this point; that comes later in subsequent PoWA interviews.


Process & Methodology Alignment

  • The intent in this phase of assessment is to identify the internal processes and L&D methodologies used for learning, the identification of performance challenges, the prioritization of intervention decisions, and other process-related workflow challenges. For example – How are performance challenges identified and prioritized for action, and what is L&D’s primary role?
  • The workflows and methodologies employed by L&D are critical to understand, especially since the PoWA is targeting the readiness of the L&D function. For example – Describe the documented, agile L&D design methodology currently in use.

Notice I’m not asking this audience for specifics related to broken workflows. The focus is on how L&D accomplishes discovery to identify those challenges and what they do when they uncover them. More specific workflow questions come later.


Content Alignment

  • The intent in this phase of assessment is to identify the current state of how legacy content is used across multiple forms, ranging from formal training to informal learning, information, and any other business resources that must be accessible to support performance at the Point-of-Work. For example – Describe the extent to which training content is designed to accommodate moments of need at the Point-of-Work.
  • Determining L&D’s current state of content utilization and creation is an important step. For example – Which agile design methodology is currently utilized, and is it documented?

There are additional questions to ask here, but they are more appropriate for a less senior audience engaged directly in solution creation.


Technology Alignment

  • The intent in this phase of assessment is to identify the technology footprint [inventory] in place today and the specifics of the future roadmap. This inventory includes network accessibility as well as mobility platforms, social learning, communities of practice, and collaboration capabilities. For example – When you consider your inventory of enterprise business applications used in the execution of business workflows, what does the future roadmap look like in terms of changes, additions, consolidations, replacements, etc.?
  • Organizations spend an incredible amount of time and money building and optimizing enterprise software systems…but what about the people who must utilize them all to drive performance outcomes? Not so much. For example – What issues do you feel exist across this collection of systems where your end-user population is most challenged?

Yes, there are more critical questions to ask, but you must realize that what we see as current state remains current only until something changes. And when the change involves an enterprise system, the L&D function gets stressed big time. The line of questioning here confirms that a more comprehensive L&D discovery is necessary if change is on the horizon.


Metrics & Analytics Alignment

  • The intent in this phase of assessment is to identify how metrics [KPIs] and performance analytics are determined as evidence of competent performance, and how measurement is accomplished across all learning performance support activities. For example – Describe how training impact is routinely evaluated today. Does that provide actionable data? If not, what would be desired?
  • Training activity metrics like butts-in-seats and completion data do nothing to confirm performance results. We need to confirm whether performance results are even on the radar. For example – Describe the extent to which key business metrics are applied or linked to training results.

Obviously, there are more questions here as well, but defining the extent to which performance analytics are used will reveal how much KPI discovery falls into L&D’s lap during their assessment efforts.

Closing Thoughts

L&D’s ability to drive sustainable workforce performance is only as good as the measurable business outcomes derived from what is generated at Point-of-Work…or compromised…or lost outright. Effectiveness with learning performance solutions at Point-of-Work is only as good as the accuracy of our identification of current state and the degree of readiness to utilize and fully adopt the solution. The PoWA methodology I champion serves multiple purposes:

  • A tool to assess readiness in the L&D team to adopt a learning performance paradigm in terms of design, development, delivery, and technology utilization
  • A tool for L&D to utilize in shaping appropriate solution responses to large and small training requests
  • A tool to confirm readiness and solution design for enterprise-wide change initiatives
  • A tool to confirm workforce readiness to effectively utilize new technology implementation to the point of sustainable full adoption

In other words, the PoWA methodology is repeatable, scalable, and an essential tool to have in the L&D toolbox. Several great agile design methodologies exist in which performance support plays an integral role, such as the Five Moments of Need and 70:20:10. The PoWA represents a solid front-end calibration of current-state readiness and feeds nicely into the agile design/development models of your choice.

One last question:

Is your L&D team “ready” to pursue adoption of a learning performance paradigm
or at a “state of readiness” to adopt and sustain one?

If you would care to discuss in more detail, I’d be happy to devote time and conversation. Until then, take good care!


Gary G. Wise
Workforce Performance Advocate, Coach, Speaker 
(317) 437-2555
Web: Living In Learning