Post-Training Evaluation Strategies - Part 2
Curriculum architecture is core to your evaluation
Instructional designers often have arts degrees, but not I. My bachelor's degree in psychology is a science degree, and I earned my master's degree from the College of Engineering at Boise State University. While I don't deny the appeal of training materials with high production values, that's not where training solutions draw their primary value. A big part of training systems design is creating a clean line of alignment between organizational objectives, instructional content, and the architecture of your training systems. High-quality training systems all have one thing in common: their designers focus on alignment between business outcomes and the architectures of their solutions. This design-oriented approach also allows for tighter alignment when it comes time to implement your evaluation strategy. In this post, we will explore this topic as Part 2 of my three-part series on training program evaluation.
Design-thinking applied to training systems specifications
We training biz-types refer to this tight alignment between training systems design and business outcomes as "performance-based thinking." For my brothers and sisters in the field, this is nothing new, but for newcomers to the world of performance-based training, I'd like to take a moment to unpack the term. Performance-based thinking is rather simple and can be summed up by a great motto I credit to a Coast Guard admiral presenting at a conference I attended some years ago...
This simple statement does an excellent job of summarizing performance-based thinking: develop a training system that trains people on the skills you need to improve your organizational effectiveness. Whether your goal is to make money or save lives, performance-based thinking is the only way to invest your training budget wisely. I very intentionally quoted a Coast Guard admiral because the Coast Guard has been a long-standing proponent of performance-based training systems and has won multiple awards in this area.
Later in this blog post, we will talk about the Coast Guard's application of performance-based thinking to the development of their award-winning employee training system.
Performance-based training is not a new training philosophy
As early as the mid-90s, experts in the development of training systems advocated for a close link between business outcomes and the content of training solutions (Dick & Carey, 1996). Dick and Carey's 1996 book is considered a classic in the training field, though I am sure my more scholarly readers can suggest references that pre-date it by some years. Graduate students still learn Dick and Carey's strategic approach, and I advocate for this best practice in my work with clients. I point this out because I am not the only one talking about this topic; trust me, nothing I am saying is revolutionary. There are too many university-level texts that teach performance-based thinking for me to list, and the sheer volume of academic work on the subject is tremendous. The International Society for Performance Improvement (ISPI) and the Association for Talent Development (ATD) both offer certifications in the performance improvement discipline.
It's not just the Coast Guard that employs this strategic approach to training systems development. Lowe's Home Improvement, JetBlue, Vertex Pharmaceuticals, Amgen, Imperial Oil, and Lockheed Martin (just to name a few) have all received recent recognition from ISPI for applying performance-based thinking to their training and performance improvement solutions. Performance-based thinking is good common business sense.
If performance-based thinking is standard, why am I not familiar with it?
Well, I have a few personal theories about this:
Training professionals fail at selling performance-based training
Training and development professionals are typically bad at marketing and selling their philosophies, much like engineers, and that's no coincidence: many of them are engineers of a sort. However, this is not entirely their fault. Many organizations undervalue training despite knowing how critical it is. Still, a lack of gravitas on the part of training professionals is one of the biggest hurdles our field faces when selling the value of performance-based training.
Human systems are complex & messy
The systems we design are complex and involve human beings. Unlike a computer system that can quickly spit out performance metrics, human systems are not so easily instrumented. Precise and accurate data collection can be cost-prohibitive for some organizations, but there are economical ways to compensate for this limitation.
Business leaders overvalue the familiar lecture format
Business people tend to overvalue the lecture format we all know from primary, secondary, and higher education. Sadly, your average business professional doesn't appreciate how poor the general quality of their college education was. I can't tell you the number of times I've heard business leaders say, "I know how to develop a training program," only to have them describe a program that sounds like a bad TED Talk knock-off. That is not training; that is liberal arts education.
These three theories converge to produce a very natural human outcome: business leaders aren't aware that there is a better way to develop training systems. Because they don't know any better, they request solutions that match their experience and current understanding. They insist they know best, largely because they can't penetrate the techno-babble most training experts are notorious for using with clients. Sadly, this hubris represents the most challenging part of implementing performance-based training. Applying a performance-based training philosophy in an organization where your primary business sponsor has a lot of preconceived notions about training can be a real bear. I am fortunate that most of my clients have been open to learning about performance-based thinking, but I have also had a few clients who missed the boat entirely.
Well then, mister smarty-pants, how should we design training systems?
Earlier in this post, I used the term "curriculum architecture." Many other training systems developers use this term, but I first learned about the concept from Steve Villachica of Boise State University, who assigned my instructional design class Ruth Clark's (2008) Building Expertise: Cognitive Methods for Training and Performance Improvement. While Clark (2008) describes "instructional architectures," other experts in the field have defined curriculum architecture in other ways. Populouz describes a curriculum architecture in the following way:
I will write other posts that break down this definition in greater detail, but I want to start with the idea of performance and learning objectives.
Performance & learning objectives: the backbone of your training solution
Let's return to the U.S. Coast Guard and how they start the process of developing a training course. The Training Systems Standard Operating Procedure for the Coast Guard provides highly detailed instructions on scoping out a curriculum architecture. When I worked for the U.S. Coast Guard as a Senior Performance Analyst at the Recruit Training Center in Cape May, NJ, one of my first assignments was to develop a performance-based evaluation system for the recruit training program. I will detail my strategy for developing that system in Part 3 of this series, but for now, I want to explain how the performance-based architecture of their training system made it possible.
The Coast Guard requires each training systems designer to develop a set of Terminal Performance Objectives and Enabling Objectives, shortened to TPOs and EOs in everyday business conversations. When training systems designers start developing solution specifications, they always begin by interviewing stakeholders and exemplary employees we call Subject Matter Experts, or SMEs for short. These discussions have many outcomes, but one of the most significant is a list of TPOs and their subordinate EOs. Populouz uses this same format, but we've adapted it to support Agile project management strategies that rely on progressive elaboration and rapid iterations rather than front-loading all of the analysis for a given project.
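To make the TPO/EO relationship concrete, here is a minimal sketch of how you might capture that hierarchy as structured data. The class and field names are my own, purely for illustration; they are not official Coast Guard or Populouz terminology.

```python
from dataclasses import dataclass, field

@dataclass
class EnablingObjective:
    """A subordinate knowledge or skill item (EO) that supports a TPO."""
    statement: str

@dataclass
class TerminalPerformanceObjective:
    """A top-level, observable job task (TPO) distilled from SME interviews."""
    statement: str
    enabling_objectives: list[EnablingObjective] = field(default_factory=list)
```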
Let’s look at an example…
“As an aspiring yoga instructor, I need to select the best place for my camera so my cat can't photobomb my YouTube video.”
Let's explore an example of a high-quality performance objective and a sub-par one, along with the reasons the latter falls short. First, let's break down the high-quality example above.
Anyone familiar with Agile software development will recognize this format as a User Story. Much to the dismay of my training brothers and sisters, I use the User Story format for writing my performance objectives instead of more traditional formats because it accomplishes most of what a traditional three- or four-part instructional objective tries to achieve, but without the often forced and awkward wording. Specifically, a User Story does three things (see the sketch after this list):
It identifies the employee performing the work, which pins down the audience you need to analyze throughout the process, along with their motivations and even some essential characteristics
It defines the exact thing the performer needs to do to achieve something that adds value
It states the desired outcome of the trainee's work
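Because the template is so regular, you can even check drafts mechanically for all three parts. Here is a rough sketch, assuming the standard "As a..., I need to... so..." wording; the function and pattern are hypothetical, not part of any standard tool:

```python
import re

# The user-story template: "As a <role>, I need to <task> so <outcome>."
USER_STORY = re.compile(
    r"As an? (?P<role>.+?), I need to (?P<task>.+?) so (?P<outcome>.+?)\.?$",
    re.IGNORECASE,
)

def parse_performance_objective(story: str) -> dict:
    """Split a user-story-style performance objective into its three parts."""
    match = USER_STORY.match(story.strip())
    if match is None:
        raise ValueError("Story is missing a role, a task, or an outcome")
    return match.groupdict()

story = ("As an aspiring yoga instructor, I need to select the best place for my "
         "camera so my cat can't photobomb my YouTube video.")
print(parse_performance_objective(story))
# {'role': 'aspiring yoga instructor',
#  'task': 'select the best place for my camera',
#  'outcome': "my cat can't photobomb my YouTube video"}
```

If a draft objective fails this kind of check, that's usually a sign the author hasn't decided who performs the work or what value it delivers.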
With this simple sentence, I can tell a great deal about my yoga instructors. I know they need some support when identifying the correct camera tripod and mount for either a DSLR or mobile phone camera. They need some instruction on selecting a location with the appropriate lighting. I can tell they need some guidance on cat wrangling to keep Mr. Kitty out of the camera frame.
From here, I can draft my secondary learning objectives. Learning objectives are pretty simple: you list all the things someone would need to know, or the skills they need to master, to perform the work described in the performance objective. While the concept is simple, it can require a good deal of research to make sure you capture all the knowledge and skills that will enable your yoga instructor to create a kitty-free video. Let's take a look at some possible learning objectives for our example:
Plan your video according to YouTube's official recommendations for video production
Identify an available space with the best possible lighting conditions
Determine the best place for a camera tripod out of the reach of Mr. Kitty
Arrange the furniture and yoga mat to take full advantage of the lighting relative to the position of the camera
Identify the right camera tripod and mount for your camera model and the space in which you will record your yoga lesson
Use Mr. Kitty's favorite distraction activity to ensure he ignores the camera
I need to point out that, with the performance and learning objectives in this example, I can directly observe someone doing these things. I can stand in the room and watch our aspiring yoga instructor talk through space planning, write out a script, note the video settings YouTube recommends, and even hand Mr. Kitty a catnip toy to keep him happily occupied in another room. Observability is a critical feature of a well-crafted performance objective: you can't measure or evaluate that which you cannot observe. In Part 3 of this series, I will explain why this feature is essential when it comes time to assess the impact of your instruction. For today's example, though, I want to advocate for the value it provides managers. If you can't observe your employees doing something, then you can't coach or mentor them to a higher level of on-the-job performance.
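Continuing the dataclass sketch from earlier, the yoga example might be captured like this, with the user story as the TPO and a few of the observable learning objectives as its EOs (again, the names are mine, for illustration only):

```python
yoga_tpo = TerminalPerformanceObjective(
    statement=("As an aspiring yoga instructor, I need to select the best place "
               "for my camera so my cat can't photobomb my YouTube video."),
    enabling_objectives=[
        EnablingObjective("Identify an available space with the best possible lighting conditions"),
        EnablingObjective("Determine the best place for a camera tripod out of the reach of Mr. Kitty"),
        EnablingObjective("Use Mr. Kitty's favorite distraction activity to ensure he ignores the camera"),
    ],
)

# Each statement starts with an observable action verb, so a manager
# could stand in the room and check off each one as it happens.
for eo in yoga_tpo.enabling_objectives:
    print(f"[ ] {eo.statement}")
```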
How to screw up a performance objective by the numbers
Now let's take a look at an all-too-typical example of a poorly crafted performance objective.
Understand the value of proper space selection when shooting a video.
Understand YouTube's official recommendations for video production
Appreciate the impact lighting conditions have on a video recording
What are the advantages of different camera tripods?
How furniture placement impacts your lighting
Five tips for selecting the best tripod
Appreciate the value of cat distraction activities
Let's pick this one apart.
First, good performance and learning objectives start with action-oriented verbs that are directly observable and define performance in ways multiple people can agree will result in value for the organization. In the example above, we are missing the person doing the work, the conditions under which the employee will perform the work, and the desired outcome driving the work in the first place.
Second, you can't observe someone understanding the value of anything because understanding is an ephemeral cognitive function that is subjective to the individual.
Third, interrogatives are not appropriate for learning objectives since they describe neither the needed performance nor what the employee should be capable of after they finish training.
Fourth, I shudder to think that any instructional designer would format a learning objective as the title of a listicle-style blog post (e.g., Top 5 Cardinal Sins of Instructional Design).
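Because observability lives in the verbs, you can even screen drafts mechanically before a human review. Below is a rough, illustrative filter; the verb lists are my own and nowhere near exhaustive:

```python
# Verbs that describe internal states no one can directly observe.
FUZZY_VERBS = {"understand", "appreciate", "know", "learn", "grasp"}
INTERROGATIVES = ("what", "why", "how", "when", "which", "who")

def critique_objective(objective: str) -> list[str]:
    """Return a list of problems with a draft learning objective."""
    problems = []
    lowered = objective.strip().lower()
    first_word = lowered.split()[0] if lowered else ""
    if first_word in FUZZY_VERBS:
        problems.append(f"'{first_word}' is not an observable action verb")
    if lowered.startswith(INTERROGATIVES) or lowered.endswith("?"):
        problems.append("interrogatives describe questions, not performance")
    return problems

drafts = [
    "Understand the value of proper space selection when shooting a video.",
    "What are the advantages of different camera tripods?",
    "Identify an available space with the best possible lighting conditions",
]
for draft in drafts:
    print(draft, "->", critique_objective(draft) or "looks observable")
```

A filter like this will never replace a skilled designer's judgment, but it catches the most common sins before anyone wastes a review cycle on them.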
I am going to wrap up Part 2 of my three-part blog post by saying that well-crafted performance objectives are the key to ensuring performance-based thinking dominates the design of your training system. In Part 3 of this series, I will explain the strategy of using performance objectives as the basis for behavioral surveys that can help training managers determine if their training programs are producing value.
References
Clark, R. C. (2008). Building expertise: Cognitive methods for training and performance improvement. John Wiley & Sons.
Dick, W., & Carey, L. (1996). The systematic design of instruction (4th ed.). Harper Collins College Publishers.