
Evidence-Informed Learning Design

13 min

Creating training to improve performance

Introduction

Narrator: An article appears online titled, "How to Design Learning Experiences for Millennials." It claims this generation of "digital natives" learns differently, preferring hands-on experiences and, above all, video. It even cites a study claiming video is 80% more effective than any other medium. For a learning professional trying to engage a younger workforce, this sounds like a godsend—a clear, actionable roadmap. But a closer look reveals vague language, emotional appeals, and hyped-up claims with no verifiable evidence. The "studies" are from vaguely named research centers, and the advice is built on a foundation of stereotypes. This kind of plausible-sounding but ultimately hollow advice is a pervasive problem in the world of corporate training and education. In their book, Evidence-Informed Learning Design, authors Mirjam Neelen and Paul A. Kirschner argue that the learning profession is built on a cracked foundation, and the only way to fix it is to stop relying on myths and intuition and start building with the hard-won evidence from the learning sciences.

The Cracked Foundation: Why the Learning Profession Must Look in the Mirror

Key Insight 1

Narrator: The authors deliver a stark diagnosis: the learning and development (L&D) profession is "cracked at its foundation." For decades, a vast body of scientific research has uncovered how people actually learn, yet these findings are frequently ignored in practice. Instead, the field often chases trends, engages in superficial debates over terminology, and relies on intuition and deeply entrenched myths. A prime example of this disconnect is Project Follow Through, a massive US government study from the 1970s. It was designed to find the most effective teaching model for disadvantaged children. The results were overwhelmingly clear: a method called Direct Instruction, which involves carefully planned, teacher-led lessons, produced the best results in academic achievement and cognitive skills. Yet, due to ideological biases in favor of discovery-based learning, the results were largely ignored. This story illustrates a core problem: even when presented with strong evidence, the profession often defaults to pre-existing beliefs. Neelen and Kirschner argue that to be effective and credible, learning professionals must stop closing their eyes to reality and start making decisions based on objective facts, not just opinions.

Beyond Buzzwords: Adopting an Evidence-Informed Mindset

Key Insight 2

Narrator: To fix the foundation, the authors advocate for an "evidence-informed" approach, drawing a crucial distinction from an "evidence-based" one. The term "evidence-based" comes from medicine, where it's famously described as a three-legged stool: the best available research, the professional's clinical expertise, and the patient's values and preferences. While this is the gold standard, learning environments are far more complex and messy than a clinical trial. It's difficult to control all the variables, and success isn't always easy to measure. An "evidence-informed" approach is therefore more realistic. It still keeps scientific research at its core, but it acknowledges the need to blend that evidence with the practical wisdom of the learning professional, the specific context of the organization, and the needs of learners and stakeholders. It's about using evidence to make better decisions, not to find a single, perfect formula that works every time for everyone.

The Truthiness Sieve: How to Separate Fact from Fad

Key Insight 3

Narrator: In a world of "truthiness"—ideas that feel true but lack factual support—learning professionals need a reliable filter. The authors introduce Stephen Gorard's "sieve," a framework for judging the trustworthiness of research. This involves asking critical questions about a study's design, scale, dropout rates, and data quality. For example, a company might publish a study claiming that making learning content available on mobile devices increased customer satisfaction by 25%. On the surface, this sounds impressive. But using the sieve, a critical professional would ask: Was there a control group that didn't get mobile access? Was the sample size large enough to be meaningful? Did many people drop out of the study, skewing the results? Was customer satisfaction measured in a reliable way? And most importantly, did the researchers check if the employees in the intervention group actually used the mobile content? Without solid answers to these questions, the 25% claim is just a seductive number, not trustworthy evidence.
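
To make these questions concrete, here is a minimal sketch, written for this summary rather than taken from the book, of how that mobile-learning claim could be scored. The dimension names, the 0-4 scale, the example scores, and the rule that the weakest dimension caps the overall rating are a simplified reading of how a sieve-style checklist works, not code or criteria from Gorard or the authors.

```python
# Illustrative only: a sieve-style trustworthiness checklist for the
# hypothetical "mobile learning raised customer satisfaction by 25%" study.
from dataclasses import dataclass

@dataclass
class StudyRating:
    design: int        # 0-4: was there a fair comparison, e.g. a control group?
    scale: int         # 0-4: was the sample large enough to be meaningful?
    dropout: int       # 0-4: how much attrition was there, and was it reported?
    data_quality: int  # 0-4: was customer satisfaction measured reliably?
    fidelity: int      # 0-4: did employees actually use the mobile content?

    def overall(self) -> int:
        # A finding is only as trustworthy as its weakest element, so the
        # lowest dimension score caps the overall rating.
        return min(self.design, self.scale, self.dropout,
                   self.data_quality, self.fidelity)

# Hypothetical scores for the study described above.
mobile_study = StudyRating(design=1, scale=3, dropout=2,
                           data_quality=1, fidelity=0)
print(mobile_study.overall())  # 0: the 25% claim carries little weight
```

Scored this way, the weakest link becomes visible: if nobody checked whether employees actually used the mobile content, the headline number cannot be trusted, no matter how impressive it sounds.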

Slaying the Zombies: Debunking Persistent Learning Myths

Key Insight 4

Narrator: Learning myths are like zombies: they are faulty ideas that refuse to die, no matter how much evidence is thrown at them. The book tackles several, including the myth of learning styles and the idea that Google can replace human knowledge. The latter is particularly dangerous. While search engines provide incredible access to information, they cannot replace the deep, domain-specific knowledge needed for complex skills. Consider Barry, a consultant working with a shipping company, Sailing Home, to solve a deep-seated lack of trust in their supply chain. Barry can't simply Google "how to build trust." He needs to analyze the industry, interview partners, conduct a root cause analysis, and facilitate difficult conversations. This requires a rich, integrated web of knowledge about supply chains, business relationships, and human psychology—something a search engine cannot provide. True expertise is built in the brain, not stored in the cloud.

Designing for Impact: The Three-Star, Whole-Task Approach

Key Insight 5

Narrator: So what does good design look like? The authors propose a "three-star" model: learning experiences should be effective (they achieve the goal), efficient (they don't waste time), and enjoyable (they are motivating). To achieve this, they advocate for a "whole-task" approach. Instead of breaking skills down into isolated fragments, training should be built around authentic, real-world tasks. A powerful example comes from the European Patent Office (EPO), which needed to train new patent examiners. Instead of teaching rules in a vacuum, the designers first studied what expert examiners actually do. They discovered experts don't read applications linearly; they "smell" them, looking for key elements. The new training program was built around a sequence of real patent applications, starting simple and growing in complexity. Trainees worked on these whole tasks, supported by a mix of formal instruction and on-the-job guidance from supervisors. This holistic approach ensured that learners were not just memorizing information, but building the integrated skills needed for the job.

The Essential Ingredients: Proven Techniques for Real Learning

Key Insight 6

Narrator: Effective learning design relies on a pantry of proven ingredients. The book highlights several, including worked examples, spaced learning, and retrieval practice. A worked example is like a recipe: it provides a step-by-step guide to solving a problem, which is incredibly effective for novices. Spaced learning counters the natural human tendency to forget, as described by the Ebbinghaus forgetting curve: revisiting information in short, repeated sessions spread out over time strengthens the memory trace. Finally, retrieval practice, the act of actively pulling information out of memory through a quiz or self-explanation, is far more powerful for long-term retention than simply re-reading or highlighting. These ingredients aren't fads; they are simple, evidence-backed techniques that dramatically improve learning outcomes.
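
For readers who want the spacing idea in slightly more formal terms, the sketch below uses a common exponential approximation of the forgetting curve from the memory literature. The formula, the symbols R, t, and S, and the treatment of spaced retrieval as an increase in S are illustrative conventions, not equations taken from the book.

```latex
% Illustration only: a common exponential approximation of the forgetting curve.
% R(t) = probability of recalling the material after a delay t,
% S    = stability ("strength") of the memory trace.
R(t) = e^{-t/S}
% Spaced retrieval is modeled here as increasing S with each successful review
% (S_1 < S_2 < S_3 < \dots), so the same delay t produces a smaller drop in R
% after every revisit. That is why short, repeated sessions beat one long one.
```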

The Learner in the Driver's Seat: Navigating Self-Directed Learning

Key Insight 7

Narrator: Ultimately, the goal is to foster learners who can direct and regulate their own development. The authors distinguish between Self-Directed Learning (SDL), which is the macro-level process of setting goals and choosing tasks, and Self-Regulated Learning (SRL), which is the micro-level process of planning, monitoring, and adjusting during a specific task. A simple example from a hairdressing program clarifies this: a student's SDL goal might be to "improve blow-drying on long hair." The SRL process is how they execute that task: orienting to the hair type, planning the steps, monitoring their work, and making adjustments. None of this is easy, however. Many people are held back by the Dunning-Kruger effect: they are too unskilled to recognize their own incompetence, which makes it very difficult to plan or monitor their learning effectively.

From Independence to Interdependence: Scaffolding the Self-Directed Learner

Key Insight 8

Narrator: Because self-direction is so difficult, the authors argue that simply leaving learners to their own devices is a recipe for failure. Instead, learning professionals must provide "scaffolding": temporary support that is gradually removed as the learner becomes more capable. This support can take many forms. One study used a simple app called "Learning Moments" that prompted hospital staff and students daily with the question, "Did you learn anything today?" Initially, many said no. But with hints such as "Maybe you had an interesting discussion?" or "Perhaps something went wrong?", they began to recognize learning in their daily experiences. Many participants concluded, "Wow, I learned a lot more than I thought." This simple intervention helped them become more aware of their own learning, and it demonstrates that supporting self-directed learning isn't about being hands-off; it's about providing the right tools, prompts, and guidance to help learners build the awareness and skills they need to succeed on their own.

Conclusion

Narrator: The single most important takeaway from Evidence-Informed Learning Design is a call for a professional reckoning. The field of learning and development must evolve from a practice based on folklore, fads, and gut feelings to a discipline grounded in the robust evidence of the learning sciences. This means abandoning ineffective "zombie" myths like learning styles, critically evaluating new trends before adopting them, and designing experiences that are proven to be effective, efficient, and enjoyable.

The challenge this book presents is not just to learn a few new techniques, but to fundamentally shift one's professional identity—from an order-taker who delivers training to an evidence-informed consultant who solves real performance problems. The most practical first step is to ask a difficult question about a current practice: "What is the evidence that this actually works?" The answer may be the first step toward building on a much stronger foundation.
