
How Knowledge Management Helped Change NASA’s Culture | APQC Webinar

APQC · 5 min read

Based on APQC's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.

TL;DR

NASA’s cultural shift relied on pairing knowledge management with safety governance changes, especially “technical authority and dissenting opinion.”

Briefing

NASA’s culture shifted because knowledge management was built into how decisions get made after catastrophic failures—turning “lessons learned” into practical, searchable, teachable knowledge and giving engineers authority to speak up on safety-critical issues. Roger Forsgren, NASA’s chief knowledge officer from 2015 to 2021, frames knowledge management not as a culture changer by itself, but as the tool that helped NASA operationalize the cultural changes demanded by tragedy, oversight, and an aging technical workforce.

Forsgren begins with the scale and urgency: NASA employs about 18,000 civil servants with an average age of 46, supported by roughly 60,000 contractors. With tacit expertise concentrated in experienced employees, knowledge transfer becomes a risk-management problem, not just an efficiency project. He notes that NASA’s Apollo-era workforce averaged about 28 years old, close to SpaceX’s current average of about 29, while today’s NASA skews far older, underscoring why knowledge capture and transfer matter now.

The webinar then ties NASA’s knowledge management push to a history of both successes and failures. High-profile achievements—like the International Space Station, the James Webb Space Telescope, and the Space Launch System—sit alongside painful lessons from Apollo 1, Hubble’s mirror error, and Mars mission failures tied to unit and technical mistakes. But two accidents, Challenger (1986) and Columbia (2003), are presented as the decisive cultural inflection points. Challenger’s launch proceeded despite engineer objections about unusually cold temperatures and the failure risk of O-rings; Columbia’s disaster followed repeated foam-strike warnings that managers did not act on.

After Columbia, NASA introduced structural changes that Forsgren calls “technical authority and dissenting opinion.” The model gives every team member a personal responsibility to speak up when health and safety are at stake, and requires managers to listen. If consensus fails, independent subject matter experts, positioned and compensated to be objective, can make the technical call rather than deferring to hierarchy. This shift is paired with governance pressure: the Aerospace Safety Advisory Panel, appointed by Congress, insisted NASA appoint a chief knowledge officer and addressed a core gap—NASA was good at gathering knowledge but weak at disseminating it.

Forsgren describes “lean knowledge management” as the practical implementation. NASA focused on the technical workforce, kept the program simple (avoiding academic taxonomy), and consolidated information into a single, open website: appel.nasa.gov. Instead of scattering lessons learned across many centers and servers, the one-stop portal supports searching for case studies, courses, and training.

To move tacit knowledge—expertise that lives in people’s heads—NASA fused knowledge management with training and created knowledge events designed to resonate with engineers. Case studies from Apollo 1, Challenger, and Columbia anchor coursework, while modules address critical thinking and cognitive bias in engineering decision-making. NASA also used video interviews with practitioners, podcasts (often 15–20 minutes every two weeks) to reach younger listeners, and leadership/mentorship programs that pair high-potential engineers with senior decision-makers.

Finally, Forsgren emphasizes cultural reinforcement mechanisms: a “day of remembrance” where employees revisit Apollo 1, Challenger, and Columbia, plus physical and traveling memorial-style artifacts (like the Columbia room at Kennedy Space Center and a Columbia artifacts tour for centers that can’t visit). The result, he says, is a more transparent environment where failures are treated as learning opportunities—and where engineers increasingly feel empowered to raise safety concerns and share what they learn before it becomes another preventable tragedy.

Cornell Notes

NASA’s knowledge management effort helped change culture by embedding learning into decision-making and safety governance after Challenger and Columbia. The key shift was “technical authority and dissenting opinion,” which requires managers to listen to safety concerns and allows independent experts to make technical calls when agreement fails. Lean knowledge management then operationalized that cultural goal: it targeted the technical workforce, simplified knowledge practices, and consolidated access through appel.nasa.gov. To transfer tacit expertise, NASA paired knowledge management with training, case-study-based coursework, cognitive-bias awareness, video interviews, podcasts, and mentorship programs. The approach mattered because NASA’s aging workforce and contractor-heavy operations made knowledge transfer urgent, and failures had shown that hierarchy without speaking up can be deadly.

Why did NASA treat knowledge transfer as a cultural and safety issue rather than a documentation project?

Forsgren ties the urgency to workforce demographics and risk. NASA has about 18,000 civil servants with an average age of 46 and roughly 60,000 contractors. With experienced employees holding tacit expertise, knowledge transfer becomes necessary to prevent loss of critical know-how as people retire. The cultural angle comes from how failures revealed that knowledge existed (warnings and prior lessons) but wasn’t acted on—so dissemination and decision authority had to change, not just recordkeeping.

What did “technical authority and dissenting opinion” change in day-to-day project behavior?

It created a formal expectation that every team member has both the opportunity and personal responsibility to speak up when something is wrong, especially for health and safety. Managers must listen, and if agreement can’t be reached, independent subject matter experts, positioned and compensated to remain objective, can make the technical decision. Forsgren contrasts this with earlier patterns where hierarchy and managerial decisions overrode engineer objections.

How did NASA decide what knowledge to prioritize under “lean knowledge management”?

NASA started by defining its audience. While many knowledge domains exist (IT, procurement, finance, legislative affairs), the program focused on the NASA technical workforce—the people designing and building hardware. That audience focus determined the topics and reduced overload. Forsgren notes that lessons learned from procurement or legislative work may not be directly relevant to engineers working on hardware design, so filtering by audience kept the system usable.

How did NASA handle tacit knowledge differently from explicit knowledge?

Explicit knowledge was treated as transferable through media like procedures and databases—such as lessons learned repositories. Tacit knowledge (expertise in people’s heads) required mechanisms that encourage engagement and learning. NASA used training integration, knowledge events, case studies, cognitive-bias education, video interviews with practitioners, podcasts for younger audiences, and leadership/mentorship programs to make expertise transferable through interaction.

Why were podcasts and mentorship emphasized as tools for bridging generations?

Podcasts were chosen because younger employees already consume audio content during commutes and downtime, making it a practical conduit for sharing tacit lessons from experienced project leaders. Mentorship was framed as two-way learning: high-potential engineers learn how decisions are made at senior levels, while senior leaders also learn what motivates and interests newer generations—so knowledge transfer isn’t one-directional.

What cultural reinforcement mechanisms helped keep Apollo 1, Challenger, and Columbia from fading into “history”?

NASA institutionalized remembrance through a “day of remembrance,” where employees discuss Apollo 1, Challenger, and Columbia and connect those lessons to their own project experiences. It also used physical artifacts: the Columbia room at Kennedy Space Center preserves debris as an emotional learning environment, and NASA created a Columbia artifacts tour for centers that can’t visit—pairing artifacts with talks from former flight controllers to emphasize human stakes and decision-making lessons.

Review Questions

  1. How does “technical authority and dissenting opinion” alter the balance between hierarchy and safety concerns on NASA projects?
  2. What steps did NASA take to make knowledge accessible and usable for the technical workforce (including the role of appel.nasa.gov)?
  3. Which mechanisms were designed specifically to transfer tacit knowledge, and how do they differ from lessons learned databases?

Key Points

  1. NASA’s cultural shift relied on pairing knowledge management with safety governance changes, especially “technical authority and dissenting opinion.”

  2. Workforce demographics—an aging civil service and large contractor base—made tacit knowledge transfer urgent.

  3. Challenger and Columbia exposed not only technical failures but decision-making failures where warnings and objections were not acted on.

  4. Lean knowledge management prioritized the technical workforce, simplified knowledge practices, and consolidated access through appel.nasa.gov.

  5. Tacit knowledge transfer was pursued through training integration, case-study learning, cognitive-bias awareness, video interviews, podcasts, and mentorship.

  6. NASA reinforced learning through institutional remembrance (day of remembrance) and artifact-based experiences (Columbia room and traveling Columbia artifacts tour).

  7. Knowledge capture was supported by formal project requirements to produce lessons learned after projects end, with technical writers available to help engineers document what they learned.

Highlights

“Technical authority and dissenting opinion” made speaking up a personal responsibility and enabled independent experts to decide when managers and teams couldn’t agree.
Lean knowledge management focused on the technical workforce and used a one-stop portal (appel.nasa.gov) to prevent lessons learned from being trapped in center-by-center silos.
Podcasts and mentorship were used to bridge generations by turning tacit expertise into engaging, repeatable learning formats for younger employees.
NASA used remembrance rituals and artifact experiences to keep Apollo 1, Challenger, and Columbia emotionally and operationally present in everyday work.
