How Knowledge Management Helped Change NASA’s Culture | APQC Webinar
Based on APQC's video on YouTube.
NASA’s cultural shift relied on pairing knowledge management with safety governance changes, especially “technical authority and dissenting opinion.”
Briefing
NASA’s culture shifted because knowledge management was built into how decisions get made after catastrophic failures—turning “lessons learned” into practical, searchable, teachable knowledge and giving engineers authority to speak up on safety-critical issues. Roger Forsgren, NASA’s chief knowledge officer from 2015 to 2021, frames knowledge management not as a culture changer by itself, but as the tool that helped NASA operationalize the cultural changes demanded by tragedy, oversight, and an aging technical workforce.
Forsgren begins with the scale and urgency: NASA employs about 18,000 civil servants with an average age of 46, supported by roughly 60,000 contractors. With tacit expertise concentrated in experienced employees, knowledge transfer becomes a risk-management problem, not just an efficiency project. He notes that NASA’s Apollo-era workforce averaged about 28 years old, close to the average of about 29 at companies like SpaceX today, a contrast with NASA’s current demographics that underscores why knowledge capture and transfer matter now.
The webinar then ties NASA’s knowledge management push to a history of both successes and failures. High-profile achievements—like the International Space Station, the James Webb Space Telescope, and the Space Launch System—sit alongside painful lessons from Apollo 1, Hubble’s mirror error, and Mars mission failures tied to unit and technical mistakes. But two accidents, Challenger (1986) and Columbia (2003), are presented as the decisive cultural inflection points. Challenger’s launch proceeded despite engineer objections about unusually cold temperatures and the failure risk of O-rings; Columbia’s disaster followed repeated foam-strike warnings that managers did not act on.
After Columbia, NASA introduced structural changes that Forsgren calls “technical authority and dissenting opinion.” The model gives every team member a personal responsibility to speak up when health and safety are at stake, and requires managers to listen. If consensus fails, independent subject matter experts can make the technical call—paid and positioned to be objective—rather than deferring to hierarchy. This shift is paired with governance pressure: the Aerospace Safety Advisory Panel, appointed by Congress, insisted NASA appoint a chief knowledge officer and addressed a core gap—NASA was good at gathering knowledge but weak at disseminating it.
Forsgren describes “Lean Knowledge Management” as the practical implementation. NASA focused on the technical workforce, kept the program simple (avoiding academic taxonomy), and consolidated information into a single, open website: appel.nasa.gov. Instead of scattering lessons learned across many centers and servers, the one-stop portal supports searching for case studies, courses, and training.
To move tacit knowledge—expertise that lives in people’s heads—NASA fused knowledge management with training and created knowledge events designed to resonate with engineers. Case studies from Apollo 1, Challenger, and Columbia anchor coursework, while modules address critical thinking and cognitive bias in engineering decision-making. NASA also used video interviews with practitioners, podcasts (often 15–20 minutes every two weeks) to reach younger listeners, and leadership/mentorship programs that pair high-potential engineers with senior decision-makers.
Finally, Forsgren emphasizes cultural reinforcement mechanisms: a “day of remembrance” where employees revisit Apollo 1, Challenger, and Columbia, plus physical and traveling memorial-style artifacts (like the Columbia room at Kennedy Space Center and a Columbia artifacts tour for centers that can’t visit). The result, he says, is a more transparent environment where failures are treated as learning opportunities—and where engineers increasingly feel empowered to raise safety concerns and share what they learn before it becomes another preventable tragedy.
Cornell Notes
NASA’s knowledge management effort helped change culture by embedding learning into decision-making and safety governance after Challenger and Columbia. The key shift was “technical authority and dissenting opinion,” which requires managers to listen to safety concerns and allows independent experts to make technical calls when agreement fails. Lean Knowledge Management then operationalized that cultural goal: it targeted the technical workforce, simplified knowledge practices, and consolidated access through appel.nasa.gov. To transfer tacit expertise, NASA paired knowledge management with training, case-study-based coursework, cognitive-bias awareness, video interviews, podcasts, and mentorship programs. The approach mattered because NASA’s aging workforce and contractor-heavy operations made knowledge transfer urgent, and failures had shown that hierarchy without speaking up can be deadly.
Why did NASA treat knowledge transfer as a cultural and safety issue rather than a documentation project?
What did “technical authority and dissenting opinion” change in day-to-day project behavior?
How did NASA decide what knowledge to prioritize under “Lean Knowledge Management”?
How did NASA handle tacit knowledge differently from explicit knowledge?
Why were podcasts and mentorship emphasized as tools for bridging generations?
What cultural reinforcement mechanisms helped keep Apollo 1, Challenger, and Columbia from fading into “history”?
Review Questions
- How does “technical authority and dissenting opinion” alter the balance between hierarchy and safety concerns on NASA projects?
- What steps did NASA take to make knowledge accessible and usable for the technical workforce (including the role of appel.nasa.gov)?
- Which mechanisms were designed specifically to transfer tacit knowledge, and how do they differ from lessons learned databases?
Key Points
- 1
NASA’s cultural shift relied on pairing knowledge management with safety governance changes, especially “technical authority and dissenting opinion.”
- 2
Workforce demographics—an aging civil service and large contractor base—made tacit knowledge transfer urgent.
- 3
Challenger and Columbia exposed not only technical failures but decision-making failures where warnings and objections were not acted on.
- 4
Lean Knowledge Management prioritized the technical workforce, simplified knowledge practices, and consolidated access through appel.nasa.gov.
- 5
Tacit knowledge transfer was pursued through training integration, case-study learning, cognitive-bias awareness, video interviews, podcasts, and mentorship.
- 6
NASA reinforced learning through institutional remembrance (day of remembrance) and artifact-based experiences (Columbia room and traveling Columbia artifacts tour).
- 7
Knowledge capture was supported by formal project requirements to produce lessons learned after projects end, with technical writers available to help engineers document what they learned.