NASA’s Communication Disaster

Every January, NASA has a day of remembrance for all those who have died in the advancement of the space program. Having wandered through the website with an eye to communication, I was impressed. It sets up layers of approach. If you only skim the front page, you are immediately given the 3-minute YouTube video, which is perfect if you just happened onto the site and want a short and easy way to get an overview of what the Day of Remembrance is. The rest of the front page is mostly taken up with separate blocks that give brief introductions to each of the three major tragedies in which astronauts died.

From a communications perspective, the website succeeds. It has a simple and clear message, "remember the lives lost in the service of space exploration," and it repeats this message in different ways: through video, text, and images. There is more on the website, much more, but you have to encounter that first message on that first page before the site lets you click away to the details.

This is a professionally designed website, with professional writers behind the moving tributes. It is highly unlikely that NASA engineers were in charge of designing or writing this communication. And why should they be? This is not their expertise. And yet, in the day-to-day work of NASA engineers, we expect them to be great communicators, even though it has been shown again and again that they are not trained in these skills and consistently lack them.

I will be looking at the Columbia disaster of 2003, when the space shuttle disintegrated over the United States, killing all seven astronauts on board. I'm choosing this disaster because the technical problem that killed them was known almost immediately after takeoff, but NASA chose to do nothing about it and chose not to even tell the crew about its lethal potential.

Over the next five episodes, I will cover several topics related to humans dealing with humans, on the smaller scale as well as the larger, existential scale. I will look at how people are trained in different mindsets (the bureaucratic, the engineering, and the humanities), each of which teaches us to look at the world and its problems differently. I will also look at how and why we force engineers to pretend to have communication skills, even though investigations into two space shuttle tragedies have highlighted the lack of these skills as contributing factors in the deaths of fourteen astronauts.

In considering this, I will explore the antipathy that engineers as a whole feel toward non-engineers and those with a non-engineering mindset. And in the final episode, I will consider how dismissing the usefulness of a humanities mindset holds back space exploration and organizations like NASA: they do not consider the human as part of their design, they push away the possibility of unpredictability, and they have no skills for encountering situations where doubt cannot be eliminated and death is a likely, foreseeable outcome.

You Can’t Fix the Human

In his book Black Box Thinking, Matthew Syed looks at how embracing failure can lead to innovative advancements. He has high praise for the airline industry for learning from its mistakes, but the price that industry has paid for its knowledge is high. Captain Sullenberger, famous for safely landing a commercial airplane on the Hudson River in New York in 2009 after a bird strike took out both engines, put it this way:

“In aviation, every rule that we have, everything we know, every procedure we use, we know because someone, or often many people, died. We have learned important safety lessons purchased at great cost, sometimes literally bought in blood. We have an obligation not to forget these lessons, lest we have to relearn them.” (p. 44)

The NASA Day of Remembrance website has that same feeling of wanting to learn from mistakes and not forget them. It is very upfront about the deaths, and if you click past the front page, past the quotations of sympathy from former US presidents and all the glossy photos, you will find pages and pages of easy-to-access, detailed information that arose out of the investigations after each tragedy. This reflects a commitment to investigate deeply and learn from mistakes, all done in a transparent and open manner.

But.

Sullenberger, and I would argue NASA, have a mindset of fixing mistakes, with mistakes understood to be tangible, static things that can be corrected and thus not repeated in the future. They are not thinking about dealing with humans.

Let me explain what I mean here.

There are three tragedies that NASA's Day of Remembrance website focuses on. In 1967, the three astronauts of Apollo 1 died when a fire broke out in their capsule during a ground test and they couldn't get the hatch open to escape. In 1986, all seven astronauts on the Space Shuttle Challenger died when the shuttle broke apart shortly after liftoff because of a problem with the O-rings. In 2003, all seven astronauts on the Space Shuttle Columbia died 16 minutes before their scheduled landing when the shuttle disintegrated because of a hole in the wing.

In all these cases, investigations found the technical errors, of course, but they also pointed to failures in the culture and communication systems of NASA as contributing to the technological failures. In other words, yes, there was a technical problem in each case (the hatch, the O-rings, the insulating foam), and each was fixed and never caused trouble again, but roughly every twenty years the failures in NASA's culture and communication were repeated, and more people died.

NASA would never allow a technical problem to simply happen again, so what is different about these cultural and communication issues? Why is NASA able to fix technical issues permanently but not able to fix the human? I think it is because NASA keeps trying to fix these human issues as if they were engineering or technical issues. But humans are squishy and unpredictable; they cannot be calculated out to the third decimal place, and NASA simply doesn't have, or doesn't prioritize, the mindset that would be able to deal with humans as they really are.

That is a very brief introduction to what is to come. Preparing this series was both challenging and enlightening. It started out as a short piece on a single slide from a PowerPoint presentation used during the Columbia disaster, and it blossomed into a multi-part series that covers a range of topics and takes an entirely new approach to technical bureaucracies and how we understand and navigate them.

Join me on Patreon for future episodes of The Space Shuttle Columbia: An Engineering Communication Disaster, where we will look at the timeline of what was said and what was done in the two weeks between the discovery of the problem that damaged the shuttle and its disintegration upon reentry.

The Space Shuttle Columbia: An Engineering Communication Disaster

Episode 1. Introduction

Episode 2. A Case Study for Communication Disasters

Episode 3. We Train Engineers and Bureaucrats to Think That Way

Episode 4. Communication Bias Is Always There If You Look

Episode 5. Dealing with Doubt

Episode 6. Dealing with Death