Paul Sean Hill, former Director of Mission Operations for NASA’s human spaceflight, offers an insider’s view in this excerpt from his new book, ‘Mission Control Management’.
From the day we start, we are really being trained to be trusted. Everything from alignment to our core purpose to learning to walk and talk like flight controllers is geared towards developing individuals and teams that can be trusted implicitly to:
- Sift through the complexity.
- Quickly evaluate the situation and options.
- Remember our training and the ‘corporate experience bank’.
- Overcome the intimidation and fear.
- Then be perfect in decision-making and action, which includes realising when the procedures and our previous experience are misleading us.
It is a morality
Together, these comprise the real-time morality: unyielding alignment to purpose and deliberately applying the Mission Control trust elements of technical truth, integrity and courage in all decision-making.
Mission Control trust elements and performance expectations
Make no mistake; in the control room it is a morality. Real people trust us with their lives in these spacecraft. That comes with a very real sense that we would also be failing their families and the larger NASA community who are outside the Mission Control Room, all of whom are also trusting us.
While intimidating, there is power in the emotional charge that comes with that awareness. It strengthens our alignment to our core purpose and helps put the steel in the ‘steely-eyed missile men and women’ who have to make the calls. This is that internal spark we get from a series of realisations:
- This thing we do matters.
- What I do as part of it matters.
- What I do can have the ultimate consequences for someone.
- I can make a difference.
- I must make a difference.
Chris Kraft referred to this in his book ‘Flight’, when he described the mood after the Apollo 1 fire that took the lives of astronauts Gus Grissom, Ed White and Roger Chaffee. “This determination to make sure these men did not die without cause, I believe, gave us all the strength to continue our job of landing men on the moon. It also brought us all closer together and made our responsibilities crystal clear.”
Even with that clarity, however, the need to do something doesn’t justify doing just anything. Our goal is deliberate decision-making and actions aligned to our core purpose, instead of panicking, giving up or guessing. Without a full understanding of a situation and our actions, we may get lucky and make it through, but we wouldn’t consider the performance a success.
We’d still be guilty of not deliberately managing the risk and of not basing our decision-making on technical truth; thus, we’d be guilty of breaching the real-time morality.
Besides the big, obvious risks we manage, like not blowing up the rocket, we have to manage the more subtle risk introduced through bias, subjectivity and errors that creep in, even unintentionally, from a wide range of ‘human element’ sources and then cloud our understanding and judgement.
Once accepted, these biases and errors can go unchallenged despite new data or study that otherwise could have led the team towards truth.
Further, once accepted by ‘experts’, heroes and respected leaders, these biases and errors become much more difficult to overcome. The world’s best rocket scientists and most sophisticated analytics can’t protect Mission Control from errors stemming from individual biases, preconceived notions, heartfelt but unexplainable certainty, or any answer other than the truth.
The real-time morality helps us there too, as it challenges the team and compels them to continuously ask the ‘whys’ and guard the integrity of their thinking:
- Seek alternative perspectives, question your own objectivity and remain suspicious, if not downright paranoid, of biases – intentional or not – yours and the team’s.
- Why do you agree? Why not?
- What if we were wrong before? What if we are wrong now?
- Are you sure you have enough information to support your conclusions?
- Are you open to new information that changes your previous understanding?
- What part of our decision-making is based on data or defensible rationale, and how much is unchallenged for one of the reasons above?
- What additional information or discussion could help reduce the uncertainty?
Fly the way you train
It is the discipline to always do this that separates great rocket scientists, flight controllers and leaders in Mission Control from the rest.
The mantra in Mission Control is, “Train the way you fly. Fly the way you train.” Absolutely right. In all things, big and small, whether we are fighting to prevent an explosion, responding to less critical failures or replanning the mission, we remember and apply what we’ve learned.
This can become habitual, and it does for those of us who grow up in Mission Control.
The way we train is all about unyielding alignment to purpose and deliberately applying the Mission Control trust elements of technical truth, integrity and courage in all decision-making. That is how we expect our people to fly.
That is the real-time morality. It is the key to Mission Control’s legacy of highly reliable, real-time decision-making.
About the author
Paul Sean Hill is an author and former director of mission operations for NASA’s human spaceflight. Mission Control Management is published by Nicholas Brealey Publishing.