Nexus of Evil

There is a small group of individuals who spend their days and nights trying to find the weak points in NASA’s human spaceflight program.  This diabolical and insidious team has penetrated the most secure sectors of the space agency, and they gather their information from the inside.  I personally have confronted this organization and can attest to its manipulative and devious behavior. 


My fellow Flight Directors have termed them “the dirty slime ball [expletive deleted]s.”  Their name:  the integrated training team.  Their leader:  Sim Sup.


Training and simulations have been an integral part of America’s human spaceflight program from the very beginning.  We like to say that we train like we fly, and in many ways it is pretty close.  The job of the training team is to ensure that the astronauts and the flight controllers are prepared for any eventuality:  not only when things go as planned, but also when something goes wrong.  The trainers relish their role.


When the astronauts are in the simulator, it is as close to the spaceflight experience as we can make it.   Microgravity can’t be replicated, but almost everything else can be.  When the flight control team is in the Mission Control Center, the data coming in looks just the same whether it comes from a real shuttle (or the ISS) or from a simulator.


The sim team is led by the Sim Sup (pronounced like “soup”), which stands for Simulation Supervisor.  In the ISS world, they have adopted the moniker STL for Station Training Lead, but the job is the same.  The Sim Sup and his team of trainers think very hard about the lessons that the astronauts and flight control team need to learn.  A lot of these are cataloged and are de rigueur.  Leaks, circuit breaker pops, engines that quit, radios and other electronic gear that flakes out:  all of these and many more are standard-issue failure scenarios.  A moderately well trained team should be able to handle any single failure without breaking a sweat.  The sim team looks for the optimum combination of problems that leads the flight team to the edge of failure.


No Kobayashi Maru scenarios, though.   Mission operations management stands by the credo “Failure is not an option.”  There is always a way out.  Kirk would be proud.


That doesn’t mean the scenarios aren’t tough, however.  During one memorable shuttle launch simulation, I counted 47 different malfunctions that the simulation team inserted into the run in the space of 10 minutes.  When I asked Sim Sup what the point of that run was, he replied:  “Flight, just wanted the team to learn to prioritize between problems that could kill ya now and stuff that could wait until later.”  Thanks a lot, Sim Sup. 


More often than not, the cases were highly cerebral, and it frequently seemed like playing an elaborate chess game with the sim team. 


Whatever the flight plan and the objectives, Sim Sup was certain to put together a scenario that would make the team question their assumptions and plans.  That was the point:  not just failure response, but whether the plan was a good one in the first place.


There is a long history of simulations causing the team to build a better plan that in fact saves the day.  The last landing simulations before the flight of Apollo 11 included some LM computer failures that caused the team to abort the landing.  The DPS officer went back to the office determined to avoid that outcome.  When the real LM computer started spitting out alarm codes during the actual first lunar landing, DPS was prepared. 


Similarly, during an Apollo 12 simulation, the training case required the LM to be used as a lifeboat for a crippled CSM.  This led to a series of studies and plans about how to improve that capability.  Those plans became the center of the Apollo 13 response.


I learned early on never to tell Sim Sup that his case was non-credible.  Every time I complained about some failure scenario, sure enough, something like it would come up on the next shuttle flight.  But we were ready.


And not all cases were introduced through the computer models running over in the simulator building.  Once Sim Sup snuck over to the MCC and handed the EECOM a note:  “you are having a heart attack.”  The resulting theatrics by the EECOM and his next-door neighbor EGIL caused another flight controller on the other side of the room to call 911.  The EMTs were not amused to find out that they had been scrambled out of the fire station due to a simulation.  MOD management said no more simulated heart attacks in the MCC.


Another flight was preparing for an October launch shortly before a Presidential election.  The Sim team called the Flight Director and told him that a candidate was at a campaign stop and wanted to talk with the crew.  That caused a flurry.  But wait; a month or so later, during the actual flight, just a couple of weeks before the election, the phone rang and, guess what?  A certain candidate wanted to talk to the crew while he was at a campaign stop! 


There must be a million stories about the complex interlocking training cases that the sim team inflicted on the flight team.  But the key remains that assumptions were questioned, better plans were made, and the team was better prepared for real spaceflight and the problems that Murphy would throw our way.


I’ve been reading a lot recently about the financial meltdown and the “quants” that had become so influential in business circles.  They could not believe that their computer models of the financial industry were flawed.  But they were.  I wonder if the financial sector could benefit from Sim Sup? 


How valuable would it be to have a Sim Sup for life decisions?  Somebody who could play out the scenario so we got to see how our choices would unfold.  We could all use that on a personal level; maybe we could use it on a national level, too.