Worshiping the wrong heroes

May 5, 2011 - Reading time: 3 minutes

Charles Bolden put out a statement today on the 50th anniversary of American human spaceflight.  It begins (emphasis added):

[...]

May 5, 1961 was a good day. When Alan Shepard launched toward the stars that day, no American had ever done so, and the world waited on pins and needles praying for a good outcome. The flight was a great success, and on the strength of Shepard's accomplishment, NASA built the leadership role in human spaceflight that we have held ever since.

I was a teenager at the time and just sorting out the field of study I wanted to pursue. Though I never dared dream it growing up in segregated South Carolina, I was proud to follow in Alan's footsteps several years later and become a test pilot myself. The experiences I've had would not have been possible without Alan's pioneering efforts. The inspiration that has created generations of leaders to enlarge our understanding of our universe and to strive toward the highest in human potential was sparked by those early achievements of our space program. They began with Freedom 7 and a daring test pilot who flew the ultimate experimental vehicle that May day 50 years ago.

Giving astronauts full credit for the accomplishments of NASA's human spaceflight program is nothing new.  Many people (including people who work at NASA, and should really know better) view astronauts as a superhuman species, whose wisdom, wit, talent, and general prowess are the foundation of NASA's accomplishments.  I'm fairly certain that most astronauts, at one level or another, believe this too.  This notion has led to the corruption of the (already slanted) phrase "no bucks, no Buck Rogers" to the (even more slanted) phrase "no Buck Rogers, no bucks" - implying that without hugely egotistical military aviators as spokesmen, NASA has no hope of funding its programs.  Wonderful.

And that's why it's not surprising to hear someone give such wide-ranging credit to Alan Shepard, who wasn't an engineer or a scientist, for NASA's first manned suborbital flight.

But it does hurt a bit when that someone is NASA's administrator - even if he was also an astronaut. I've mentioned in the past, and will surely bring up again in the future, the roles I think astronauts and engineers play (and should play) at NASA.  At a time when we're trying to find ways to encourage more students to pursue STEM careers, leaders at all levels do themselves (and the rest of us) a disservice by failing to address the fact that a STEM career will not bring you any glory (or even much recognition for good work), will not make you rich (or even moderately wealthy), and will not make yours a household name (face it; the odds are really against that one).  By giving such fawning attention to Alan Shepard, Bolden minimizes the real, profound, backbreaking efforts of thousands of scientists, engineers, and technicians who actually made the flight possible.

So on this, the 50th anniversary of NASA's first manned suborbital flight, let's also recognize the people who actually made it possible: the thousands of smart people who worked long hours under stressful conditions to send some test pilot into space, and (probably against their better judgment) bring him back safely.  It was your pioneering efforts that inspired many of us to pursue engineering and follow in your footsteps.  You may not have inspired Charlie Bolden, but you did inspire me.  I hope that counts for something.


The Wrong Stuff

February 27, 2011 - Reading time: 3 minutes

I just saw this article discussing the challenges of returning from Mars.

Returning from Mars is a large engineering problem that's independent from most (not all) aspects of getting there, and it's a good idea to start pondering the problem even though a specific mission architecture hasn't been agreed on.

What's sad is that ATK, Lockheed, and Grumman were NASA's choices.  This is the root of many of NASA's problems: an inability to divorce itself from the lumbering herbivores that have grown (over the last 5 decades) to define the agency.  Corporate behemoths like the ones named here (and several others) are where good ideas and creative thinkers go to die.  Tiny organizations have great ideas and accomplish amazing things with limited resources - this is the essence of engineering, and that type of thinking and motivation is what got us to the moon originally (yes, I realize it was the same contractors back then; that was 40 years ago).  Now, NASA meets its small business goals by having large contractors subcontract work to small companies.

How does this manifest itself?  A couple of examples:

  1. I work on a NASA contract that is run by a very large contractor.  When said large contractor bid on the contract, they made small businesses an integral part of the contract by getting half a dozen small companies to be "teammates."  These teammates don't contribute their small corporate culture to the contract; in some cases I would say that their corporate culture has been killed by the relationship.  Instead, new employees are assigned to either the large contractor or one of the smaller ones when they are hired.  The only difference between a project engineer employed by contractor A or B is what company writes the paycheck, and a couple of layers of management.  Every day, I have to complete 2 timesheets - one for the prime contractor, and one for my teammate.  If I want to take a day off, I have to alert my project manager and section manager (prime contractor), local manager and teammate principal (my teammate), and my NASA customer.  Yes, 5 bosses.
  2. If I want to buy a widget from a large company, I submit an order to the purchasing people.  They see that the order is to a large company, so they instead submit the order to a small/disadvantaged/minority or woman-owned company that exists solely to place my order, add a markup, and then sell it to my contractor.  I get the part I asked for, but a couple of days later and 15% more expensive.

This is how the large contractors think, and they are consistently rewarded for these inefficiencies and the hundred other examples of syphilitic idiocy they dream up each day.  One lesson presented by the Orion program (I no longer refer to "lessons learned" because it's clear that we learn nothing) is that the large contractors preserve their profit margins by re-using as much of their aging junk as possible, because innovation takes time and energy.  Thus the "high-tech" glass cockpit of Orion was going to use LCD displays that were obsolete before PDR, because that's what the large contractor tasked with making them had lying around.

I'm sure that given enough time and money, this set of contractors could make a functional crew return vehicle for a Mars mission.  But what are the odds that they can make an excellent crew return vehicle?


The mother of invention

February 10, 2004 - Reading time: 4 minutes

You've probably heard the saying "Necessity is the mother of invention" a few times.  The phrase implies that necessity springs out of the blue, and that civilization ceases to function until whatever sudden pressing need has been satisfied.  At best, the phrase is a tautology.  At worst, it is an indication that the speaker is the sort of simple-minded fool who spouts trite expressions without giving thought to reality.


If you were to step back and take stock of your surroundings right now, you would be hard pressed to find anything that you need that isn't somehow provided for.  You aren't special -- it's like that for everybody.  And everybody now isn't special, either: people who lived in the 1800s had everything they needed as well; so did the people living in prehistoric times.  The technology and other "things" that exist in a given time define that era; that is to say that our tools are, by definition, adequate for living in our world, just as the tools of cavemen were adequate for living in prehistoric times.  Citizens of the world didn't wake up one morning in 1923 and realize that nobody could go on living until the television was invented; they had other perfectly adequate means (such as radio or print) to distribute news and entertainment.  The Wright brothers didn't realize, 100 years ago, that civilization would collapse if they weren't able to get an airplane to fly.  And, heartless as it may seem, if nobody had stumbled across penicillin, there might be fewer of us around, but we'd still be here.


Some people choose to invert the phrase; they say "invention is the mother of necessity."  It's a tempting thought -- first we invent the horse-drawn carriage, then we invent the automobile to replace the carriage, but to make the automobile more palatable, we must invent power steering, cruise control, leather seats, huge stereo systems, radar-assisted parking systems, etc.  A more interesting example is that of the tin can.  One would suspect that the invention of the tin can would necessitate the immediate invention of the can opener.  But while the tin can was first presented in 1810, the first useful can opener didn't appear until nearly 50 years later.  In reaction to this, the mind's eye conjures up amusing images of an entire generation of hungry Victorians starving and contemplating the bitter irony of life as they stared at shelves full of canned foods; but fortunately it wasn't so.  Instead, people looked at their surroundings and used what they had.  For example, a tin containing roast veal carried on the explorer William Edward Parry's Arctic expedition in 1824 included the following instructions for opening: "Cut round on the top with a chisel and hammer."  Soldiers fighting in the American Civil War opened their canned rations with knives, bayonets, and even rifle fire.  The earliest purpose-built can openers were cumbersome, complicated gadgets that were owned by shopkeepers, which was unfortunate because opening your cans at the checkout register defeats the purpose of having the stuff canned in the first place.  William Underwood, who established America's first cannery in the 1820s, advised his customers to use whatever tools were around the house to open the cans.


As Thomas Edison wrote, "Restlessness is discontent -- and discontent is the first necessity of progress."  Surely, inconvenience breeds restlessness, and it's not too hard to see that there was no convenient method for most people to open tin cans; this inconvenience was what got Ezra Warner of Waterbury, CT thinking in his spare time, and eventually led to his landmark 1858 patent for a can opener that just about anybody could use.  It worked well enough, but its use left cans with sharp, jagged edges.  Although a nasty cut to the finger is most often not fatal, it can be inconvenient, and in 1870, because of this, William Lyman of West Meriden, CT, patented the first can opener to use a wheel-shaped blade that made a smooth, continuous edge.


The story goes on, but perhaps you see the point I'm getting at -- necessity is not the mother of invention, and invention is not the mother of necessity.  Inconvenience is the mother of invention; necessity is already provided for, or else we wouldn't be here.  I make the (bold? foolish?) claim that nothing that has ever been invented has been necessary; new items are only invented to improve upon the perceived shortcomings of existing items.  Don't believe me?  Look around your desk; pick up anything, and think: what need went unfulfilled before this thing was invented?  What would people have ever done without it?  I assure you, nothing man-made predates man, so somewhere along the line, someone got along without anything that we've invented so far.  They may not have liked getting along without it, but that's why it's here today -- because it just makes life so much more convenient.


They just don't make them like they used to

October 28, 2003 - Reading time: 5 minutes

In the past couple of years, national security has been on everyone's mind; laws have been passed, rules have been enacted, and generally life has been made more miserable so that we as a country can feel more secure.  Some of the initiatives that we have seen are very visible; airport security, security at federal buildings, and legislation such as the Patriot Act have been widely discussed, and their relative merits are subject to some debate.  There has also been much behind-the-scenes work, such as the Container Security Initiative (CSI), which is designed to protect the transportation of the ubiquitous and increasingly important 40-foot containers that bring us much of what we buy.  All discussion of the merits of these security precautions aside, we can still say that people are actively working to keep our critical infrastructure safe from attack.  But we have been ignoring an important point in the process of securing our national infrastructure, and that overlooked point presented itself recently.  The massive power outage in the Northeast provided an important lesson: decreasing margins of safety and error in our infrastructure place critical societal functions at greater risk of significant disruptions from rare occurrences -- accidental, malicious, or otherwise unforeseen.  This is nothing new; it has been going on for decades now, as a series of decisions by policy makers placed the administration of our national infrastructure in the hands of profit-seeking organizations.  This is not necessarily bad, but redefining acceptable levels of risk and protections as the world changes is hard work, and needs to be done carefully.


Cost pressures and tight engineering under benign assumptions over the last few decades have led to thin margins of error in our current infrastructure.  This is to say that certain major failures are assumed to be so unlikely that they are discounted during the design process.  This way of thinking creates systems that tend to be less expensive, and are optimized to fit a relatively optimistic set of assumptions about the world.  But while optimized engineering leads to most events being of small consequence (because the systems are engineered to tolerate them), some rare events that might otherwise have been relatively benign (or at least tolerable) can now lead to massive disruption.  As the margins of safety designed into the large, complex, and poorly understood systems that make up our critical infrastructure (such as the national power grid) are whittled away in the name of cost-effectiveness, the likelihood of massive, uncontrolled failures increases.  But while it seems like this might be just asking for trouble, it is seen as "bad engineering" to overdesign a system to tolerate very rare events, or events whose specific causes are not well understood, if that tolerance is perceived to cost more than the failures it would prevent (in terms of expected value to the customer), or if the likelihood of the failure seems very remote -- fragility to extremely rare events is seen as a good business decision.  This is why rare disruptions (like power outages) come as little surprise to insiders of highly optimized or complex infrastructures.  Building excess capacity and redundancy into a system such as the electric power grid is essential to safety and reliability, but it has no market incentive -- safety doesn't sell.
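The expected-value logic described above is easy to make concrete.  Here's a minimal sketch (all numbers and the function name are hypothetical, purely for illustration) of why hardening against a rare, catastrophic failure looks "wasteful" on a spreadsheet:

```python
# Toy expected-value argument for cutting rare-event hardening.
# All figures are hypothetical, chosen only to illustrate the shape
# of the reasoning, not to model any real grid.

def hardening_pays_off(p_failure_per_year, failure_cost, hardening_cost, horizon_years):
    """Compare a one-time hardening cost against the expected loss it
    would prevent over a planning horizon. Returns True if hardening
    is 'worth it' in pure expected-value terms."""
    expected_loss = p_failure_per_year * failure_cost * horizon_years
    return hardening_cost < expected_loss

# A 1-in-1000-per-year event costing $10B, judged over a 5-year horizon:
# expected loss = 0.001 * $10B * 5 = $50M, so a $200M fix fails the test
# -- even though a single occurrence of the event dwarfs the "savings."
print(hardening_pays_off(0.001, 10e9, 200e6, 5))  # False
```

The point of the sketch is that a short planning horizon and a small annual probability can make almost any protection against a catastrophic event look like bad business, which is exactly the "good business decision" the paragraph describes.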


What the market calls "excess capacity" (note the connotations of "excess"), others call a safety net.  When a critical power line fails, parallel lines must have this "excess" capacity to take over the flow, and this safety net must remain intact when lines are out of service for maintenance.  Such safety is not cheap.  So while adequate margins of safety generally have the side effect of increasing the overall efficiency and reliability of a system, at some point investments in redundancy are seen as extravagant and wasteful to stakeholders, whether they are private stakeholders (i.e. shareholders) or public (i.e. taxpayers).  Those who are out to placate stakeholders tend to favor more visible single-point safety or security measures, which tend to cost more in the long run and are generally less effective.
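The "excess capacity" safety net has a simple quantitative version: the classic N-1 criterion, under which a set of parallel lines should survive the loss of any single line.  A minimal sketch (line flows and capacities here are made-up illustrative numbers):

```python
# Toy N-1 contingency check: when any one parallel line fails, can the
# surviving lines' spare ("excess") capacity absorb the lost flow?
# Illustrative numbers only -- not a real power-flow model.

def survives_single_failure(lines):
    """lines: list of (flow, capacity) tuples for parallel lines sharing
    a load. Returns True if, for every single-line outage, the spare
    capacity of the survivors covers the failed line's flow."""
    for i, (flow_i, _) in enumerate(lines):
        spare = sum(cap - flow for j, (flow, cap) in enumerate(lines) if j != i)
        if spare < flow_i:
            return False
    return True

# Three lines each carrying 80 MW on 100 MW capacity: only 40 MW of
# total spare remains to absorb an 80 MW loss, so a single failure cascades.
tight = [(80, 100), (80, 100), (80, 100)]

# The same capacity run at 50% utilization carries the "excess" needed
# to survive -- the margin the market is tempted to sell off.
safe = [(50, 100), (50, 100), (50, 100)]
```

The tightly loaded configuration is cheaper per delivered megawatt right up until the day one line trips, which is the tradeoff the paragraph describes.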


The invisible hand of economics creates systems designed and optimized under optimistic assumptions of relatively benign environments; these systems are at great risk if new or unexpected threats arise, because the margins that have historically made it possible to work around unexpected problems (think of the Apollo 13 near-disaster) are no longer designed in.  The development of our critical infrastructure is subject to these economic motivations, so it is already (and will become more) fragile to rare or unexpected events.  That's good business paving the road to future vulnerabilities, because the market will not bear the cost of the level of reliability that it expects.  The pace of technological change and societal reliance on these systems amplify the uncertainty, urgency, and magnitude of risk here.


After 9/11, we can point out how scenarios that were previously almost unthinkable are suddenly possible, and thus engineered defenses against potential attacks are more strongly motivated.  However, to define and quantify threats and their impact, particularly in combination with coordinated physical and psychological attacks and effects, requires deep contemplative research, development, large-scale experimentation, and the like -- all very costly with little to no visible immediate payoff (which makes them politically unpopular).  But given the social and economic consequences that arose from the recent power outage, the national power grid is suddenly a large, inviting target for those who seek to disrupt society because it has demonstrated weaknesses and widespread impact.  It is impossible to protect all important points of such a large system using the standard paradigms of physical security, which is generally designed in isolation from the system it is protecting, and therefore offers little real protection.  Instead we need to fix the basic problems with the infrastructure -- if we can reduce the potential impact of catastrophic events on the power grid by making it more robust and flexible, it will become a less inviting target for catastrophic terrorism.  To achieve this, we must accept that we need non-market investments in the design and implementation of safety, security, and robustness in critical infrastructure.