Even great minds fail sometimes, but we can revise our mental models
Though this blog is created by a company that makes hardware and software, the core of our work—and yours—is problem solving and optimization. In the past, I’ve said that success in this work demands creativity, judgement, and intuition. I’ve also written (several times) about my fascination with failures of intuition and ways we might understand and correct such failures.
One way to look at intuition in engineering is to see it as the result of mental models of a phenomenon or situation. Failures, then, can be corrected by finding the errors in these mental models.
We learn from those who go before us, and a great example of a failure (and its correction) by a famous engineer/scientist prompted me to correct one of my own mental models. It was an error many of us share, and though I realized it was wrong a long time ago, it’s only now that I can clearly explain the defect.
The trailblazer is the famous rocket pioneer Robert Goddard. He was also a pioneer in our own field: more than a century ago, he invented and patented one of the first practical vacuum tubes for amplifying signals. Had he not focused most of his energy on rockets, Goddard would probably be as famous in electronics as his contemporary Lee de Forest (but that’s a story for another day).
Goddard knew that stability would be a problem for his rockets, and he had no directional control system for them. To compensate, he used a cumbersome design that placed the combustion chamber and exhaust of his first liquid-fuel rocket at the top, with the heavy fuel and oxidizer tanks at the bottom. His reasoning was that, like a pendulum bob hanging from its pivot, a mass suspended below the thrust would naturally swing back beneath it. Having a rocket engine pointed at its own fuel clearly invites problems, but Goddard thought the trade-off was worth it.
He was mistaken, falling for what would later be called the pendulum rocket fallacy: the thrust vector tilts along with the rocket’s body, so hanging the mass below the engine provides no restoring torque at all. He was a brilliant engineer and experimenter, however, and corrected this erroneous mental model in rockets built just a couple of months later.
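The fallacy is easy to demonstrate with a quick numerical check (a sketch of my own, not anything from Goddard’s notes): if the thrust acts along the rocket’s body axis and passes through the center of mass, the torque about that center is zero whether the engine sits above or below it, at any tilt angle.

```python
import math

def torque_about_cm(engine_offset, thrust, tilt):
    """2-D torque (z-component) about the center of mass for an engine
    mounted at `engine_offset` along the body axis, thrusting along
    that same axis. `tilt` is the body tilt from vertical in radians."""
    # Body axis unit vector after tilting
    axis = (math.sin(tilt), math.cos(tilt))
    # The engine's position relative to the CM rotates with the body...
    r = (engine_offset * axis[0], engine_offset * axis[1])
    # ...and so does the thrust vector, which pushes along the body axis
    f = (thrust * axis[0], thrust * axis[1])
    # 2-D cross product r x f gives the torque about the CM
    return r[0] * f[1] - r[1] * f[0]

# Engine 2 m above the CM vs. 2 m below it, body tilted 20 degrees:
for offset in (+2.0, -2.0):
    print(torque_about_cm(offset, thrust=100.0, tilt=math.radians(20)))
# Both print zero (up to rounding): a high-mounted engine buys no
# restoring torque, because the thrust tilts with the body.
```

The numbers here (2 m offset, 100 N thrust, 20° tilt) are arbitrary illustration values; the cross product vanishes for any of them, which is exactly what the fallacy overlooks.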
My own error in this area involved helicopters and their stability. Decades ago, I was surprised to learn that they are inherently very unstable. A friend—an engineer who flew model ’copters—gave a simplified explanation: The main mass of the helicopter may hang from the rotor disk, but when that disk tilts, its thrust vector tilts in the same direction. The lateral component of that vector causes more tilt and, unfortunately, also reduces the lift component. The process quickly runs away, and the helicopter seems to want to dive to the ground.
It’s similar to an inverted pendulum, and just the opposite of what my intuition would have predicted. It explains why pilots of non-stabilized helicopters describe the experience as balancing on the top of a pole: they must sense small accelerations and correct before motions build up.
While the explanation corrected and improved my mental model, my intuition was woefully unable to handle the claim that the aircraft below was relatively stable and easy to fly.
The Hiller “Pawnee” flying platform of the 1950s used ducted fans beneath the engine and payload/pilot. Despite its appearance, it is easier to fly than a helicopter. (public domain image via Wikipedia)
This aircraft certainly does look like an inverted pendulum, though that perception is actually another fallacious mental model.
One explanation comes from the powerful engineering analysis technique of imagining small increments of time and motion. If the flying platform tilts, the thrust vector of the fans tilts in the same direction, just as with the helicopter. However, in this instance the thrust is applied below the center of mass rather than above it, so the tendency to tilt further is opposed rather than reinforced: the disturbance is not self-reinforcing.
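That small-increments-of-time analysis can be caricatured in a few lines of code (a toy model of my own devising, not a real flight-dynamics simulation): treat the tilt angle as changing at a rate proportional to the tilt itself, with the sign of the feedback set by the configuration, and step it forward in small increments.

```python
def simulate_tilt(gain, theta0=0.01, dt=0.01, steps=300):
    """Toy model: tilt rate proportional to tilt. A positive gain
    caricatures the helicopter's self-reinforcing tilt; a negative
    gain, a self-correcting arrangement. Returns the tilt angle
    (radians) after `steps` small time increments."""
    theta = theta0
    for _ in range(steps):
        theta += gain * theta * dt  # Euler step: d(theta)/dt = gain * theta
    return theta

print(simulate_tilt(gain=+2.0))   # grows: a 0.01 rad nudge runs away
print(simulate_tilt(gain=-2.0))   # shrinks: the same nudge dies out
```

The gain values are arbitrary; the point is only the sign of the feedback. With positive feedback a tiny disturbance grows exponentially, which is why the unstabilized helicopter feels like balancing on a pole, while negative feedback lets the same disturbance decay.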
I don’t believe the flying platform arrangement is inherently stable, but it is much less unstable than a helicopter. I once flew a flying platform simulator in a museum, and it was indeed straightforward to control.
So, let us acknowledge the power of our engineering intuition and salute our mental models as immensely useful guides. But let’s also remain vigilant in remembering that even the best engineers sometimes get things wrong, and it’s essential to be willing to make corrections.