Sunday, 4 August 2013

To Forgive Design

I've finished Petroski's "To Forgive Design", and very useful he was too in clarifying some ideas we share about the nature of design. To quote the book in summary of its content:
“It is imperative that the realistic prospect of failure be kept in the forefront of every engineer’s mind.”
The history of technology and engineering is littered with failures. When designers ignore this, and focus instead on past successes, they become complacent. Over a cycle of roughly thirty years, they collectively forget the limitations of a novel design technique as its use becomes commonplace; the technique is then pushed beyond its limits, leading to failure.

We learn from our life and professional experience about the limitations of technology, and we retain that information as concepts, to be applied by analogy or metaphor. A history of failure, told through specific and concrete examples, is instructive to the designer, as most of the causes of failure in the real world do not exist in the mathematical world of "engineering science". These causes undercut the assumptions which the theoretician must make to allow mathematical analysis. They may be divided into hardware and software failures, but most contain aspects of both the known and unknown limits of designs and of the uses to which they are put.

Failures are mostly founded in phenomena which are known about, but not well understood - the known unknowns. Our lack of scientific understanding is reflected in the design codes and margins of safety we apply to an always incomplete mathematical model. Knowing the extent to which one is operating beyond one's knowledge should inform a decision about suitable safety margins, but complacency and pressure from management conspire to remove the margins of safety which should reflect the designer's lack of knowledge. Pressure from risk-tolerant management on risk-averse engineers is commonplace, as a recent TCE article by Mohan Karmarkar discusses.
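
The point about margins absorbing the known unknowns can be sketched numerically. Below is a minimal Monte Carlo model of load versus strength; the normal distributions and the 10% and 5% scatter figures are hypothetical illustrations of my own, not taken from the book or from any design code, but they show how quickly the probability of failure climbs as the safety factor is trimmed:

```python
import random

def failure_probability(safety_factor, trials=100_000, seed=42):
    """Estimate failure probability for a toy load-vs-strength model.

    Illustrative assumptions (not from any design code): load and
    strength are normally distributed, with 10% and 5% coefficients
    of variation respectively.
    """
    rng = random.Random(seed)
    nominal_load = 100.0
    failures = 0
    for _ in range(trials):
        load = rng.gauss(nominal_load, 0.10 * nominal_load)
        strength = rng.gauss(safety_factor * nominal_load,
                             0.05 * safety_factor * nominal_load)
        if load > strength:
            failures += 1
    return failures / trials

for sf in (1.1, 1.5, 2.0):
    print(f"safety factor {sf}: P(failure) ~ {failure_probability(sf):.4f}")
```

With a factor of 2 the scatter is comfortably absorbed; at 1.1 the same scatter produces failures in a substantial fraction of trials, even though the "average" design is still notionally safe.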

The decision about whether and what to build is not made by engineers - considerations of cost, risk, economics, society, aesthetics and politics may dominate these decisions, and be fed back to designers as limitations on scope, budget, programme etc. People get the design which they are willing to pay for.

Materials in the real world do not always meet specification, and testing protocols to reject non-compliant inputs may be compromised. Similarly, construction and commissioning is never carried out exactly as the designer intended, operation and maintenance "develops" from the approach given in the O+M manual, and the plant may well come to be used for purposes other than those for which it was designed.

In recent discussions I have had about plant design, a few ideas have come up. They remind me by analogy of failures I have witnessed in the wider world, as does the process Petroski describes of irrational exuberance growing until catastrophe ensues.

These failures are referred to as "bubbles", or boom and bust cycles, and many of them match the cycle of "design bubbles" quite closely. The things people say which tip me off, as an investor, that an asset class might be subject to a bubble are as follows:

"You can't lose" - the idea of high, uninterrupted and irreversible growth in the value of anything ignores all that we have learned of the history of economic value. Things which have a high rate of growth in value are in general riskier than those with a low rate. Human nature means that high-growth assets can very rapidly become assets whose high rate of growth is based solely on a bandwagon effect. In the 17th century, tulip bulbs were subject to this effect, an episode now referred to as "tulipmania", most famously described in Mackay's "Extraordinary Popular Delusions..."

"The old rules don't apply" -  The dotcom crash came as no surprise to those who knew about other tech-stock crashes of the past, such as railway mania, and the stock market crash of 1929 (which owed much to bubbles in the prices of then-novel electrical, radio, automotive and aviation technology companies). The idea grows that these new technologies will mean that unlimited profits are available, and that consequently old models of pricing do not apply.  

But you can always lose in any game worth playing, and 21st century people are just people, whose natural inclinations are just as they always were. Greedy overconfidence is followed by disaster, followed in turn by the forgetting of the cause of disaster, and the substitution of wishful thinking for rational analysis which restarts the cycle. Unless we act in a way which does not come naturally to humans and remember the past, we repeat it.

Another idea which has come up in several forms is that design based on computer models (models so complex that their users cannot fully understand the provenance of their outputs) is so much better than the old methods that margins of safety can be cut to the bone, and that management should not come back to operational staff to ask for debottlenecking and optimisation exercises.


Clever computer programmes devised by highly numerate engineering graduates turned finance wonks were in large part responsible for the depth of the most recent crash. Some of these programmes in fact attempted to apply versions of physical science and engineering formulae to financial calculations. Economics is not only not a science, it is irrational nonsense, founded in obviously false axioms of rational behaviour by humans as a mass. However, betting the world economy on substituting an equation from a completely unrelated field into an automated stock trading programme is less rational still.

As soon as people start telling me that they can use the shape of stock market price graphs to predict where they will go next, I smell cow. Similarly, anyone who hears claims that computers can do anything better than people, other than handling complex but essentially dumb tasks such as pinch analysis, should walk away. Computers are not creative, and engineering is a creative activity. Therefore computers cannot engineer.

The cleverest computer simulation in engineering is just a model, based on applied maths and science, which does not fully describe the thing being modelled, and even the best software written contains errors of the order of 4%. Having more faith in such models than in the professional judgement of a group of experienced engineers is foolish.
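
A little arithmetic makes the point. In this sketch (my own hypothetical numbers, merely echoing the 4% figure above), a model that underestimates the true load by 4% is harmless behind a generous margin but fatal behind one cut to the bone:

```python
# Illustrative only: a small systematic model error interacting with
# the safety factor applied to the model's output.

def is_safe(true_load, model_error, safety_factor):
    """Design to the model's (underestimated) load prediction,
    then check the resulting capacity against the true load."""
    predicted_load = true_load * (1 - model_error)
    design_capacity = safety_factor * predicted_load
    return design_capacity >= true_load

true_load = 100.0
error = 0.04  # the model underestimates the load by 4%

print(is_safe(true_load, error, 1.5))   # generous margin: True
print(is_safe(true_load, error, 1.02))  # margin cut to the bone: False
```

The design capacities work out at 144 and 97.92 against a true load of 100: the same 4% error is invisible in the first case and a failure in the second.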

Petroski notes how professional researchers' focus on the very recent past, and the move away from paper to electronic records, made it difficult for him to fully explore the history of failure. Mohan Karmarkar's TCE article points out that histories of chemical engineering accidents on the internet seem to go back only twenty years, and that the IChemE's Accident Database is now both out of date and hard to obtain. The only useful records of the limits of design seem to be in the minds of highly experienced designers, so they cannot be programmed into simulation programmes.

I do however have sympathy with the idea that management should not come back to operational staff and ask for things on the plant to be tightened still further, not because a simulation-informed design is better, but because the idea that it is means that safety margins have already been cut in the design process.

Design starts in the mind's eye, as Ferguson notes, just as it has for thousands of years. That starting concept comes not from science or mathematics, but from the designer's ingenuity, experience and ability to see analogies.

All that has changed to allow technology to progress is that the designer's mind's eye envisages concepts based on a more sophisticated state of the art, more powerful tools have been devised to allow us to winnow good options from bad, and smarter information storage devices have been produced to allow us to better learn from our past mistakes.

Engineering is in essence the same activity as it was before it was even known by that name. Unlike the professional researcher who needs to give only the most recent references, designers can go back to the ancients, as Petroski does to great effect.
