Confused about hunches
Phil Swann, Si Managing Director writes
The general election has left me confused. Confused about what the results tell us about the state of British politics, for sure. But also confused about the implications for what weight to place on different types of evidence in decision making; in particular how to balance instinct with hard evidence.
In the months, weeks and days leading up to the election the opinion polls told a consistent story: the election would be very close, with the possibility of either David Cameron or Ed Miliband leading some form of coalition or minority administration. Many of us – politicians, journalists, policy makers, consultants – acted on that basis.
Yet most of us also knew – on the basis of instinct or hunch – that there was no way that the electorate would put Ed Miliband in 10 Downing Street. In our guts we knew that wasn’t going to happen. Most of us, however, placed greater weight on the quantitative evidence, the polls.
If the post-election reports are to be believed, one of the people who bucked that trend was the Conservatives' election strategist Lynton Crosby, who believed his own private polling and his instincts.
This all reinforces my prejudice that gut instincts, feelings and hunches are all important bits of evidence that need to be taken into account in informing decision-making, alongside more methodologically robust qualitative and quantitative data.
But confusion begins to seep in when I recall what I think is a really instructive, but tragic, case study in decision-making. It concerns the often-cited crash of Eastern Air Lines flight 401 in the USA in the 1970s. The plane was on its final approach when a warning light in the cockpit alerted the crew that one of the landing wheels was not locked in the ‘down’ position.
It is perfectly possible to land a jet with one wheel out of action. It is a situation that pilots are trained for. But in this case the pilot and co-pilot did not believe the warning light. Their instincts told them that the light was faulty and the landing gear had operated normally.
The pilot and co-pilot spent valuable minutes fiddling with the light, hoping they could tackle the malfunction and thus substantiate their hunch. By the time they abandoned this approach it was too late: while all their attention had been focused on the cockpit light, the crew had failed to notice that the aircraft was continuing its descent. The plane crashed and 101 lives were tragically lost.
I understand that a similar series of events – hunches overriding more methodologically robust forms of evidence – meant that the impact of the partial nuclear meltdown at Three Mile Island in 1979 was far more severe than would otherwise have been the case.
What do I conclude from all this?
First, as a consultant I will always work with my hunches and gut instincts alongside other forms of evidence. That is, in part, what my clients are paying for.
Second, I will continue to encourage my clients to treat their hunches seriously too. But I will do so in a way which meets three tests. Hunches should sit alongside other evidence rather than override it. Hunches should be used to challenge rather than reinforce groupthink. And hunches should not be used as a form of procrastination or a defence against taking difficult decisions.
I have a hunch that is about right.