Friday, December 16, 2011

Lessons in cognitive bias from the 'day of infamy'

LAST week, the United States commemorated the 70th anniversary of the Japanese attacks on Pearl Harbor on Dec 7, 1941.

By William Choong, Senior Writer

To many outsiders, the country-wide commemorations so long after the event, and an expression of 'deep emotion' by Japan's Foreign Minister, might seem a tad overdone.

On a working visit to Pearl Harbor last year, however, I had a first-hand experience of the significance of the event. At Hickam airbase, bullet holes left by Japanese aircraft in the headquarters building have never been patched up - an apt reminder that the passage of time has not fully healed the emotional scars left by the attacks. The hulk of the battleship USS Arizona, which was sunk with 1,177 sailors on board, still lies at the bottom of the harbour and leaks 2.2 litres of oil daily.

The so-called 'day of infamy' is significant because of two ironies. For one thing, senior US officials knew about Japanese intentions to attack, thanks to a code-breaking system called Magic. But they refused to accept the data, reasoning that Japan would never risk arousing American anger - a move that would have been suicidal (it was). Second, the Japanese knew that attacking Pearl Harbor would only buy time before the US retaliated in force.

The burden of this irony fell on Admiral Isoroku Yamamoto. Typically portrayed in the US as a yellow-skinned warmonger, the Harvard-trained admiral warned his government against fighting the US. He admitted that the Pearl Harbor operation was 'conceived in desperation'.

Throughout history, there have been many other nameless and faceless Yamamotos who had to contend with the follies of their political superiors - follies that led to major intelligence failures and financial crises.

Writing in The March Of Folly, Pulitzer-winning historian Barbara Tuchman argues that such follies stem from 'wooden-headedness'. A common phenomenon throughout history, she adds, is the pursuit by governments of policies contrary to their own national interests.

The rulers of Troy dragged the 'suspicious-looking wooden horse' inside their walls; Charles XII, Napoleon and Hitler invaded Russia despite the disasters suffered by their predecessors; and King George III opted to coerce rather than conciliate the American colonies.

Indeed, the 20th century is replete with similar follies. Every major intelligence failure of the century - China's intervention in the Korean War in 1950, Egypt's surprise attack on Israel in 1973 and the fall of the Berlin Wall in 1989 - stemmed from a failure to 'connect the dots'.

Wooden-headedness - a stubborn refusal to accept the facts - was largely to blame. In 1950, General Douglas MacArthur was too full of hubris to accept that the Chinese would dare take on the American military machine. On Nov 9, 1989, CIA specialists were telling then US President George H.W. Bush why the Berlin Wall would not fall any time soon; at that point, another staff member asked the President to turn on the television, which was broadcasting the fall of the Wall.

By the 1990s, one would have thought that wooden-headedness in policymaking would have been eradicated, given decades of research by behavioural economists and psychologists into less-than-rational modes of decision-making. Their conclusion: man is not a computer-like utility maximiser who makes fully rational decisions, because his thinking is distorted by cognitive biases (a fancy term for wooden-headedness).

For example, people selectively use information that confirms their prejudices (confirmation bias); put a heavy weightage on recent events when predicting future probabilities (availability bias); and overvalue their own skills (overconfidence bias).

Such biases played a major role at the start of the 21st century.

Prior to the Sept 11 attacks, American officials were not unaware of plots to fly hijacked planes into buildings. But confirmation bias - the refusal to accept such a possibility - led to a 'failure of imagination', as the 9-11 Commission pointed out.

The 2003 invasion of Iraq was the result of availability bias - US officials affected by the trauma of Sept 11 sought to avoid a similar intelligence failure. In turn, this led to the overestimation of Iraq's weapons programmes.

Confirmation bias could well afflict American policymakers on the issue of Iran's nuclear programme. They could well be shrugging off disturbing information about Teheran's intentions - say, the repeated threat to use nukes against Israel - and as a result be underestimating the threat.

This does not mean that cognitive biases are altogether bad. After all, they are mental short cuts to help people make faster decisions. Mr Donald Rumsfeld, the former US defence secretary, had the commentariat in stitches when he cited the need to distinguish between the 'known knowns, the known unknowns and the unknown unknowns'.

At the very least, it is useful to be aware of what one doesn't know.

In the end, it is worth noting that for all the powers of artificial intelligence, Google and Apple's Siri, the quality of policy decisions of fallible men remains largely unchanged through the ages. As America's second president John Adams said: 'While all other sciences have advanced, government is at a stand; little better practised than three or four thousand years ago.'