Effective leaders recognise the limits of their knowledge


When I was much younger and editing an economics journal, I published an article by a distinguished professor — more distinguished, perhaps, for his policy pronouncements than his scholarship. At a late stage, I grew suspicious of some of the numbers in one of his tables and, on making my own calculations, found they were wrong. I rang him. Without apology, he suggested I insert the correct data. Did he, I tentatively enquired, wish to review the text and its conclusions in light of these corrections, or at least to see the amended table? No, he responded briskly.

The incident shocked me then, but I am wiser now. I have read some of the literature on confirmation bias: the tendency we all have to interpret evidence, whatever its nature, as demonstrating the validity of the views we already hold. And I have learnt that such bias is almost as common in academia as among the viewers of Fox News: the work of John Ioannidis has shown how few scientific studies can be replicated successfully. In my inexperience, I had foolishly attempted such replication before the article was published.

It is generally possible to predict what people will think about abortion from what they think about climate change, and vice versa; and those who are concerned about wealth inequality tend to favour gun control, while those who are not, do not. Why, since these seem wholly unrelated issues, should this be so? Opinions seem to be based more and more on what team you belong to and less and less on your assessment of facts.

But there are still some who valiantly struggle to form their own opinions on the basis of evidence. John Maynard Keynes is often quoted as saying: “When the facts change, I change my mind. What do you do, sir?” This seems a rather minimal standard of intellectual honesty, even if one that is no longer widely aspired to. As with many remarks attributed to the British economist, however, it does not appear to be what he actually said: the original source is Paul Samuelson (an American Nobel laureate, who cannot himself have heard it) and the reported remark is: “When my information changes, I alter my conclusions.”

There is a subtle, but important, difference between “the facts” and “my information”. The former refers to some objective change that is, or should be, apparent to all: the latter to the speaker’s knowledge of relevant facts. It requires greater intellectual magnanimity to acknowledge that additional information might imply a different conclusion to the same problem, than it does to acknowledge that different problems have different solutions.

But Keynes might have done better to say: “Even when the facts don’t change, I (sometimes) change my mind.” The history of his evolving thought reveals that, with the self-confidence appropriate to his polymathic intellect, he evidently felt no shame in doing so. As he really did say (in his obituary of another great economist, Alfred Marshall, who he suggests was reluctant to acknowledge error): “There is no harm in being sometimes wrong — especially if one is promptly found out.”

To admit doubt, to recognise that one may sometimes be wrong, is a mark not of stupidity but of intelligence. A higher form of intellectual achievement still is that described by F Scott Fitzgerald: “The test of a first-rate intelligence,” he wrote, “is the ability to hold two opposed ideas in the mind at the same time and still retain the ability to function.”

The capacity to act while recognising the limits of one’s knowledge is an essential, but rare, characteristic of the effective political or business leader. “Some people are more certain of everything than I am of anything,” wrote former US Treasury secretary (and Goldman Sachs and Citigroup executive) Robert Rubin. We can imagine which politicians he meant.

 

This article was first published in the Financial Times on August 5th, 2015.
