Effective leaders recognise the limits of their knowledge

When I was much younger and editing an economics journal, I published an article by a distinguished professor – one perhaps more distinguished for his policy pronouncements than his scholarship. At a late stage, I became suspicious of some of the numbers in one of his tables and, on checking, discovered they were wrong. I rang him. Without apology, he suggested I insert the correct data. Did he, I tentatively enquired, wish to review the text and its conclusions in the light of these corrections, or at least to see the amended table? No, he responded briskly.

The incident shocked me then; but I am wiser now. I have read some of the literature on confirmation bias: the tendency we all have to interpret evidence, whatever its nature, as demonstrating the validity of the views we already hold. And I have learned that such bias is almost as common in academia as among the viewers of Fox News: the work of John Ioannidis has shown how few scientific studies can be replicated successfully. In my inexperience, I had foolishly attempted such replication before the article was published.

In the modern world, opinions seem to be based more and more on what team you belong to and less and less on your own assessment of the facts. It makes no sense that it is so easy to predict what people will think about abortion from what they think about climate change, and vice versa; or that those who are concerned about income and wealth inequality tend to favour gun control, while those who are not do not.

But there are still some who valiantly struggle to form their own opinions on the basis of evidence. Keynes is widely quoted as having said ‘when the facts change, I change my mind. What do you do, sir?’ This seems a rather minimal standard of intellectual honesty, even if it is one no longer widely aspired to. But, as with many remarks attributed to Keynes, it does not appear to be what he actually said: the original source is Paul Samuelson (who cannot himself have heard it) and the reported remark is ‘when my information changes, I alter my conclusions’.

There is a subtle but important difference between ‘the facts’ and ‘my information’. The former phrase describes some objective change which is, or should be, apparent to all; the latter refers to the speaker’s knowledge of relevant facts. It requires greater intellectual magnanimity to acknowledge that additional information might imply a different conclusion to the same problem than it does to acknowledge that different problems have different solutions.

But Keynes might have done better to say ‘even when the facts don’t change, I (sometimes) change my mind’. The history of his evolving thought reveals that, with the self-confidence appropriate to his polymathic intellect, he evidently felt no shame in doing so. As Keynes really did say (in his obituary of another great economist, Alfred Marshall, who he suggests was reluctant to acknowledge error), ‘there is no harm in being sometimes wrong – especially if one is promptly found out’.

To admit doubt, to recognise that one may sometimes be wrong, is a mark not of stupidity but of intelligence. A still higher form of intellectual achievement is that described by Scott Fitzgerald: ‘the test of a first-rate intelligence’, he wrote, ‘is the ability to hold two opposed ideas in the mind at the same time and still retain the ability to function’. The capacity to act while recognising the limits of one’s knowledge is an essential, but rare, characteristic of the effective political or business leader. ‘Some people are more certain of everything than I am of anything’, wrote former Treasury Secretary (and Goldman Sachs and Citigroup executive) Robert Rubin. We know who he meant.