The wisest choices depend on instinct and careful analysis
Flash Boys by Michael Lewis was summer reading for me and many others. Its engaging style but abrupt conclusion took me back to Moneyball, the book Lewis wrote as a distraction from his attacks on the financial services industry. Even after downloading the rules of baseball, I could not understand what was going on. But I got the general drift. Statistical analysis of the records of players proved a better guide to team selection than the accumulated wisdom of experienced coaches.
Another lesson, important for business strategy, was that the advantage gained by Lewis’s sporting heroes – Billy Beane and the Oakland A’s – from sabermetrics, the statistical study of baseball, did not last long. If the only source of competitive advantage is better quantitative analysis, such an advantage can be rapidly and accurately imitated.
The commercial shelf life of mechanised innovation is short. The same is true of quant strategies in the finance sector. Enduring benefit from such methods can only be obtained in fields such as medicine, where no competitive element is involved – but there the absence of competition enables doctors to resist the application of computer-based diagnostic tools.
Yet at the same time, another group of recent books proclaims the virtues of instinctive decision-making. Malcolm Gladwell’s Blink begins with accounts of how experts could immediately identify the Getty kouros as a fake, even though it had supposedly been authenticated through extended scientific tests. Gary Klein has for many years monitored the capabilities of experienced practical decision makers – firefighters, nurses, military personnel – who make immediate judgements which are vindicated by the more elaborate assessments only possible with hindsight.
Of course, there is no real inconsistency between the two propositions. The experienced coaches disparaged by sabermetrics enthusiasts were right to believe they knew a lot about spotting baseball talent; they just did not know as much as they thought they did. The art experts and firefighters who made instantaneous, but accurate, judgements were not hearing voices in the air. But no expert can compete with chemical analysis and carbon dating in assessing the age of a work of art.
There are two ways of reconciling the judgement of expertise with the power of analysis. One takes the worst of both worlds and combines the overconfidence of experience with the naive ignorance of the quant. Bogus rationality seeks to objectivise expertise by fitting it into a pre-specified template. It is exemplified in the processes by which interviewers for jobs, and managers who make personnel assessments, are required to complete checklists explaining how they reached their conclusion in the light of prescribed criteria. The reality of these exercises is that the interviewer or manager forms a judgement about the individual and completes the form to ensure consistency with the assessment already made. That exercise is a waste of time, but no worse; more serious damage is done when an initial evaluation is influenced not just by the judgement of the appraiser but by the ease with which the evaluation can be defended by reference to the imposed criteria.
Similar risks arise when models are developed based on elaborate spreadsheets, and the modellers then fill the empty cells by guessing what the missing numbers should be. They will, as a rule, seek expert advice on making up the numbers, but since most are known at best only within wide ranges, they can readily be selected to yield whatever answer the problem setters want: one consistent with the conclusion that the decision maker has already reached. And so we have a large consultancy business of transport modellers, environmental experts, risk managers and impact assessment modellers, the front line of an army that has turned evidence-based policy into policy-based evidence.
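To make the mechanics concrete, here is a deliberately toy sketch of such a spreadsheet exercise. Everything in it is invented for illustration: the function, the input names and the ranges are hypothetical, not drawn from any real appraisal scheme. The point is only that when every input is defensible anywhere within a wide range, the modeller's selections, rather than the evidence, decide whether the project clears the bar.

```python
# Hypothetical illustration (all names and numbers invented): a toy
# project-appraisal model in which every uncertain input is "known"
# only within a wide but defensible range chosen by the modeller.

def benefit_cost_ratio(daily_users, minutes_saved, value_of_hour,
                       capital_cost_m, years=30):
    """Crude benefit-cost ratio: time savings valued in money over the
    appraisal period, divided by the capital cost (in millions)."""
    annual_benefit_m = daily_users * 365 * (minutes_saved / 60) * value_of_hour / 1e6
    return annual_benefit_m * years / capital_cost_m

# Each input below sits comfortably inside a plausible 'expert' range,
# yet the choice of numbers alone decides the verdict.
cautious = benefit_cost_ratio(daily_users=20_000, minutes_saved=3,
                              value_of_hour=8, capital_cost_m=500)
generous = benefit_cost_ratio(daily_users=60_000, minutes_saved=10,
                              value_of_hour=15, capital_cost_m=300)

print(f"BCR with cautious inputs: {cautious:.2f}")  # about 0.2 -> reject
print(f"BCR with generous inputs: {generous:.2f}")  # about 5.5 -> approve
```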
These procedures cloak often casual instinctive assessments in an appearance of objective justification. Instead of the worst of both worlds we should seek the best, combining the value of experienced judgement with the data-processing capabilities of information technology. We will never succeed in evaluating works of art, choosing candidates, managing risk or assessing major projects without the skills that are only available through experience; but that experience is always capable of being enhanced by the power of data analysis and the implementation of scientific techniques. True expertise can never provide a full objective justification of the judgements that emerge; to believe that it could is to misunderstand the nature of true expertise.