A decade ago, the psychologist Rebecca Lawson published a lovely paper on "cycology". She posed a simple question: could people draw the basics of a bicycle?

About 40% of participants, including those confident in their ability to sketch pedals and forks, found it impossible. Fundamental errors included a bike frame linking the front and back wheels, making it impossible to steer, and the chain looping round both wheels (ditto). Even cyclists struggled. "It seems that many people have virtually no understanding of how bicycles work... despite bicycles being highly familiar and most people having learnt how to ride one," Lawson, from the University of Liverpool, concluded.

Our personal knowledge, then, of how even everyday objects function is sketchy and shallow. We are individual ignoramuses who somehow manage to pool intelligence, and then brazenly bask in the collective glory. Lawson's finding is writ large in The Knowledge Illusion, a recent book that exposes us for the intellectual shams we are. The authors, psychologist Steven Sloman and marketing researcher Philip Fernbach, argue that knowledge is a collective enterprise, yet each of us overestimates how much of it we personally hold. Intelligence comes from without rather than within, hence the book's subtitle: Why We Never Think Alone.

Before Lawson wheeled out her bicycle challenge, the Yale psychologist Frank Keil had already been cataloguing our shaky grasp of how commonplace objects such as zips, mobile phones and flush toilets worked. Almost universally, people thought they could explain them. But when they actually sat down and tried to articulate the mechanisms, most were stumped. Keil and his colleague, Leonid Rozenblit, called it the "illusion of explanatory depth", defining it like this: "Most people feel they understand the world with far greater detail, coherence and depth than they really do." We do not reflect upon or double-check our knowledge nearly enough.

Research on this phenomenon also suggests that people become humbler about their knowledge once their deficiencies are exposed, which is surprisingly relevant to the way we think about policy. Prof Fernbach and colleagues have previously found that voters who adopt extreme positions on issues such as how healthcare should be financed, or a trading system for carbon emissions, become more moderate once they are asked to explain the policies. Being forced to engage in "causal reasoning" can, it seems, temper hot thinking.

Here, it is also worth mentioning the Dunning-Kruger effect: the least competent tend to be the most confident in their abilities. Put another way, the most ignorant are the least aware of their shortcomings. To a degree, this makes sense: it is only by gaining knowledge that the gaps become visible. Conversely, very competent individuals are prone to underestimating their abilities. All of this is worth bearing in mind when a politician championing a cause promises a paradise without caveats.

Keil has argued that, when asked to explain things, we do not reason causally (which explains why we are so bad at it) but instead pluck our answers from intuition. That is because the world is a massively complicated web of interrelated things and events, steeped in causal complexity. Our recourse to intuition, the instant retrieval of information better known as gut instinct, is a sane strategy for reducing cognitive overload.

Technology, of course, allows greater and quicker access to ever more information. Every Google search, and each answer returned, perpetuates and intensifies a person’s belief that he or she is an underappreciated genius. Remember, though, it’s an illusion: technology simply bolsters the fallacy that each of us is an Einsteinian beacon in a world of stupidity.

The writer is a science commentator.

© The Financial Times 2017
