Jeff French writes: Short-cuts are not just a lesson in geography!

I have written that we often act without a command of all the facts. This is quite natural. Our ancestors on the African savanna may have had to make split-second decisions simply to survive. Hundreds of thousands of years of evolution mean that even now, in more sedate settings with more time to consider things, we still rely on mental short-cuts.

Thus short-cuts are not just a physical way to save time when travelling. They also describe how our brains process information.

Why is this important?

It’s because any short-cut is likely to lead to some type of bias. As a result, we need to understand how these biases occur so we can take them into account when designing change programmes. They may even save us time in securing change if we know enough about how they apply in a specific context.

These short-cuts fall into a number of types. Understanding them is therefore important to understanding why people behave the way they do.

Anchoring. People start with a ‘known’ anchor, adjust in the direction they think is appropriate, and are influenced by what they are familiar with. In other words, people from a big town and people from a small town are likely to make different judgment calls when estimating the population of a mid-size town. What are the practical implications? An organisation such as an emergency relief charity might create a range of donation options, because the higher the starting point on the list, the more people tend to give from the middle of the range. This is why a large sum will usually appear among the suggested donations.

Availability. People assess risks by how readily examples come to mind. Recent events have a bigger impact on behaviour than things in the past or potential risks in the future. Thus direct experience of a natural disaster such as flooding is likely to have more impact than talk about the threat of climate change.

Representativeness. We categorise people, events and risks according to their similarity to things we already know. For example, we are likely to assume a tall man is more likely to be a basketball player than a short one. This constant searching for categories and patterns, while incredibly powerful, also means we are biased towards spotting patterns where none exist. In the Second World War, the mapped pattern of V1 Doodlebug explosions in London was initially thought to reflect complicated guidance systems and targeting, until mathematicians demonstrated that the patterns people thought they saw were what a random distribution would produce, reinforced by patterns inherent in the maps that were used.
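A minimal simulation makes that last point concrete: scatter impact points uniformly at random over a gridded map and the counts per square still come out close to what a Poisson distribution predicts by pure chance, even though a few squares look conspicuously ‘targeted’. The sketch below is purely illustrative (the point count and grid size are arbitrary, not drawn from the wartime data).

```python
# Illustrative sketch: purely random "impact points" on a gridded map still
# produce apparent clusters, and the hits-per-square counts match the Poisson
# distribution that chance alone predicts. All numbers here are arbitrary.
from math import exp, factorial
import numpy as np

rng = np.random.default_rng(seed=1)

N_POINTS = 540      # number of simulated impacts (illustrative)
GRID = 24           # the map is divided into a 24 x 24 grid of squares

# Scatter points uniformly at random over the unit-square "map".
points = rng.random((N_POINTS, 2))

# Count how many points land in each grid square.
cells = (points * GRID).astype(int)
counts = np.zeros((GRID, GRID), dtype=int)
for x, y in cells:
    counts[x, y] += 1

# Compare the observed distribution of hits per square with the Poisson
# distribution that pure chance predicts (mean = impacts / squares).
mean_hits = N_POINTS / GRID ** 2
observed = np.bincount(counts.ravel())
expected = [GRID ** 2 * exp(-mean_hits) * mean_hits ** k / factorial(k)
            for k in range(len(observed))]

print("hits/square   observed squares   expected by chance")
for k, (obs, exp_) in enumerate(zip(observed, expected)):
    print(f"{k:>11}   {obs:>16}   {exp_:>18.1f}")

# The busiest square, which tempts the eye to infer deliberate targeting,
# is exactly the kind of outlier that randomness produces.
print("max hits in any one square:", counts.max())
```

Run it a few times and the ‘hot spots’ move around from run to run, which is the giveaway that they were never really there.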

Value Attribution. We also tend to imbue someone or something with qualities and values that they may not hold. This then influences and alters the way we perceive them and how we receive subsequent information about them. Once we have made such a categorisation, it becomes more difficult to change our mind, even in the face of objective evidence to the contrary. For example, when the virtuoso violinist Joshua Bell plays in the subway before a concert, most passers-by who do not know who he is fail to recognise him as a virtuoso!

As can be seen, short-cuts are inevitable. They save us vast amounts of time in processing information. They may have saved our ancestors’ lives against predators, but they also now help us process the vast amounts of information we receive in a complex modern society.

The biases they create shape our behaviour, but they also mean we can use them to change or reinforce behaviour. I will explore this in more detail in a future posting.

Professor Jeff French is a non-executive Director of The Campaign Company, a professor at Brunel University and a Fellow at King’s College, University of London. He founded and established the National Social Marketing Centre in England and is currently Chief Executive of Strategic Social Marketing Ltd. He will be a keynote speaker at the 2nd World Social Marketing Conference in Dublin in 2011.
