How the defaults around you are shaping what you do and who you are, and how you can rethink them.
Imagine this. You’re at the supermarket till. The checkout assistant is picking your items up off the conveyor belt, scanning them, and placing them to one side. What do you do next?
If you live in the UK, until a few years ago you’d have put your shopping in a number of plastic bags, freely available at the end of the till. And you’d have been in good company, too. In 2014, customers in major supermarkets in England went through more than 7.6 billion single-use carrier bags.
A year later, the government introduced a small charge for them, and the impact was transformative: the number of bags used since then has plummeted by 95%, and nearly £180 million ($235 million) has been raised for good causes from the revenue collected.
Now think back and ask yourself why you, and so many others, used those plastic bags. The answer is quite simple: because they were there. This is an example of a default, a pre-set course of action that takes effect if you don't specify a different choice. When the government made it even mildly inconvenient to use plastic bags (the charge started at just 5p, or 7 cents, and has since risen only to 10p, or 14 cents), we modified our behaviour. Overnight, the pre-set option changed from free bags to a charge. These days you probably plan ahead and bring something with you to carry your shopping home in.
There are numerous other examples. Since 2012, UK employers have gradually been required to enrol their eligible workers into a workplace pension scheme automatically. By making enrolment the default, so that employees have to opt out rather than opt in, more than 90% of eligible workers are now building up savings and putting themselves on course for a stable retirement, compared with less than 50% previously. By contrast, in the US, where automatic enrolment is not yet the default, an estimated half of the nation's workforce has no access to a workplace retirement savings plan.
Defaults shape the way we are and the things we do without us necessarily realising that’s the case. They are one of the most potent influences on our behaviour.
But there’s a dark side to defaults. Not all of them have our best interests at heart. Some are designed to control, exploit or extract. Perhaps some of these are affecting you right now, too.
And it gets even more sinister than that.
If a default option is simply what happens when you do nothing, then, to quote the godfather of nudge theory, Richard Thaler, “Sometimes even when you do nothing, something happens.” This can exclude large swathes of the population, and the impact on women and minorities can be devastating.
One example is male bias. In her brilliant book Invisible Women, Caroline Criado Perez describes a gender "data gap": data gathered predominantly from men is used to build products that consequently exclude women. This is evident in everything from voice recognition software that doesn't recognise female voices and phones designed for male hands, to the ways cities are structured to favour men, leaving women feeling unsafe and even impeding their access to work, a social life and transportation.
A similar data gap exists for ethnicity. In 2020, Twitter apologised after its image-cropping algorithm was found to be automatically focusing on white faces over black ones. Zoom has also been accused of racial bias, following reports that black people were fading into their virtual backgrounds. The reason? The default assumption for both was that the user was white.
The consequences can be severe. Joy Buolamwini, a Ghanaian-American computer scientist and digital activist based at the MIT Media Lab, has found that facial recognition systems trained largely on white male faces carry a built-in racial bias that can cause them to misidentify people of colour. With facial recognition increasingly used in criminal investigations, this could result in someone being unfairly labelled a criminal for life.
Data sets can be updated, and gaps can be filled with promising emerging techniques such as synthetic data. But this is easier said than done: it assumes we already understand the biases inherent in our data sets, and the blind spots that persist in areas like design, software development, town planning and policy making.
Major disruptions and times of crisis have proved fertile ground for innovation and shifts in the way we think. The 2008 financial crash led to a flurry of groundbreaking challenger banks and innovative products that promised to democratise financial services. The United Nations and the European Union were both born in response to the devastating impact of World War II.
COVID-19 has been another such shock. Despite its horrors, the pandemic has also reflected our ingenuity, innovation and ability to move and work quickly, together. It has forced us to question defaults that, in retrospect, were acting as barriers to progress. For example, we saw unprecedented global collaboration to get vaccines developed and approved in record time, slashing the period from two years to a remarkable eight months.
At Brink we have seen this shift play out across much of our work. By tearing apart traditional processes, we’re able to do deals and realise fresh ideas within a matter of weeks. And like so many other organisations, we’re now a globally distributed, remote team with online working as our default — a concept that was unthinkable two years ago.
At a local level, we're seeing innovators question defaults and use local resources to address their own unique needs.
This strikes at the heart of Behavioural Innovation: the idea that it is people who innovate, and that if we want innovation to succeed, we need to be intentional about designing with people in mind.
We now have the opportunity to think intentionally about the defaults around us: recognising they exist and changing them to shape the behaviour we want to see. The seismic shifts caused by the pandemic allow us to do this at a fundamental level.
Governments on both sides of the Atlantic are talking about building back better. The question is: how will we harness defaults to create a world that works better for everyone?
Further reading
More on defaults from Richard Thaler himself