OPINION | 21 November 2023

Crawford Hollingworth: Eyes wide open



The Impact columnist examines ‘sludge’ and other negative uses of behavioural science, and how researchers can keep sludge at bay.


I have written previously about the need to use behavioural science carefully; to not abuse its power to influence behaviour. But now the darker side of behavioural science – the world of dark patterns and sludge – is becoming increasingly sophisticated and we need to work harder, as researchers, to identify and call out these dark patterns.

Thirteen years ago, user experience (UX) consultant Harry Brignull sat at his kitchen table to collect examples of what he called ‘dark patterns’ – deceptive or manipulative online techniques that create so much friction that they obstruct our best efforts to do something in our interest, such as cancelling an unneeded subscription, making a payment, completing an application form, or selecting a product or service that best meets our needs.

Dark patterns, also known as sludge, are increasingly common. EU researchers recently found that 40% of 399 websites they surveyed contained manipulative practices – from fake countdown timers to ‘pressure sell’ and hidden information, to directing consumers to suboptimal choices. It’s everywhere.

Not only is it more widespread, but it is evolving rapidly and becoming more sophisticated. Better metrics, easier tracking and the ability to run A/B tests all mean organisations can apply and test dark patterns more easily. Technology keeps developing, allowing the more unscrupulous to deploy dark patterns through creative UX design or ever more invasive tracking. With more and more of our lives online, we are exposed to them constantly.

The widening and deepening impacts of sludge

Sludge isn’t always intentional. Sometimes, a product or service provider might genuinely believe they are offering something in a way that matches consumer needs. Other times, there might be unintentional friction caused by poor design or clunky technology that makes a consumer journey more difficult than it needs to be.

The effects on the consumer are the same, however, and can include practical impacts – such as loss of time and money, as well as being saddled with suboptimal products and services – and psychological impacts, such as frustration, anger, and even shame.

Many consumers are increasingly vulnerable, given the cost-of-living crisis. They are desperate for deals and value, and financial stress may lead to poor judgement and choices.

There are new types of dark patterns and sludge: some apply existing techniques to new online contexts – for example, in voice assistants and smart devices such as smart TVs and speakers; others leverage existing techniques, such as pressure selling, in new ways.

Research published this year by Johanna Gunawan and her colleagues analysed 57 smart devices on the ‘internet of things’ – thermostats, doorbells, fridges, TVs and speakers – and found that the average device had 20 instances of dark patterns; most had at least three. One smart TV had 60! One driver of so many dark patterns was the devices’ limited interfaces, which leave users with little visibility into their settings and little scope to change them.

Pressure-selling techniques that leverage feelings of scarcity and urgency have become ubiquitous, but new types are evolving to catch consumers out. We are all used to being told there is ‘only one left’ to encourage us to buy – but how would you react if you received an email, supposedly from a chief executive, telling you that stock of a product you recently browsed was ‘running low’ and no more was expected in, so you might be wise to purchase now?

We are also seeing upselling techniques that try to steer us into buying a more lucrative version of a product or service. For example, while monthly subscriptions to products and services have become ubiquitous, defaulting the consumer to a subscription rather than a one-time purchase could be deemed manipulative – and even deceptive. Large and small online retailers fall foul of this. When combined with one-click ordering, there is also a danger that consumers might purchase a subscription without realising.
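To make the mechanism concrete, the subscription-default pattern above can be sketched in a few lines of hypothetical checkout logic – the option names and functions here are illustrative, not drawn from any real retailer:

```python
from dataclasses import dataclass

@dataclass
class PurchaseOption:
    label: str
    recurring: bool  # True for a repeat-billing subscription

def neutral_default(options):
    """Fair choice architecture: preselect the one-time purchase."""
    one_time = [o for o in options if not o.recurring]
    return one_time[0] if one_time else options[0]

def dark_default(options):
    """The dark pattern: silently preselect the more lucrative subscription."""
    recurring = [o for o in options if o.recurring]
    return recurring[0] if recurring else options[0]

options = [
    PurchaseOption("Subscribe & save (monthly)", recurring=True),
    PurchaseOption("One-time purchase", recurring=False),
]

print(neutral_default(options).label)  # One-time purchase
print(dark_default(options).label)     # Subscribe & save (monthly)
```

The two functions render the same options; only the preselection differs. That single design choice is what, combined with one-click ordering, can tip a hurried consumer into a recurring charge they never meant to accept.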

Additionally, there have been cases where a premium version of a service has been added to the consumer’s basket automatically, or by default. We have seen this for cross-channel ferry bookings.

Who is steering?

Researchers are also becoming concerned about ‘hypernudging’ – a predictive, dynamic system of nudges in an online environment. These nudges change in response to feedback from consumers, adapting in real time, and can anticipate behaviour by drawing on a consumer’s data. Sometimes this can be useful for consumers; but it can also be harmful, particularly as artificial intelligence becomes more common.

Viktorija Morozovaite, postdoctoral researcher in competition law at Utrecht University, wrote in her 2021 paper Two sides of the digital advertising coin: putting hypernudging into perspective: “The aim [of hypernudging] is to reach the right user, with the right message, by the right means, at the right time, as many times as needed. This process may be visualised as a staircase: it is no longer about a single step placed by the choice architect to steer the user, but multiple steps that might come in different shapes, at different times, all with the goal to gently push them towards a specific outcome.”

For example, voice assistants can steer our purchasing by ordering from a predetermined preferred provider, framing a preferred product to match a consumer’s preferences, or even adjusting recommendations depending on a consumer’s mood. While this could be beneficial – and even perceived as useful – it can alter perception of the choices available, and may not steer people to the best price or quality choice.

Keeping sludge at bay

Regulators play an essential role in consumer protection. In the UK, the Competition and Markets Authority has recently cracked down on false countdown timers for pressure selling and false scarcity messaging. There is also a duty on organisations to ‘self-police’.

Some organisations, particularly in the financial sector, are conducting their own ‘sludge audits’. In Australia, the New South Wales Behavioural Insights Unit has produced its own guide to reducing sludge in government websites – simple tools and checks that make the consumer journey smoother. Researchers, regulators, companies and governments alike are using behavioural science as a detective lens to identify and call out these practices.

As researchers, we are in the perfect position to identify dark patterns and sludge – intentional and unintentional. We hear those agonising accounts of consumer experience first hand; we are often auditing the online consumer journey and can spot those bottlenecks and moments of pure friction. By arming ourselves with behavioural science as a lens – a powerful tool to analyse sludge and show why it is harmful or unfair – we can demonstrate how easily consumer decision-making can be steered and influenced.

Whether the sludge is intentional or unintentional, all consumer-facing companies would be wise to carry out audits of their consumer journey or processes, to identify and remove sludge – to act as behavioural detectives.

Implications

  • Dark patterns and sludge are becoming more sophisticated as more of our lives are online and as technology rapidly advances
  • Behavioural science is an essential lens to identify and illustrate why these techniques are harmful to consumers
  • Researchers and consumer-facing companies can carry out audits of consumer journeys to identify and remove sludge.

Crawford Hollingworth is co-founder at The Behavioural Architects.

This article was first published in the October 2023 issue of Impact.
