When you look at the performance indicators of the most used apps & websites, you’ll systematically see:
- Daily Active Users
How many users interact with your platform every day?
- Average Time on Site
How long do users stay on your platform?
- Stickiness or retention ratio
How many users return after their first experience?
- Viral coefficient
How many new customers are brought in by existing ones?
These are the metrics internet companies follow very closely to measure the success of their products. And there’s only one thing they have in common: they’re just a proxy for attention stealing.
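To make these definitions concrete, here is a minimal sketch of how such metrics could be computed. The event shape, field names and figures are illustrative assumptions, not any particular company’s schema.

```typescript
// Sketch of the engagement metrics above, over a hypothetical event log.
interface VisitEvent {
  userId: string;
  day: string; // e.g. "2017-06-01"
}

// Daily Active Users: distinct users seen on a given day.
function dailyActiveUsers(events: VisitEvent[], day: string): number {
  return new Set(events.filter(e => e.day === day).map(e => e.userId)).size;
}

// Stickiness: a common definition is DAU / MAU, i.e. the share of monthly users
// who also show up on a given day. 1.0 would mean everyone comes back daily.
function stickiness(dau: number, mau: number): number {
  return mau === 0 ? 0 : dau / mau;
}

// Viral coefficient: new users brought in per existing user.
// k = invites sent per user * invite conversion rate; k > 1 means the product
// grows through referrals alone.
function viralCoefficient(invitesPerUser: number, inviteConversionRate: number): number {
  return invitesPerUser * inviteConversionRate;
}

console.log(stickiness(1_200, 4_000));   // 0.3: 30% of monthly users are active daily
console.log(viralCoefficient(2.5, 0.3)); // 0.75: not yet self-sustaining growth
```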
Selling attention spans
… one nudge at a time.
Don’t be fooled by the “Don’t be evil”, “Bring the world closer” or “Live in the moment” corporate slogans: what internet giants are actually doing is bringing in the largest number of users, for the longest period possible, at the most frequent rate. And they’re having tremendous success at it.
Why is capturing your attention so important? Because selling eyeballs to advertisers is by far, their main source of revenue:
- 100% for Snapchat (sponsored lenses, geo-filters, stories & discover ads)
- 98% for Facebook (sidebar ads, sponsored stories & promoted posts)
- 87% for Google (Adwords & Adsense)
- 86% for Twitter (promoted tweets, accounts and trends)
To put it bluntly, the job of internet media companies is to sell advertisers “available human brain time”. You shouldn’t be surprised that, in 2017, the average internet user spent 15% of their waking life on social networks.
Now that we know why, the question is how: what are the techniques used by advertising giants to keep us coming back for more?
Hijacking our psychological vulnerabilities
… so that we keep fiending for likes.
In recent years, a forgotten psychology school called behaviorism re-emerged as an applied discipline deployed by businesses, governments & social actors. Behaviorism is based on the premise that human behaviour is best understood as a function of incentives and rewards: modify the environment, and you can shape a person’s attitude.
“Understand the box and you understand the behaviour. Design the right box and you can control behaviour.” — Ian Leslie, journalist & brand strategist.
Today, “the box” we spend most of our time in is our phones & computers. That’s precisely why most Product Design and User Experience degrees include some sort of behaviour design course: learn how to exploit a user’s mental biases & inclinations and you’ll directly influence the daily choices they make.
Let’s take a look at some of the techniques they use.
Variable rewards
…also known as “The virtual slot machine”
In 1930, B.F. Skinner, a psychologist at Harvard University, placed a rat inside a box. The box had a lever that, once pressed, made some food drop. After knocking the lever a couple of times by accident, the rat learned: whenever it was placed in that box again, it went straight to the lever and pressed it.
Skinner deduced that this type of environmental conditioning could be applied to anyone, including humans. His device became known as the Skinner box, and his theory - that rewards reinforce behavior - became a cornerstone of behaviorism.
Now here comes the interesting part: in later experiments, Skinner discovered that if the rat got the same reward each time it pressed the lever, it would do so only when it was hungry. BUT, when the reward was variable (sometimes nothing, sometimes a lot), the rat just kept pressing the lever all day long. It turns out that making rewards unpredictable is a very efficient way to hook people on something.
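To see why variability is so much stickier, here is a small, purely illustrative simulation contrasting a fixed reward with a variable one that pays out the same amount on average; the numbers are made up, not taken from Skinner’s experiments.

```typescript
// Purely illustrative: fixed vs. variable reward schedules with the same average payout.

function fixedReward(): number {
  return 1; // always exactly one pellet
}

function variableReward(): number {
  // 10% chance of a 10-pellet jackpot, 90% chance of nothing: same expected value as above.
  return Math.random() < 0.1 ? 10 : 0;
}

const pulls = 100_000;
let fixedTotal = 0;
let variableTotal = 0;
for (let i = 0; i < pulls; i++) {
  fixedTotal += fixedReward();
  variableTotal += variableReward();
}

// Both schedules pay out roughly 1 pellet per pull on average...
console.log(`fixed average:    ${(fixedTotal / pulls).toFixed(2)}`);
console.log(`variable average: ${(variableTotal / pulls).toFixed(2)}`);
// ...yet it is the unpredictable one that keeps the lever (or the feed) being pulled.
```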
Have you ever mechanically opened Facebook just to check if you have new notifications? Or frantically refreshed your inbox with the secret hope that a new email will appear? If you did, you experienced first hand the capacity of variable rewards to trigger impulsive behavior.
Casinos know that very well and have massively expanded the space dedicated to slot machines, which are nothing but variable rewards dispensers. The result? Today, slot machines are twice as numerous as ATMs & generate more revenue in the US than movies, baseball, and theme parks combined.
“Every consumer interface is becoming like a slot machine: it’s about looping people into these flows of incentive and reward.” — Natasha Dow Schüll, Cultural Anthropologist at NYU
Allowing users to choose the frequency of notifications is a fairly easy feature to implement: why can’t we ask Instagram to show us new notifications only once per day? Or Twitter to batch notifications and nudge us only when 20 of them have stacked up? Simply because this goes against the economic imperative of maximising time on device: they want us to swipe, scroll and tap every day, multiple times per day.
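Nothing technical stands in the way: a once-a-day or every-20-notifications policy is a few lines of code. The sketch below is hypothetical; the class name and thresholds are invented for illustration.

```typescript
// Hypothetical notification batcher: instead of nudging on every event, deliver
// only once a batch of 20 has stacked up or a full day has passed.
class NotificationBatcher {
  private pending: string[] = [];
  private lastDelivery = Date.now();

  constructor(
    private readonly maxBatchSize = 20,                // "nudge me only every 20 notifications"
    private readonly maxDelayMs = 24 * 60 * 60 * 1000, // or at most once per day
  ) {}

  // Returns the batch to deliver, or null if we should stay silent for now.
  push(notification: string): string[] | null {
    this.pending.push(notification);
    const overdue = Date.now() - this.lastDelivery >= this.maxDelayMs;
    if (this.pending.length >= this.maxBatchSize || overdue) {
      const batch = this.pending;
      this.pending = [];
      this.lastDelivery = Date.now();
      return batch;
    }
    return null;
  }
}

const batcher = new NotificationBatcher();
console.log(batcher.push("Anna liked your photo")); // null: no buzz, no red dot yet
```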
It turns out the most efficient way to lock people in and short-circuit their rational judgement is to dispense random rewards on a variable schedule. Another example of this is the widespread “pull to refresh” gesture: apps are perfectly capable of continuously updating their content without any user intervention. So why mimic the slot machine’s lever? Because the “pull” action provides an addictive illusion of control that will push you to do it again.
Choice determination
… also known as “The framed will”
“Whoever controls the menu controls the choices.” - Tristan Harris, former Google Design Ethicist
Technology - by giving us a certain set of choices - conditions our behavior. We’re often lazy and don’t really take the time to think about what is not proposed, what’s outside the menu. Our life-choices are increasingly filtered through a few widespread apps that come with a specific array of options:
- Should I talk to this person?
Tinder - Yes | No, 100% based on carefully edited photos.
Real life - He’s kind of cute | Let’s give her a chance | I’ve got nothing to lose | He reminds me of someone | Hell no! | She’s got some charm…
→ An oversimplification of human relationships is induced by the swiping gesture: everything is based on looks & profile crafting.
- Are there any nice bars here?
Google Maps - 63 bars within a 3 km range, sorted by popularity.
Real life - There’s a bar that just opened near the canal | Do you hear the music coming out of that pub? | Let’s walk and see | Look, Jamie is here!
→ The overwhelming number of possibilities offered by crowdsourced apps leads to distraction, decision fatigue and paralysis.
- What’s happening in the world?
Twitter - Since your last visit, there have been 345 new tweets from 67 of the 452 accounts you follow. Here are a few suggestions based on your previous activity & similar accounts.
Real life - I’ll buy a newspaper | Let’s talk about it | Not sure if I wanna know | I’m gonna follow that debate on the radio.
→ The polarising effect of filter bubbles is dangerous: being totally separated from information that disagrees with your viewpoints is an intellectual dead end. Reality is very often much more nuanced.
Any pollster, restaurateur or marketer knows that options can be designed to influence choices. That’s actually the main purpose of A/B testing: let empirical data teach us how to present information in a way that leads the target to act as we want her to.
- Want to increase the subscription rate of the team plan?
Add a “most popular choice” tag on top of it.
- Want to force people to pick choice n°2?
Make choices n°1 & n°3 unreasonable or too extreme.
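For illustration, here is what a bare-bones A/B assignment and readout could look like; the variant names and conversion figures below are invented, not real data.

```typescript
// Bare-bones A/B test: deterministic assignment plus a conversion-rate readout.
type Variant = "control" | "most_popular_badge";

// Hash the user id so the same user always sees the same variant.
function assignVariant(userId: string): Variant {
  let hash = 0;
  for (const char of userId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0; // simple, stable string hash
  }
  return hash % 2 === 0 ? "control" : "most_popular_badge";
}

// Whichever presentation nudges more users into the desired action "wins".
function conversionRate(conversions: number, visitors: number): number {
  return visitors === 0 ? 0 : conversions / visitors;
}

console.log(assignVariant("user-42"));
console.log(conversionRate(187, 2_400)); // e.g. 7.8% picked the team plan in this variant
console.log(conversionRate(121, 2_380)); // vs. 5.1% in the control group
```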
Today, the “option provider” has far too much power. But awareness is a great beginning: by increasingly paying attention to the possibilities we’re given, we’ll eventually realise that they’re often not aligned with our true needs.
Social approval
… also known as “The love crave”
Facebook has put a lot of effort into their face recognition algorithms. And the results are impressive: apparently their models can recognise human faces with 98% accuracy & identify a person in one picture out of 800 million in less than five seconds.
To date, the main outputs of this technology are automatic tag suggestion on uploaded photos (“Is this John?”) and automatic notification of photos where you appear (even if you’re not tagged).
Why is this relevant? Because social belonging and recognition are among the strongest human motivations. By artificially increasing the number of times my face is tagged in a public space, Facebook relies on a universal human craving to nudge me back onto its platform. This combination of machine learning and clever design is made to multiply the frequency at which billions of people have their social approval put to the test.
The same reasoning applies to new profile pictures: because these updates are all about ego-boosting, they will be given special prominence in your friends’ news feed (versus status updates or photo uploads for example).
In the end, any signal of increased social vulnerability will purposely reach further and longer. The goal is to trigger a flow of likes & comments that will draw you in with remarkable efficiency…
Autoplay & infinite scrolls
… also known as “The force-fed treat”
Today, most online experiences are endless: YouTube, Netflix or Vimeo keep playing new videos as long as you don’t pro-actively click “stop”, and most social networks open on a never-ending feed of fresh posts.
But this wasn’t always the case; it’s just another way to make visitors stick. Social & media feeds are engineered to remove any reason you might have to leave the site: no action required, no time to reconsider, fresh content systematically pushed in front of your eyes. By shifting the effort from accessing new content to actively stopping the flow, you mechanically push visitors to over-consume.
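The mechanics are simple. Here is a stripped-down browser sketch of an infinite feed; the `fetchNextPage` stub and element ids are invented stand-ins for a real feed API and page.

```typescript
// Sketch of an infinite feed: whenever the invisible sentinel at the bottom of the
// page scrolls into view, another page of posts is appended, so the feed never ends.

// Stub standing in for a real feed API: it happily returns fresh posts forever.
async function fetchNextPage(cursor: number): Promise<{ items: string[]; next: number }> {
  return {
    items: Array.from({ length: 10 }, (_, i) => `Post #${cursor + i}`),
    next: cursor + 10,
  };
}

let cursor = 0;
const feed = document.getElementById("feed")!;         // the list of posts
const sentinel = document.getElementById("sentinel")!; // empty <div> below the last post

const observer = new IntersectionObserver(async entries => {
  if (!entries[0].isIntersecting) return;
  const page = await fetchNextPage(cursor);
  cursor = page.next;
  for (const item of page.items) {
    const post = document.createElement("article");
    post.textContent = item;
    feed.appendChild(post);
  }
  // No "next page" button, no natural stopping point: the only action left is to keep scrolling.
});
observer.observe(sentinel);
```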
“Change comes not from the inside, but the outside. If you want people to lose weight, give them a smaller plate. You have to change the environment.” — Dan Ariely, Professor of Psychology at Duke University
This works in every aspect of life: Cornell University professor Brian Wansink demonstrated that individuals who ate soup out of self-refilling bowls ingested 73% more calories than those with normal bowls, without feeling any more satiated.
Artificial friction
… also known as “The power of defaults”
Set any option as the default, and less than 10% of users will ever change it.
From a Darwinian perspective, our minds have evolved to take as many shortcuts as possible in order to save both mental & physical energy. The direct consequence is that the easiest option is very often the one we’ll take.
In a technological world, the easiest option is the one where we don’t have to do anything: whatever is framed as the “default”. Conversely, if you want to prevent something from happening, just make it difficult!
A few examples:
- Want to increase the number of subscribers to your newsletter?
Automatically subscribe users on account creation.
- Want to gather analytics on your app’s usage?
Make it an opt-out hidden in the settings menu.
- Want to reduce churn?
Add a confirmation step to your cancellation process.
- Want to reduce the volume of support requests?
Force users to open your FAQ first, and if they still insist on talking to someone, ask them to book a chat slot in advance.
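In code, these choices often boil down to a handful of booleans. The settings object below is hypothetical, but it shows how cheap the trick is.

```typescript
// Hypothetical signup settings: every engagement-friendly option is on unless
// the user goes digging for it, because most users never change a default.
interface AccountSettings {
  newsletterSubscribed: boolean;
  usageAnalyticsEnabled: boolean;
  pushNotificationsEnabled: boolean;
}

function defaultSettings(): AccountSettings {
  return {
    newsletterSubscribed: true,     // opt-out, not opt-in
    usageAnalyticsEnabled: true,    // buried in a settings sub-menu
    pushNotificationsEnabled: true, // requested at the most convenient moment
  };
}

console.log(defaultSettings());
```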
This is one of the most powerful design tricks ever: easy to implement and hugely impactful. Do you know why 99.9% of French citizens are organ donors whereas only 4.25% of Danes are? In France, consent for organ donation is presumed (the default), whereas it has to be explicit in Denmark (opt-in).
Read receipts, typing & activity indicators
… also known as “The social ambush”
WhatsApp’s double blue tick, iMessage’s “Read”, Facebook’s “Seen”: today, most messaging platforms have read receipts and typing indicators activated by default. This has a triple effect on day-to-day conversations:
- Increased answer probability.
Because your friend knows precisely if and when you read their message, you’re more or less forced to answer.
- Greater pressure to answer quickly.
In an ever-connected world, a delay in response is usually read as rudeness, a brush-off, or a sign of irritation. If you don’t want to feel bad or make people sad, you often have no choice but to answer fast.
- Increased length of conversations.
When both participants know the other is online, every answer becomes an implied commitment to a 5–10 minute dialogue.
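Under the hood, this is nothing more than extra status events attached to each message. The model below is a hypothetical sketch, not any platform’s actual protocol.

```typescript
// Hypothetical message-status model: the sender is notified at every step,
// which is exactly what creates the social pressure described above.
type MessageStatus = "sent" | "delivered" | "read";

interface Message {
  id: string;
  body: string;
  status: MessageStatus;
}

// Marking a message as read also tells the sender that you read it, and when.
function markAsRead(message: Message, notifySender: (event: string) => void): Message {
  notifySender(`read:${message.id}:${new Date().toISOString()}`);
  return { ...message, status: "read" };
}

const msg: Message = { id: "m1", body: "Are you coming tonight?", status: "delivered" };
markAsRead(msg, event => console.log(event)); // the sender now expects an answer
```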
That’s precisely why social networks such as Snapchat & Facebook don’t allow you to turn read receipts off: each message becomes a legitimate & recurrent opportunity to suck you back into their never-ending content vortex.
Combined with push notifications, I feel that read receipts are the design innovation that has had the biggest effect on our generation’s attention span. Staying focused while being constantly interrupted by messages you’re socially committed to answering is a daunting task.
“If you’re truly about it you put your read receipts on” — Matt Farias (@m_farias9), February 21, 2018
The debate is real.
It’s not that you’re not interesting, or that I’m not grateful for your invitation, or that your jokes aren’t funny: it’s just that I’m in the middle of a movie, or trying to finish a book, or writing an (amazing?) article.
Being able to regularly isolate yourself from the distractions of the world is a precious skill that should be nurtured: I can’t think of any meaningful work I’ve ever produced without multiple, long strings of calm and concentration. Even without any “productivity” drive, people tend to feel stressed when presented with too many choices.
As chat has replaced email, which has replaced postal mail, we’re increasingly shifting from asynchronous conversation to live messaging, and slowly losing our breathing space in the process…
Acknowledge your mind’s weaknesses
… and modify your environment accordingly.
I feel that civil society is increasingly aware of the influence tech products exert on our behavior: initiatives such as Time Well Spent and the National Day of Unplugging are doing a great job of raising awareness on this topic.
“You can’t get people to do something they don’t want to do.” — B.J. Fogg, director of the Stanford Persuasive Technology Lab
This is critical: the level of profit and data accumulated by tech giants is unprecedented. As each day passes, they’re getting better and better at analysing our mind’s weaknesses and designing experiences meant to exploit them. Without public awareness and legal regulation, it’ll be increasingly difficult for new generations to exercise their free will, as an ever increasing amount of mental energy will be required to focus.
Next time you unlock your phone, take a step back and ask yourself: am I making a conscious choice?
This article was originally published on Georges' Medium page.