We do it without thinking. On the train to work, or at home on the sofa. Sometimes even under the table at a meeting. For some of us, it’s part of the job.
But if you think scrolling through your Facebook or Twitter feed is no big deal, take a moment to imagine the offline equivalent:
The posters on your train to work keep changing in line with the items you purchased last week.
An anonymous colleague keeps dropping articles on your desk. You have no idea how they know about your interest in personality tests and vegan cooking.
Political canvassers representing views just a little more extreme than yours randomly appear when you’re in your favourite thrift store, or browsing through magazines at the newsagent.
We all know algorithms aren’t actual people. But they might as well be, because the information they gather is sold to real companies, which use it to make you buy stuff and influence the way you think.
Social media is a dream come true for advertisers. Rather than hoping a random billboard might grab your attention, they simply buy the right to find you by following your online cookie trail.
And as much as we like to consider ourselves free-thinking, independent individuals, big data and advances in statistics have made our behaviour eerily easy to predict.
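To make that concrete, here is a toy sketch of just how little machinery behavioural prediction takes. Everything in it is invented for illustration – the feature names, the data and the model choice have nothing to do with any real ad platform, which would use far richer signals at vastly larger scale – but the principle is the same: past clicks in, predicted purchase out.

```python
# A minimal, hypothetical sketch of behavioural prediction.
# All feature names and data are made up for illustration;
# real ad platforms use far richer signals at far larger scale.
from sklearn.linear_model import LogisticRegression

# Each row: [visited_vegan_recipes, took_personality_quiz, clicked_thrift_ad]
browsing_history = [
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
    [0, 0, 1],
    [1, 1, 1],
    [0, 0, 0],
]
# Label: did this person go on to buy the cookbook being advertised?
bought_cookbook = [1, 1, 0, 0, 1, 0]

model = LogisticRegression().fit(browsing_history, bought_cookbook)

# A new visitor who reads vegan recipes and takes personality quizzes:
new_visitor = [[1, 1, 0]]
print(model.predict_proba(new_visitor)[0][1])  # estimated purchase probability
```

Six rows of invented data and three lines of modelling code. The point is not the quality of this particular toy; it’s how low the bar is.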
This cringeworthy video, designed to celebrate big data, implies that its main advantages are helping us remember our friends’ birthdays, choose clothes and source free cupcakes.
Of course, there are enormous real benefits of big data – like helping to make roads safer, improve healthcare and stop the spread of disease.
But its commercial use presents a moral dilemma.
As we know from economic psychology, people generally base their decisions on the information most readily available to them. It’s called the availability heuristic, and, as common sense would suggest, it means that we often act on the first thing that comes into our head.
The first thing that comes into our head is usually the information we’ve been exposed to over and over again.
Articles in our timeline which reinforce our existing worldview.
Photographs similar to ones we’ve reacted strongly to in the past.
Groups of people we’ve associated with before.
In other words, a repetition of who we are and the experiences we’ve already had. The sketch below shows how quickly that loop closes.
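Here is a deliberately simplified simulation of that feedback loop, with made-up topics and click probabilities. It is an illustration of the dynamic, not any platform’s actual ranking code: a feed ranked by past engagement keeps narrowing toward whatever we’ve already clicked on.

```python
import random

# Hypothetical topics a feed could draw from.
TOPICS = ["politics", "vegan cooking", "personality tests", "sport", "science"]

# Start with no history: every topic is equally likely to be shown.
affinity = {topic: 1.0 for topic in TOPICS}

def next_item():
    # Rank by affinity: topics we've engaged with before surface more often.
    topics, weights = zip(*affinity.items())
    return random.choices(topics, weights=weights)[0]

def simulate(steps=1000, favourite="vegan cooking"):
    # Suppose the user clicks their favourite topic 80% of the time
    # and anything else 20% of the time (invented numbers).
    # Each click raises that topic's affinity, so it gets shown more.
    for _ in range(steps):
        shown = next_item()
        clicked = random.random() < (0.8 if shown == favourite else 0.2)
        if clicked:
            affinity[shown] += 1.0

random.seed(42)
simulate()
total = sum(affinity.values())
for topic, score in sorted(affinity.items(), key=lambda kv: -kv[1]):
    print(f"{topic:>18}: {score / total:.0%} of the feed")
```

Run it and the favourite topic ends up occupying the bulk of the feed. The ranking never lies; it just keeps serving more of what it has already seen us choose.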
The use of mass data sets coupled with algorithms adds a new dimension to this repetition, without us even realising it.
An essay by WIRED editor David Rowan, which appears in a book titled “What Should We Be Worried About?”, opens:
“In a big-data world, it takes an exponentially rising curve of statistics to bring home just how subjugated we now are to the data crunchers’ powers.” He goes on to lament:
“Any citizen lacking a basic understanding of, and at least minimal access to, the new algorithmic tools will increasingly be disadvantaged in vast areas of economic, political, and social participation.”
The problem with cookie-led advertising and links generated by algorithms is that they are covert. There is no one to hold accountable for them.
There is no guy with a roller pasting an ad to the wall of an underground station.
If we open up a print newspaper, we know that every other reader is going to see the same advertisement for a high-power vacuum cleaner on page 6.
We can also reason that another publication might instead be trying to sell readers wellness retreats or flat-screen televisions.
Individually tailored timelines remove that certainty and erode our biggest antidote to advertising: collective cynicism.
Since the links and advertisements we see change from one second to the next, it’s impossible to develop a coherent account of what we’re being shown, let alone construct a shared picture of what’s happening.
So, what’s the solution?
Sure, we could quit social media altogether.
But it would be foolish to throw the baby out with the bathwater.
What we really need to do is be aware of the extent to which our worldview is being shaped by the things people pay for us to be exposed to online.
That is the first step towards reclaiming our capacity for independent thought.