Science

Social media & memory

Facebook decides what we forget


Social media doesn’t only wield power over the present. It also keeps a record of the past. And that means it decides what we forget. ‘Our memories are being automated’, Rik Smit warns.
By Christien Boomsma / Translation by Sarah van Steenderen / Photo Jon Gurinsky

Six months ago, they suddenly disappeared: approximately two thousand videos had been removed from YouTube. Not because they were of a pornographic nature, or because they violated copyright. No, the videos had been flagged for violence and hatemongering. The problem was that they had been uploaded by Syrian opposition forces to share footage of the aftermath of air strikes and of soldiers executing prisoners. In many cases, the footage concerned war crimes.

Rik Smit, assistant professor of media studies, says this happens all the time. A relatively small change in the algorithm the company uses on the millions of videos that are uploaded every day can cause everything to be removed from the largest video archive in the world.

This is alarming. After all, YouTube can’t be held accountable. It’s a commercial enterprise and can do whatever it wants. Nevertheless, it is responsible for managing a large part of our collective memory.

In motion

Smit is interested in this memory. Or, to put it more clearly: he wants to know how we remember things and how social media influences what we remember. ‘Just like history, our memory is always in motion’, he says. ‘It’s not a stable entity. Memory is an experience that is transported and stored over time. It’s entangled with history and emotion.’ Next week, he is getting his PhD for his research.

Consider the photo album that grandma takes out on her eightieth birthday to show Aunt May and Uncle Ben. Or the museum you visit with friends to see an exhibition on the Second World War. These recorded histories determine our experience of the past.

Whenever a new kind of technology is added, people start this process of moral panic

These days, social media also plays a role in our memories. Media sites can show you endless footage of Syrian civil rights movements. Or a picture you uploaded four years ago: ‘This day, four years ago’. ‘But these types of media also select things for you. They play an active role in memories’, says Smit.

Moral panic

He’s not trying to sound defeatist about the existence of Facebook, YouTube, Wikipedia, and the like. ‘Whenever a new kind of technology is added, people start this process of moral panic. This dates back to when the printing press was invented. I’m trying to showcase the nuances.’

Because just like the history books, or the statue commemorating those who fell during World War Two, or your grandma’s photo album, social media determines how we remember things. But Smit wants to figure out how this works.

To that end, he studied three clear cases. On YouTube, he focused on eyewitness videos of the poisonous gas attacks in the Syrian town of Ghouta in 2013 – Obama’s ‘red line’. On Facebook, he studied the page ‘Justice for Mike Brown’ – the unarmed 18-year-old African-American who was shot by police in Ferguson. Finally, Smit studied the creation of the Wikipedia page on the MH17 airplane crash.

In common

All three platforms have one thing in common. Smit observed that they are all controlled by a small handful of people who know the rules of the game and can therefore wield disproportionate influence over the story that millions of people will remember.

In reality, a very small group of people decides what is actually posted in Wikipedia

Take the Wikipedia page on the MH17 crash. ‘I used to be a fan of Wikipedia’, says Smit. ‘It’s a great initiative. It’s the greatest distributor of knowledge right now, and anyone can edit it. But in reality, a very small group of people decides what is actually posted.’

He studied the talk pages behind every entry on Wikipedia. For the MH17 page, that meant poring over approximately six hundred printed sheets of paper. ‘It starts off quite simply. Someone creates a page and people start writing’, says Smit. ‘But then, an increasing number of Wikipedians start getting involved. It’s not factual enough, it’s too emotional, there are too many references to Russian sources, etc.’

Extremely suspicious

The discussion in these talk pages is focused on the fact that only verifiable, neutral sources are allowed. And on the surface, there’s nothing wrong with that. ‘But as soon as someone references an alternative source, something other than CNN or the New York Times, people get extremely suspicious. This then leads to a Western, Northern European perspective – with no room for any alternative views’, says Smit.

He was also taken aback by the power that some Wikipedians wield. Behind the scenes of this neutral platform, there is a clear hierarchy. At the top are the bureaucrats, who can prevent other Wikipedians from editing articles. Next is a committee of more bureaucrats that decides who is allowed to join their ranks. ‘People may think that Wikipedia is neutral, but the website has its own internal politics.’

YouTube is also much less open than Smit initially thought. From the bombings in Paris to Britney Spears falling off stage, people film en masse and literally anyone can upload a video. And yet – only a small fraction of the videos uploaded are easily accessible.

Eyewitnesses

Smit studied eyewitness footage from the poisonous gas attacks in Ghouta. Only one third of these videos were readily available. Interestingly, the ones he could find hadn’t been uploaded by eyewitnesses, but by the large mainstream media. ‘And then there are the posts by web native media, groups who run their own YouTube pages.’

Whenever something big happens, these companies search through YouTube looking for footage. They then edit that footage and craft it into a story that suits their own agendas. In this way, story after story is altered and filtered.

Footage also gets lost. Sometimes users take it offline themselves, but changes to YouTube algorithms are responsible as well. ‘Of the thirty-five most-watched videos from 2013, a third have been deleted’, says Smit. ‘It would be interesting to study what material is left, in the end.’

Each click, each like and share contributes to a post’s visibility

Finally, Facebook has the ultimate filter bubble. The page ‘Justice for Mike Brown’, which was created in 2014, quickly gained thousands of followers. Almost immediately, memory became central to the page, Smit says. ‘Who was he? What happened? How do we make sure this is never forgotten? How does this relate to the overall history of black Americans? Those were the biggest questions.’

Emotion

In this particular case, however, everything was about emotion, says Smit. The image of Mike Brown’s stepfather holding a sign asking for justice became iconic, as did a former teacher’s account of what a sweet boy he had been. ‘Each click, each like and share, contributes to a post’s visibility’, says Smit. And that ultimately decides which version of the story finally remains. ‘That story is a highly selective one.’

Selective memory is not necessarily a bad thing. Grandma can remove photos from her album. Or the whole album could go up in flames during a fire. But Smit says there is a difference. ‘The photo album is a physical object. But YouTube and Facebook make use of mnemotechnology. They are systems that represent things for us. Our memories are being automated.’

Smit may not want to be defeatist, but he does think this is something we should be aware of. ‘It concerns our public history. Each time we upload something, we trust YouTube or Facebook with our memories. And they don’t have to assume responsibility for anything.’

Smit would like to see a change in this. ‘Maybe we should create a public agency that sets the rules, is held accountable, and thoroughly considers what we should conserve.’
