
Year in Review: ‘We should be paying more attention to the algorithmic distortion’

Lam Thuy Vo, senior reporter for BuzzFeed News, shares her thoughts on 2019 and the intersection of technology, society and social media.

Mis- and disinformation moved up the news agenda over the last 12 months as researchers, journalists and the public faced unprecedented problems navigating the online news ecosystem. Information Disorder: Year in Review hears from experts around the world on what to expect and how to be more resilient in 2020.

Lam Thuy Vo is a senior reporter for BuzzFeed News where her area of expertise is the intersection of technology, society and social media data. She recently published Mining Social Media, a book on finding stories in internet data. She is based in New York.

First Draft: What was the biggest development for you in terms of disinformation and media manipulation this year? 

Lam Thuy Vo: All the social media companies have become acutely aware of the issues around election integrity, probably because of what happened during the 2016 US election.

Twitter has recently said that they will not allow for political adverts. Facebook introduced the political adverts archive. I struggle with the idea of how effective it is to cordon off something political versus something that is just on the internet and polarises.

Politics has become much more personal. It is expressed through fandom practices but also through very personal interactions between a meme-able politician and you. It is less abstract and a lot more intimate. We need to understand that political ads are not just ads that are sponsored by politicians but also advertisements that go deep into identity politics.

“We should be paying more attention to the algorithmic distortion that goes hand in hand with the decline of local media” – Lam Thuy Vo

If you look at any of the things that the IRA [the St Petersburg-based Internet Research Agency which seeded much disinformation] posted during the 2016 election cycle, it wasn’t actually all that much “Vote for Bernie” or “Vote for Hillary” or “Vote for Donald Trump”. It was a lot more seeding of conflict and seeding of information that would drive people apart emotionally, identity wise. 

What is the biggest threat journalists in your part of the world are facing in 2020 in terms of information disorder?

We should be paying more attention to the algorithmic distortion that goes hand in hand with the decline of local media.

My friend Cathy Deng calls it ‘news inequality’ or ‘attention inequality’. It’s not just ‘fake’ versus ‘good’ news that is the problem in how we understand the current news cycle. It’s also about what we pay attention to and how.

Algorithms are fed data — likes, shares, other responses — that capture people’s emotional, knee-jerk reactions and, based on this and other data, they are likely going to stack our timelines with more of the similarly polarising information. Add to that the decline of local news and the need for a lot of national headlines which cater to outrage or give emotional responses to cut through the noise.
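The feedback loop Vo describes can be sketched in a few lines. This is a toy illustration, not any platform's actual ranking system: the post names, the scoring weights and the `engagement_score` function are all illustrative assumptions. The point is only that if ranking is driven purely by engagement signals, outrage-driven content reliably floats to the top.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    shares: int

def engagement_score(post: Post) -> float:
    # Weighting shares more heavily than likes is an illustrative
    # assumption, standing in for whatever signals a real feed uses.
    return post.likes + 3 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Rank purely by engagement: no notion of accuracy or nuance.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    Post("Nuanced local budget analysis", likes=40, shares=2),
    Post("Outrage headline", likes=900, shares=400),
    Post("Explainer on zoning policy", likes=120, shares=10),
]

feed = rank_feed(posts)
# The outrage post lands on top; the local reporting sinks to the bottom.
```

Because nothing in the score rewards nuance, each cycle of likes and shares feeds back into the ranking, which is the "more of the same" dynamic described above.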

So it’s not that nuanced and informative news doesn’t exist; it just doesn’t get the kind of attention and time it needs in a rapidly moving, algorithmically fuelled news environment.

And so when we are looking at threats to journalists in 2020, we need to find a way to re-balance our media consumption.

A nutrition pyramid is a good way of understanding a balanced media diet: we are eating all of the cookies and not balancing it out with the less pressing, the less glitzy, the less emotionally reactionary, the less — for lack of a better word — sexy topics because most of what is served to us is delectable, outrageous garbage.

And so if algorithms give us mostly articles that garner the most immediate and extreme reactions, how do we discover information that provokes less extreme emotions and instead makes us think more deeply? How do we shift attention to stories that are unsexy, ugly and complicated but important? 

What tools, websites or platforms must journalists know about going into 2020?

Social media is often used to mean the ‘general public’ but that’s not the case. Different platforms are popular with unique groups of people and even within those platforms, people will coalesce around subjects and interests in hyper-specific ways.

Pew Research Center conducts a survey every year to figure out which social media platforms people use the most. That’s a good guideline to start from. On top of that, each platform caters to a different part of our identity. If you want to understand teens and first-time voters, Instagram may be a good point of reference. Geography needs to be considered as well: New Yorkers may use social media differently than people in Waco, Texas.

“You need to treat every community on every platform as its own beat, as its own neighbourhood” – Lam Thuy Vo

There is no way for me to say that 4chan is exclusively full of garbage, or that Reddit is just full of trolls. Those are the anecdotes that we peddle in general media but it’s more complicated than that. People may find a lot of utility and support in subreddits about skincare and then read polarising news from hyper-partisan groups on other subreddits.

People need to approach platforms and the communities within them as cordoned off small universes that attract people with specific interests and/or of specific demographics.

Each universe has its own cadence, humour, rituals and rules. You need to treat every community on every platform as its own beat, as its own neighbourhood. That is maybe the most helpful guideline that I can give to reporters in light of the 2020 campaign. 

What was the biggest story of the year when it comes to these issues of technology and society?

There has been a general shift in understanding how misinformation gets peddled. The platforms have definitely been more proactive about signalling that they have changed parts of their infrastructure ahead of the upcoming election. They know that this is going to be a big deal, but it is also important to discern that it’s not just malice and intentional spread of disinformation — the idea of ‘fake news’ spread by people who are purposefully making up false stories — that we need to worry about. There’s the peddling of disinformation, conspiracy theories and opinion-laced articles by important political figures, from online influencers to actual politicians.

People are now living in different realities or segregated information universes, if you will. We are now seeing the existence of two different legitimised ways in which to see reality. The New York Times is being constantly referred to as ‘fake news’ by the President. We are not making decisions based on the same set of facts and interpretations anymore. We are seeing very skewed ways of cherry-picking and then politicising facts on one side.

It’s one thing to be like ‘this is one outright lie’. Like someone made up a story about the Pope endorsing Donald Trump. It is another to say ‘here is something that is rooted in fact and happened, and here is all of the politics that we are going to spin around it’. 

When it comes to disinformation in your country, what do you wish more people knew?

People are always looking for technology-based solutions to misinformation. Technology is much more an amplifier: something that takes existing divisions, existing notions, existing impulses that humans have, distorts them and blows them out of proportion.

When it comes to disinformation in the US, I remember a campaign that I did with teenagers in Alaska in 2017. We had a short workshop and our final conclusion was summed up in one phrase: “I wish people would just take an extra minute before they shared something” or WAM (wait a minute!). My dream would be to embed an extension in every app on social media that delays the ‘Like’ and ‘Share’ buttons, just to delay the instant gratification from an emotional reaction and throw some critical thinking into the mix.
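The WAM idea — forcing a pause between the impulse to share and the share itself — can be sketched as a simple cooldown gate. This is a hedged illustration of the concept, not a real extension: the `ShareButton` class, its method names and the 60-second default are all invented for the example.

```python
import time
from typing import Optional

COOLDOWN_SECONDS = 60  # "wait a minute" — an illustrative default

class ShareButton:
    """Toy sketch of a share button that enforces a cooling-off period.

    The first tap starts a timer instead of sharing; the share only
    goes through once the cooldown has elapsed since that first tap.
    """

    def __init__(self, cooldown: float = COOLDOWN_SECONDS):
        self.cooldown = cooldown
        self.first_tap: Optional[float] = None

    def tap(self, now: Optional[float] = None) -> bool:
        """Return True only once the cooldown since the first tap has passed."""
        now = time.monotonic() if now is None else now
        if self.first_tap is None:
            self.first_tap = now
            return False  # ask the user to wait a minute
        return now - self.first_tap >= self.cooldown
```

The `now` parameter exists so the behaviour can be exercised without actually waiting; a real implementation would simply use the clock. The design choice is the one Vo describes: nothing is blocked outright, the instant gratification is merely delayed.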

The solution to disinformation is not technologically based. It is about a fundamental literacy around consuming information, and about critical thinking being overshadowed by emotional reactions.

This is highly ideological but our audience may have stopped looking at different categorisations of information and may not be distinguishing between what is propaganda, what is news, what is a primary source, all of these things that organisations like the News Literacy Project teach. 

If you had access to any form of platform data, what would you want to know?

I would love to know the scope of data that Facebook has on any given user and how their ‘scoring system’ works. How is the data that is collected through social media being added to and merged with other personal data that the company buys from other sources? How is it interpreted within the context of ad targeting? I would love to figure out a way to break into that.

“Algorithms wilfully ignore serendipity: the possibility of change, of growth of a person into a completely different direction” – Lam Thuy Vo

There has been a big discussion in the civic tech space on algorithmic accountability and algorithmic discrimination during the last year. The data that we use to inform and feed and train algorithms is ultimately going to make the algorithm detect old patterns and then cement old patterns into a decision-making system that is based on more of the old. Algorithms wilfully ignore serendipity: the possibility of change, of growth of a person into a completely different direction. That sucks.

I would want to see what data recipe, decision trees and classifications go into the Facebook machine, and how they target users in certain ways based on their behaviour. I am a person who loves serendipity. I wouldn’t be in journalism if I didn’t want to learn and to be convinced otherwise.

The fact that we are being pushed towards an information universe on social media that gives us more of the same, cements us in our values and actually makes us into old crotchety people who don’t want to change is a really big loss to humanity and how it can grow.

This interview was lightly edited and condensed for clarity.
