Mis- and disinformation moved up the news agenda over the last 12 months as researchers, journalists and the public faced unprecedented problems navigating the online news ecosystem. Information Disorder: Year in Review hears from experts around the world on what to expect and how to be more resilient in 2020.
Kevin Nguyen is a digital forensics reporter and producer with the Australian Broadcasting Corporation (ABC). He is an expert in digital verification based in Sydney, New South Wales.
First Draft: What was the biggest development for you in terms of disinformation and media manipulation this year?
Kevin Nguyen: Facebook has been the primary offender and facilitator of disinformation this year. Their refusal to fact check their targeted advertising, and their subsequent doubling down on that refusal, is one of the biggest threats not just to civil discourse but to our collective understanding of reality.
Another egregious transgression from Facebook was the discreet and sudden removal of a number of transparency tools, including Graph Search. While there were definitely arguments for its removal, it was done without consultation — entire life-saving projects were shut down overnight.
The tool was used to unearth agents of disinformation and track their reach, it was used to assist in gathering and verifying eyewitness accounts of war crimes and human trafficking, and it was used as a secondary digital fact-checking measure. Facebook remains the least transparent platform in relation to its user base and none of the sweeping changes they made in the past year appear to have hindered their ability to harvest user data.
What is the biggest threat journalists in your part of the world are facing in 2020 in terms of information disorder?
The largest demographics, particularly single-issue voters, are also those with the least media literacy and therefore the most susceptible to misinformation efforts. This is highly problematic. It doesn’t matter how effective we are if the section of our readership that most needs to hear us isn’t willing to engage with us or doesn’t trust us.
Since 2016, I have trained literally hundreds of journalists globally in verification and fact checking, and I’m confident the skill floor has risen across the industry. But we’re not getting better at stemming the tide of misinformation, and that’s partly because we haven’t yet been able to pass that education on to our readers.
What tools, websites or platforms must journalists know about going into 2020?
Machine-learning and algorithmic approaches to journalism. They’re not just time-savers: there is now so much data that a content scrape, a visualisation and an undergraduate-level understanding of statistics can generate very compelling and very important stories. If there is one programming language you have to learn, it should be SQL.
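The scrape-then-query workflow described here can be sketched in miniature with Python's built-in sqlite3 module. The table and the data below are entirely hypothetical, just to illustrate the kind of one-line SQL aggregation that can surface a story lead:

```python
import sqlite3

# Hypothetical example: a small table of scraped posts, queried with SQL
# to surface which domains are shared most often — the kind of simple
# aggregation that can seed a data-driven story.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER, domain TEXT, shares INTEGER)")
conn.executemany(
    "INSERT INTO posts VALUES (?, ?, ?)",
    [
        (1, "example-news.com", 120),
        (2, "fringe-site.net", 950),
        (3, "example-news.com", 80),
        (4, "fringe-site.net", 40),
    ],
)

# Total shares per domain, highest first
rows = conn.execute(
    "SELECT domain, SUM(shares) AS total "
    "FROM posts GROUP BY domain ORDER BY total DESC"
).fetchall()
for domain, total in rows:
    print(domain, total)
```

With real scraped data, the same GROUP BY pattern scales to millions of rows, which is the point being made: basic SQL plus basic statistics goes a long way.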
What was the biggest story of the year when it comes to these issues of technology and society?
De-platforming and de-hosting fringe and hate sites, particularly in the fallout of several high-profile mass shootings in the first half of the year. Attention, recognition and notoriety are primary motivators for many mass shooters, and companies are now rejecting hate speech as an entitlement and, more importantly, accepting that hate speech has consequences — it’s not just words, it’s a kind of virus which cultivates and multiplies in closed environments.
Also, once again, the stories around Facebook and political ads.
If you had access to any form of platform data, what would you want to know?
Demographic data is the holy grail — it’s why politicians and corporations are willing to splash millions on what is effectively a never-ending spreadsheet. Once we understand how information is targeted and served to a user, and how a user is broken down into different data points, we would essentially have everything we need to launch effective counter-misinformation efforts and interference.
“It is not actually in the interest of the government to enforce privacy measures on social media” – Kevin Nguyen
At the moment we have to treat it as a black box and make inferences based on its inputs and outputs. For example, with YouTube’s algorithms we can ascertain that white-nationalist content is served primarily to those who watch certain internet personalities, and we learnt this by creating a number of shell accounts and having them behave, or consume content, in a specific way. Imagine if we could know the top 20 variables which lead to a 16-year-old being recommended #GamerGate content. We could proactively fact check, we could map a network of misinformation and the ignition points for certain tropes, and we could predict the purpose and desired outcomes of political messaging.
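The black-box method described above — controlled inputs, observed outputs — can be sketched as a toy experiment. Everything here is hypothetical: the recommender function stands in for a real platform algorithm we cannot inspect, and the "accounts" are just lists of watch history differing in one variable:

```python
# Toy sketch of black-box probing. We cannot see inside the recommender,
# so we feed it controlled watch histories ("shell accounts") and compare
# what it serves back. The recommender below is entirely hypothetical.
def opaque_recommender(watch_history):
    # Stand-in for a platform algorithm observable only from outside.
    if "personality_x" in watch_history:
        return "fringe_content"
    return "mainstream_content"

# Two shell accounts whose histories differ in a single variable
control = ["news", "music"]
probe = ["news", "music", "personality_x"]

results = {
    "control": opaque_recommender(control),
    "probe": opaque_recommender(probe),
}

# If the outputs diverge, the one variable we changed is the likely driver
if results["control"] != results["probe"]:
    print("watching personality_x appears to trigger:", results["probe"])
```

Scaled up to many accounts and many variables, this is essentially an ablation study run against a system whose internals are hidden, which is why researchers want the variables directly rather than having to infer them.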
When it comes to disinformation in your country, what do you wish more people knew?
That it is not actually in the interest of the government to enforce privacy measures on social media. The relationship between the social media giants and law enforcement, in Australia and elsewhere, is complicated, and the closed-door conversations between these companies and police and intelligence agencies are often detrimental to the user. These agencies have at their fingertips what is essentially an extensive spying apparatus, and it would not be in their best interest to see the likes of Facebook and Twitter gone.
This interview was lightly edited and condensed for clarity.