As in other countries, the specter of hoaxes surrounding the US and European elections raised concerns among Japanese journalists about the spread of false and misleading rumors. Japan’s lower house election in late October provided the first opportunity to gauge the scale of the problem in the country.
As soon as Prime Minister Shinzo Abe dissolved the House of Representatives on Sept. 28, the Japan Center of Education for Journalists (JCEJ) and a research team from Hosei University launched a verification project together with journalists from 19 different media organizations. During the election campaign, the project reviewed dozens of questionable social media posts, listing debunked items on the JCEJ blog.
Apart from the JCEJ project, FactCheck Initiative Japan, which was established in June, also verified media reports and remarks by politicians related to the general election held on Oct. 22. Five media organizations, including BuzzFeed Japan, participated in the FIJ coalition.
We faced a number of challenges in putting together the project. Collaboration between rival media organizations is rare in Japan. Although many journalists showed interest in our initiative, and some organizations had previously cooperated on joint news platforms or events, competition made these organizations hesitant to share information and collaborate on coverage.
Social media monitoring also tends not to play an essential role in Japanese newsrooms and mainstream media outlets. In Japan, where daily newspapers and television remain popular sources of news, only 29 percent of the population uses social networks as a news source. This discouraged some newsrooms from participating in our project, especially because journalists would already be spread thin by typical election coverage.
For this reason, we had to put a lot of time and effort into conveying the importance of paying attention to misinformation and understanding its impact on voters. We also offered journalists the option of participating anonymously, given the difficulty some reporters would face in gaining official approval from their organizations. Ultimately, journalists from 19 national and regional media organizations, including newspapers, TV networks and online news outlets, decided to participate in our project part-time.
How the JCEJ project ran
Every day, JCEJ and a team of nine journalism students supervised by Hiroyuki Fujishiro, associate professor at Hosei University, monitored social media content for one hour, partly using tools like CrowdTangle and BuzzSumo. Some partner newsrooms provided insights about what online communities were discussing.
Our project mainly focused on monitoring content on Facebook and Twitter, the most popular social news sources in Japan after YouTube and Line. The project did not cover reports by traditional media sources or official remarks by politicians.
The collaborating journalists were sent daily emails listing questionable social media posts, with an average of 15 items per email. We collected a total of 275 items of social media content over the course of 27 days, but chose not to cover some stories to avoid giving added exposure to content that was not being widely shared.
The project’s journalists, who participated whenever they had time off from their organizations, were free to choose which items to debunk. Each journalist made debunking decisions independently, and we listed items verified by three or more participants on our blog.
What we saw during the campaign
More than 90 percent of the questionable stories we found came from Twitter. Facebook, by contrast, did not seem to play a large role in disseminating misleading or false information.
We did not encounter elaborately fabricated viral content such as the famous “Pope endorses Trump” story, nor doctored pictures or videos. Instead, we saw dubious tweets by individual, anonymous users that often cited reports by traditional media sources or used pictures in misleading ways. We also saw misleading headlines from right-leaning websites shared on Twitter.
The misinformation we saw included false claims and hate speech directed at non-Japanese residents relating to the election. Five members of our team debunked a pair of widely shared Twitter posts that claimed it is illegal in Japan for “foreigners” to take part in election campaigning. Although no such law exists, one of the posts had been retweeted nearly 1,800 times and received about 1,000 likes on Twitter as of Nov. 3.
Individual politicians were also frequent targets of false information. A story about an opposition party politician, which we identified as erroneous, was carried by Yahoo News Japan, the country’s biggest news site, and reached around 160,000 people on Facebook and Twitter as of Oct. 3, according to CrowdTangle.
One of the major lessons we learned from reader feedback was the need for greater transparency. Many of the project’s journalists participated anonymously and on a voluntary basis, while others represented their organizations. Because we wanted to encourage as many journalists and newsrooms as possible to take part in the quickly launched project, we did not publish a full list of participants.
In the future, we plan to encourage more organizations to participate openly in order to boost readers’ trust in the project. Providing a more detailed analysis of what is accurate and what is not in each piece of content would also make our debunks more convincing.
By the end of the project, we had published only five debunked items on our blog. That’s because many participating journalists found it difficult to determine whether an item was “fake,” and journalists sometimes disagreed over whether content could be labeled false. Another 33 items were believed to be false or misleading by at least one participating journalist, and six were identified as erroneous by at least two.
For future collaborations, we plan to provide participants with more training in verification skills and tools so we can publish more debunks. Setting standards and providing a list of basic steps for the verification process would also help participants work more efficiently and rigorously.
Our project was one of the first examples of collaboration between Japanese media organizations, and it encouraged broader public engagement with the issue of online mis- and disinformation. Though the project’s impact on our audience has yet to be analyzed, tweets about our debunks from the JCEJ Twitter account received over 540 retweets and 270 favorites. The project also raised awareness among journalists and voters that misinformation and disinformation exist on Japanese social media and need to be addressed.
We are eager to do more analysis on how our work was received by readers, and to pursue a better method of collaboration in the future.