
Research on CrossCheck journalists and readers suggests positive impact for project

In-depth interviews and surveys found that the collaboration was credible, and taught verification skills

In the unease that followed a U.S. election in which voters were bombarded with mis- and dis-information, First Draft pulled together 37 newsrooms, universities, nonprofits and tech companies to fight rumors and fabrications around what was widely perceived to be their next target: the 2017 French presidential election.

That project, which we called CrossCheck, was created with the idea that collaboration between journalists could improve the way they monitor and debunk misleading content, and build trust with audiences. And those assumptions—along with those embedded in the design of our debunks—seemed reasonable enough.

However, the fragility of our modern information ecosystems demands more than just good guesswork.

We at First Draft are increasingly concerned about how coverage of mis- and dis-information can fuel its spread, rather than stifle it. That’s why we’ve devoted ourselves to testing the ideas we bring to our work. And that’s why, when CrossCheck concluded in May, we hired a number of independent researchers to study its impact.

Today we release the results of dozens of in-depth interviews with CrossCheck journalists and readers, conducted by professors at the University of Toulouse and the University of Grenoble and an independent researcher.

To our excitement, the feedback from these interviews is overwhelmingly positive. CrossCheck appears to have gained the trust of a large and politically diverse audience. Participating journalists learned to be smarter about when and how to report on falsehoods, and journalists and readers alike learned valuable skills for assessing information and its sources.

Below is a more complete list of key findings. The full report can be downloaded here.

While we are pleased with the results of this study, we recognize that there is much more left to do. This research is only the beginning of the evaluative work we hope to put together: Dr. Lisa Fazio of Vanderbilt University is now finishing up experimental work evaluating the persuasive effect and memorability of CrossCheck’s debunks.

We will take what we learn from these findings, integrate them into subsequent projects and educational resources, and test a new list of hypotheses about our next major project—which is on its way.

Key findings:

Impact on newsrooms and journalists

  1. Participants agreed that debunking work should not be competitive. In fact, it should be considered a public service.
  2. Journalists who took part in the project reported learning new skills, which they continue to use in their newsrooms today.
  3. The process of working transparently, or having to ‘show your work’ to newsrooms that would otherwise be seen as competitors, resulted in higher quality journalism. Participants explained they were able to hold each other to account.
  4. Collective, editorial decision-making allowed otherwise competitive newsrooms to make joint decisions about what to report and what to strategically ignore, so as not to provide oxygen to rumors.
  5. Overall, the public’s contributions were useful and diverse, which provides an important (but hopefully unnecessary) reminder to include the audience in future journalism collaborations.

Impact on audiences

  1. Having multiple newsrooms collaborate on stories meant that respondents had increased levels of trust in the reporting. Respondents felt that CrossCheck was more independent, impartial and credible because it included so many outlets.
  2. Respondents noted that the technique of explaining how a rumor or piece of content was fact-checked or verified increased trust in the article, and also helped them to learn how to do this work themselves.
  3. In addition to learning critical reading skills, there is evidence that audience members learned to be wary of content with particularly emotional language or visuals.
  4. Respondents explained how they had shared CrossCheck stories and information, both online and off, with friends and family members who were sharing disinformation around the election.
  5. The inclusion of local outlets appears to be one reason the project reached people across the political spectrum; its perceived impartiality also helped it appeal to a wide audience.

Download the report here.