How We Identify Fact-Checkers

The Duke Reporters' Lab looks at many attributes to determine which organizations to add to its database of fact-checking projects around the world.

By Bill Adair & Mark Stencel – June 22, 2016

Updated: July 9, 2020

The database of global fact-checking sites is a project of the Reporters’ Lab at Duke University’s DeWitt Wallace Center for Media & Democracy.

The database and related map are managed by Mark Stencel, a journalism faculty member at Duke and co-director of the Lab, and Joel Luther, a researcher who also works on technology projects that extend the reach and impact of fact-checkers around the world. PolitiFact founder Bill Adair started the database in 2014 in his role as Knight Professor of the Practice of Journalism and Public Policy at Duke, where he also is director of the university’s journalism program.

The fact-checking database tracks several hundred non-partisan organizations around the world. These projects regularly publish articles, videos or audio reports that:

  • verify the accuracy of claims made by prominent public figures and institutions;
  • debunk rumors, hoaxes and other forms of misinformation that spread online;
  • or review the status of political promises made by candidates and political parties.

The Lab considers many attributes in determining which organizations to include, such as whether the site:

  • reviews statements by all parties and sides;
  • examines discrete claims and reaches conclusions;
  • transparently identifies its sources and explains its methods;
  • discloses its funding and affiliations.

We also consider whether the project’s primary mission is news and information. That’s clear when fact-checking projects are run by professional journalists, produced by news media organizations, or affiliated with academic journalism education programs. Other fact-checkers are typically associated with independent, non-governmental groups and think tanks that conduct non-partisan research and reporting focused on issues such as civic engagement, government transparency and public accountability.

The Reporters’ Lab’s criteria are similar to the International Fact-Checking Network’s Code of Principles, but the IFCN’s mission differs slightly from the Lab’s. Verified signatories of the IFCN’s code are typically the underlying organizations that produce fact-checks. That’s true even when those organizations produce multiple projects for different audiences. The IFCN also deploys independent assessors to verify that each signatory rigorously adheres to the principles, in part to help identify the organizations that meet the highest editorial and ethical standards.

The Reporters’ Lab casts a wider net. We identify individual fact-checking outlets, websites and programs — places where the public can find reliable fact-checking reports, even when those reports have distinctive brands, names or URLs. Examples include PolitiFact’s state and local media partnerships across the United States, as well as the country-specific pages and projects produced by multinational fact-checking organizations, such as Africa Check and Agence France-Presse, among others. While the Reporters’ Lab counts each major satellite, IFCN’s signatory list generally counts the overarching organization.

With the growth of collaborative fact-checking projects around the world, the Reporters’ Lab also aims to identify the contributing organizations that individually produce substantial fact-checking collections of their own. 

We sometimes include projects that present content in multiple languages, but mainly when there is original and distinctive content in each language (not just translations). 

Our database is updated regularly throughout the year and includes both active and inactive projects, which are noted and counted separately. We also try to update the status of organizations that do periodic fact-checking during key news events, such as an election or a legislative session. Projects that mainly do fact-checks during elections remain “active” in our database if they have track records of consistently doing this kind of reporting over multiple election cycles.

If you have additions, edits, updates or questions, please contact Mark Stencel or Joel Luther.