Press "Enter" to skip to content

Joel Luther

With half the planet going to the polls in 2024, fact-checking sputters

This year’s elections are a global convergence of ballots, checkboxes and thumbprints, with voters in more than 64 countries and all of the European Union heading to the polls. Based on a Time magazine estimate, this democratic spectacle could involve the equivalent of 49% of the planet’s population.

Just one problem: There aren’t enough referees.

A "Mostly False" from EUfactcheck
A “Mostly False” from EUfactcheck, a collaboration among journalism schools that is taking on disinformation during the June elections for the European Parliament.

For most of the past decade and a half, the global fact-checking community served that role, and experienced years of rapid growth. But based on the most recent counts from the Duke Reporters’ Lab, the number of reporting and research teams that routinely intercept political lies and disinformation is plateauing — exactly at the time when the world needs fact-checkers most.

Our latest count showed as many as 457 fact-checking projects in 111 countries were active over the past two years.

But so far in 2024, that number has shrunk to 439.

The run-up to June’s sprawling E.U. vote and this fall’s U.S. election campaign may nudge the numbers upward slightly. But a slight upswing is nothing like fact-checking’s rocket-like ascent over the past 15 years.

When PolitiFact won a Pulitzer Prize for its U.S. fact-checking in 2009, the award signaled the potential of this important new form of journalism. At that time, we counted 17 sites that consistently did similar work. 

By 2016, the year of the U.K.’s Brexit referendum and Donald J. Trump’s White House win, the count was 190 — an 11-fold increase.

Four years later, during the first year of the COVID-19 pandemic and Trump’s failed reelection campaign, the revised tally for 2020 more than doubled to 421.

Number of Active Fact-checkers Per Year
Note: Because the Reporters’ Lab identifies new fact-checkers and notes inactive ones, the year-by-year counts are regularly revised based on each site’s start and stop dates.

An abrupt shift

In the years since our Lab began producing this annual census in 2014, fact-checking has become an international enterprise — quickly growing the community on six of the seven continents. Until it didn’t.

In Asia, for instance, the numbers of fact-checking operations and bureaus jumped from 94 in 2019 to 124 in 2021 — a 32% increase. And much the same happened in Africa, where 32 fact-checkers in 2019 became 54 in 2021 — up 69%.

But the pace dramatically leveled off over the following years. From 2021 to 2023, the range in Asia’s fact-checking count was between 124 and 130. In Africa, the number hovered around 55.

Number of Fact-checkers by Continent.

The leveling off started earlier in other parts of the world. From 2020 to 2023, the range in Europe was between 120 and 135.

The number in the Americas decreased. From 2020 to 2023, South America went from 44 to 39 while North America went from 94 to 90.

The flatter Earth

Our conversations with fact-checkers indicate a few factors contributed to the flattening. 

Like every industry and institution, fact-checkers had to manage their teams through the pandemic. And in the early days of COVID-19, they also had to rejigger their organizations to cover the slow-motion disaster. That meant adding health reporting to beats that more typically focused on a mix of politics, hoaxes and digital scams.

Fact-checkers, like other journalists, also have struggled to raise money to pay their operating costs.

Atop those challenges, a certain amount of turnover was common in the fact-checking business. Teams in one place would decide to close up shop, often for financial reasons. At the same time, new fact-checkers would start their own ventures.

For most of the past decade, new fact-checking operations far outnumbered those that shut down, often by wide margins. In 2017, the ratio was 11-to-1, with 55 new fact-checkers and five that closed.

Then there was last year: 2023 was the first time there were more departing fact-checking teams than new ones — 18 closures to 10 starts.

Net new fact-checkers per year

Roughing the refs 

Perhaps the biggest challenge some fact-checking projects face has less to do with the fundamentals of running a newsroom or a research team. It’s about the vitriol that fact-checkers encounter from hostile governments and politicians, as well as their supporters.

From Bangladesh to Brazil, scurrilous attempts to undermine fact-checking are familiar tactics, especially for the reporters and researchers who work in countries where press freedom hardly exists.  

Based on data from Reporters Without Borders’ recently released World Press Freedom Index, 69 fact-checking organizations in the Reporters’ Lab list are based in 20 countries where press conditions are rated “very serious.” Starting from the most serious places, those fact-checkers operate in Syria (1), Afghanistan (2), Iran (1), China (5), Myanmar (2), Egypt (4), Iraq (2), Cuba (1), Belarus (1), Saudi Arabia (1), Bangladesh (5), Azerbaijan (1), India (26), Turkey (3), Venezuela (3), Yemen (2), Pakistan (2), Cambodia (1), Sri Lanka (5) and Sudan (1). The list from Reporters Without Borders also counts Palestine, which sits between Turkey and Venezuela in the Index and where there are at least three fact-checking teams.

Even in less dangerous places, politicians have enlisted companies and institutions to use economic pressure against fact-checkers.

The fact-checking community saw multiple examples in 2023. In South Korea, conservative politicians pressured the country’s leading search engine, Naver, to cancel its financial support for SNU FactCheck — a project of the Seoul National University Institute of Communications Research. Since 2017, SNU FactCheck’s director, EunRyung Chong, has worked with her country’s leading media organizations to showcase fact-checking reports using a shared system for rating questionable claims — especially the claims of politicians. Naver not only dropped its financial support, but it also took away SNU FactCheck’s access to the search company’s audience. Ultimately, the European Climate Foundation, a nonpartisan organization based in the Netherlands, stepped in to provide support that allowed SNU FactCheck and its media partners to continue their work — including coverage of last month’s National Assembly election.

In Australia, conservative politicians were more successful in derailing a years-long partnership between the country’s ABC News network and RMIT University. Egged on by the conservative outlets of News Corp Australia, lawmakers attacked the reporting of RMIT ABC Fact Check based on claims of bias and misuse of public funds. ABC has since announced its plans to sever the collaboration with the university and launch a new program aimed at misinformation.

Agence France-Presse is one of the largest fact-checking organizations, with teams posted in dozens of bureaus around the world. The French news agency lists most of the participating journalists — but not all. “While we endeavour to be as transparent as possible about our staff, some countries and environments are more hostile than others when it comes to journalism,” the list notes. “For that reason, some of the journalists on our… team will not be named below to protect their safety.”

[June 6, 2024: This article was updated to better characterize the reasons that led to the end of the RMIT ABC Fact Check partnership after seven years.]

About the Reporters’ Lab and its Census

The Duke Reporters’ Lab began tracking the international fact-checking community in 2014, when director Bill Adair organized a group of about 50 people who gathered in London for what became the first Global Fact meeting. Subsequent Global Facts led to the creation of the International Fact-Checking Network and its Code of Principles.

The Reporters’ Lab and the IFCN use similar criteria to keep track of fact-checkers, but use somewhat different methods and metrics. Here’s how we decide which fact-checkers to include in the Reporters’ Lab database and census reports. If you have questions, updates or additions, please contact Mark Stencel, Erica Ryan or Joel Luther.

Previous Fact-checking Census Reports

Note: The Reporters’ Lab regularly updates our counts as we identify and add new sites to our fact-checking database. As a result, numbers from earlier census reports differ from year to year.


9th Street Journal to use AI to generate local public service journalism


The 9th Street Journal, a local news publication produced by students in the DeWitt Wallace Center for Media & Democracy, has begun using artificial intelligence to fill gaps in local journalism.

Called The 10th Street Journal, the new project uses AI to generate stories from news releases and other reliable sources. As local newsrooms have downsized, service journalism stories — articles about road construction, event announcements and trash pickups — have often been eliminated.

Using AI allows quicker, more efficient news production. Every day, The 10th Street Journal will publish a few short stories with news from Durham, such as:

  • Government actions affecting daily life, such as road closures, airport updates and activities in local parks.
  • Announcements of upcoming government board and commission meetings.
  • Community events, including festivals and celebrations.

Each story will be reviewed by a human editor for content, accuracy and style before it is published. Stories will carry the 10th Street byline and a disclaimer indicating they were created using AI.
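The workflow is simple enough to sketch in code. The sketch below is ours, not the project’s actual implementation; `generate_text` is a hypothetical stand-in for whatever text-generation API the newsroom uses, and the prompt wording is invented.

```python
# Hypothetical sketch of an AI-assisted service-journalism workflow:
# draft a short story from a news release, then hold it for human review.
# `generate_text` stands in for a real text-generation API call.

from dataclasses import dataclass
from typing import Callable

PROMPT = (
    "Write a brief, neutral news story for Durham readers based only on "
    "the facts in this news release. Do not add information:\n\n{release}"
)

@dataclass
class Draft:
    body: str
    byline: str = "The 10th Street Journal"
    disclaimer: str = "This story was created using AI and reviewed by an editor."
    approved: bool = False  # set by a human editor, never automatically

def draft_story(release_text: str, generate_text: Callable[[str], str]) -> Draft:
    """Produce an unpublished draft for editor review (content, accuracy, style)."""
    return Draft(body=generate_text(PROMPT.format(release=release_text)))
```

Keeping `approved` false until a human editor flips it mirrors the review step described above.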

Bill Adair, editor of The 9th Street Journal, and Alison Jones, managing editor, recognize the uncertainties of AI in journalism and acknowledge the need for a human touch.

“But we also believe it can be harnessed to fill a void in local journalism,” they wrote. “We are excited to be on the leading edge of this new effort.”


Misinformation spreads, but fact-checking has leveled off

While much of the world’s news media has struggled to find solid footing in the digital age, the number of fact-checking outlets reliably rocketed upward for years — from a mere 11 sites in 2008 to 424 in 2022.

But the long strides of the past decade and a half have slowed to a more trudging pace, despite increasing concerns around the world about the impact of manipulated media, political lies and other forms of dangerous hoaxes and rumors.

In our 10th annual fact-checking census, the Duke Reporters’ Lab counts 417 fact-checkers that are active so far in 2023, verifying and debunking misinformation in more than 100 countries and 69 languages.

While the count of fact-checkers routinely fluctuates, the current number is roughly the same as it was in 2022 and 2021.

In more blunt terms: Fact-checking’s growth seems to have leveled off.

2023 fact-checkers by year
Duke Reporters’ Lab

Since 2018, the number of fact-checking sites has grown by 47%. While that’s an increase of 135, it is far slower than the preceding five years, when the number grew more than two and a half times, and slower still than the six-fold increase over the five years before that.

There also are important regional patterns. With lingering public health issues, climate disasters, and Russia’s ongoing war in Ukraine, factual information is still hard to come by in important corners of the world.

Before 2020, there was a significant growth spurt among fact-checking projects in Africa, Asia, Europe and South America. At the same time, North American fact-checking began to slow. Since then, growth in the fact-checking movement has plateaued in most of the world.

Fact-checkers by continent

The Long Haul

One good sign for fact-checking is the sustainability and longevity of many key players in the field. Almost half of the fact-checking organizations in the Reporters’ Lab count have been active for five years or more. And roughly 50 of them have been active for 10 years or more.

The average lifespan of an active fact-checking site is less than six years. The average lifespan of the 139 fact-checkers that are now inactive was not quite three years.

But the baby boom has ended. Since 2019, when a bumper crop of 83 fact-checkers went online, the number of new sites each year has steadily declined. The Reporters’ Lab count for 2022 is at 20, plus three additions in 2023 as of this June. That reduced the rate of growth from three years earlier by 72%.

The number of fact-checkers that closed down in that same period also declined, but not as dramatically. That means the net count of new and departing sites has gone from 66 in 2019 to 11 in 2022, plus one addition so far in 2023.

Net New Fact-checkers

Fact-checking Starts, Stops and Net

The Downshift

As was the case for much of the world, the pandemic period certainly contributed to the slower growth. But another reason is the widespread adoption of fact-checking in recent years by journalists and other researchers from nonpartisan think tanks and good-government groups. That has increased the number of people doing fact-checking but reduced the need for news organizations dedicated to this unique form of journalism.

With teams working in 108 countries, just over half of the nations represented in the United Nations have at least one organization that already produces fact-checks for digital media, newspapers, TV reports or radio. So in some countries, the market for fact-checks may be approaching saturation. As of June, 71 countries have more than one fact-checker.

Number of Fact-checkers Per Country

Another reason for the slower pace is that launching new fact-checking projects is challenging — especially in countries with repressive governments, limited press freedom and safety concerns for journalists. In other words, the places where fact-checking is most needed.

The 2023 World Press Freedom Index rates press conditions as “very serious” in 31 countries. And almost half of those countries (15 of 31) do not have any fact-checking sites. They are Bahrain, Djibouti, Eritrea, Honduras, Kuwait, Laos, Nicaragua, North Korea, Oman, Russia, Tajikistan, Turkmenistan, Vietnam and Yemen, along with Palestine, whose status is contested but which the Index also includes.

Remarkably, there are 62 fact-checking services in the 16 other countries on the “very serious” list. And in eight of those countries, there is more than one site. India, which ranks 161 out of 180 in the World Press Freedom Index, is home to half of those 62 sites. The other countries with more than one fact-checking organization are Bangladesh, China, Venezuela, Turkey, Pakistan, Egypt and Myanmar.

In some cases, fact-checkers from those countries must do their work from other parts of the world or hide their identities to protect themselves or their families. That typically means those journalists and researchers must work anonymously or report on a country as expats from somewhere else. Sometimes both.

At least three fact-checking teams from the Middle East take those precautions: Fact-Nameh (“The Book of Facts”), which reports on Iran from Canada; Tech 4 Peace, an Iraq-focused site that also has members who work in Canada; and Syria’s Verify-Sy, whose staff includes people who operate from Turkey and Europe.

Two other examples are elTOQUE DeFacto, a project of a Cuban news website that is legally registered in Poland; and the fact-checkers at the Belarusian Investigative Center, which is based in the Czech Republic.

In other cases, existing fact-checking organizations have also established separate operations in difficult places. The Indian sites BOOM and Fact Crescendo have set up fact-checking services in Bangladesh, while the French news agency Agence France-Presse (AFP) has fact-checkers that report on misinformation from Hong Kong, India and Myanmar, among other places.

There still are places where fact-checking is growing, and much of that has to do with organizations that have multiple outlets and bureaus — such as AFP, as noted above. The French international news service has about 50 active sites aimed at audiences in various countries and various languages.

India-based Fact Crescendo launched two new channels in 2022 — one for Thailand and another focused broadly on climate issues. Counting two other outlets launched the previous year, Fact Crescendo now has a total of eight sites.

The 2022 midterm elections in the United States added six new local fact-checking outlets to our global tally, all at the state level. Three of the new fact-checkers were the Arizona Center for Investigative Reporting, The Nevada Independent and Wisconsin Watch, all of which used a platform called Gigafact to generate quick-hit “Fact Briefs” for their audiences. But the Arizona Center is no longer participating. (For more about the 2022 U.S. elections, see “From Fact Deserts to Fact Streams” — a March 2023 report from the Reporters’ Lab.)

About the Reporters’ Lab and Its Census

The Duke Reporters’ Lab began tracking the international fact-checking community in 2014, when director Bill Adair organized a group of about 50 people who gathered in London for what became the first Global Fact meeting. Subsequent Global Facts led to the creation of the International Fact-Checking Network and its Code of Principles.

The Reporters’ Lab and the IFCN use similar criteria to keep track of fact-checkers, but use somewhat different methods and metrics. Here’s how we decide which fact-checkers to include in the Reporters’ Lab database and census reports. If you have questions, updates or additions, please contact Mark Stencel, Erica Ryan or Joel Luther.

* * *

Related links: Previous fact-checking census reports

April 2014: https://reporterslab.org/duke-study-finds-fact-checking-growing-around-the-world/

January 2015: https://reporterslab.org/fact-checking-census-finds-growth-around-world/

February 2016: https://reporterslab.org/global-fact-checking-up-50-percent/

February 2017: https://reporterslab.org/international-fact-checking-gains-ground/

February 2018: https://reporterslab.org/fact-checking-triples-over-four-years/

June 2019: https://reporterslab.org/number-of-fact-checking-outlets-surges-to-188-in-more-than-60-countries/

June 2020: https://reporterslab.org/annual-census-finds-nearly-300-fact-checking-projects-around-the-world/

June 2021: https://reporterslab.org/fact-checking-census-shows-slower-growth/

June 2022: https://reporterslab.org/fact-checkers-extend-their-global-reach-with-391-outlets-but-growth-has-slowed/


Vast gaps in fact-checking across the U.S. allow politicians to elude scrutiny

Report cover: “From Fact Deserts to Fact Streams: Expanding State and Local Fact-Checking in the U.S.” Click here to read the full report.

The candidates running last year for an open seat in Ohio’s 13th Congressional District exchanged a relentless barrage of scathing claims, counterclaims and counter-counterclaims.

Emilia Sykes was a former Democratic leader in the state legislature who came from a prominent political family. Her opponent called Sykes a lying, liberal career politician who raised her own pay, increased taxes on gas and retirement accounts, and took money from Medicare funds to “pay for free healthcare for illegals.” Other attack ads warned voters that the Democrat backed legislation that would release dangerous criminals from jail.1

Sykes’ opponent, Republican Madison Gesiotto Gilbert, was an attorney, a former Miss Ohio, and a prominent supporter of former President Donald Trump. Sykes and her backers called Gilbert a liar who would “push for tax cuts for millionaires” and slash Social Security and Medicare. Gilbert backed a total abortion ban with no exceptions, they warned (“not even if the rape victim is a 10 year old girl”), and she had the support of political groups that aim to “outlaw birth control.”2

Voters in one of the country’s most contested U.S. House races heard those allegations over and over — in TV ads, social media posts and from the candidates themselves.

But were any of those statements and allegations true? Who knows?

Ohio was one of 25 states where no statewide or local media outlet consistently fact-checked political statements. So voters in the 13th District were on their own to sort out the truth and the lies. 

But their experience was not unique. Throughout the country, few politicians had to worry about being held accountable for exaggerations or lies in ads or other claims during the campaign. 

An extensive review by the Duke Reporters’ Lab of candidates and races that were fact-checked found only a small percentage of politicians and public officials were held accountable for the accuracy of what they said.

The results were striking.

Governors were the most likely elected officials to face review by fact-checkers at the state and local level. But still fewer than half of the governors had even a single statement checked (19 out of 50).

For those serving in Congress, the chances of being checked were even lower. Only 33 of 435 U.S. representatives (8%) were checked. In the U.S. Senate, a mere 16 of 100 lawmakers were checked by their home state news media.

The smaller the office, the smaller the chance of being checked. Out of 7,386 state legislative seats, just 47 of those lawmakers were checked (0.6%). And among the more than 1,400 U.S. mayors of cities of 30,000 people or more, just seven were checked (0.5%).

These results build on an earlier Reporters’ Lab report3 immediately after the election, which showed vast geographical gaps in fact-checking at the state and local level. Voters in these “fact deserts” have few, if any, ways to keep up with misleading political claims on TV and social media. Nor can they easily hold public officials and institutions accountable for any inaccuracies and disinformation they spread.

A color-coded map showing which U.S. states have active fact-checking outlets.

Longstanding national fact-checking projects fill in some of the gaps. FactCheck.org, PolitiFact, The Washington Post, and the Associated Press sometimes focus on high-profile races at the state and local level. They and other national media outlets also monitor the statements of prominent state-level politicians who have their eyes fixed on higher offices — such as the White House.

But our review of the 2022 election finds that the legacy fact-checking groups have not scaled to the vast size and scope of the American political system. Voters need more fact-checks, on more politicians, more quickly. And fact-checkers need to develop more robust and creative ways to distribute and showcase those findings.

We found big gaps in coverage, but also opportunities for some relatively easy collaborations. Politicians and campaigns repeatedly use the same lines and talking points. Fact-checkers sometimes cite each other’s work when the same claims pop up in other places and other mouths. But there’s relatively little organized collaboration among fact-checkers to quickly respond to recycled claims. Collaborative projects in the international fact-checking community offer potential templates. Technology investments would help, too.

Who’s Getting Fact-Checked?

To examine the state of regional fact-checking, the Duke Reporters’ Lab identified 50 active and locally focused fact-checking projects from 25 states and the District of Columbia.4 That count was little changed from the national election years since 2016, when an average of 46 fact-checking projects were active at the state and local level.

The fact-checking came from a mix of TV news stations, newspaper companies, digital media sites and services, and two public radio stations. PolitiFact’s state news affiliates also include two university partnerships, including a student newspaper. (See the full report for a complete list and descriptions.) 

Active Local Fact-Checking Outlets by Year

A bar chart showing growth in local fact-checking.

Journalists from those news organizations cranked out 976 fact-checks, verifying the accuracy of more than 1,300 claims from Jan. 1, 2022, to Election Day. 

But thousands more claims went unchecked. That became clear when we began to determine who was getting fact-checked.

As part of our research, we reviewed the fact-checkers’ output in text, video and audio format. We identified a “claim” as a statement or image that served as the basis of a news report that analyzed its accuracy based on reliable evidence. That included a mix of political statements as well as other kinds of fact-checks — such as local issues, social trends and health concerns.

We excluded explanatory stories that did not analyze a specific claim or reach a conclusion. Of the more than 970 fact-checks we reviewed, about 13% examined multiple claims.

The Reporters’ Lab found that a vast majority of politicians at the state and local level elude the fact-checking process, from city council to statewide office. But elected officials and candidates in some places got more scrutiny than others. 

Some interesting findings:

The most-checked politician was Iowa Gov. Kim Reynolds, a Republican. Reynolds topped the list with 28 claims checked, largely because of two in-depth articles from the Gazette Fact Checker in Cedar Rapids, which covered 10 claims from her Condition of the State address in January 2022, and another 10 from her delivery of the Republican response to President Joe Biden’s State of the Union in March.

Other more frequently checked politicians included Michigan gubernatorial challenger Tudor Dixon, a Republican (18); Cindy Axne, a Democrat who lost her bid for reelection to a U.S. House seat in Iowa (16); and incumbent U.S. Sen. Ron Johnson of Wisconsin, a Republican (16).

Also near the top of the list were former President Trump, a Republican (15), who was sometimes checked on claims during local appearances; Michigan Gov. Gretchen Whitmer, a Democrat (15); Wisconsin Gov. Tony Evers, a Democrat (15); Evers’ Republican challenger Tim Michels (14); Arizona gubernatorial candidate Kari Lake, a Republican (13); and Florida’s Republican Gov. Ron DeSantis (12).

Most-Checked Politicians

A bar chart showing which politicians are most fact-checked.

Overall, individual claims by sitting governors were checked 130 times (10% of claims); by U.S. representatives 96 times (7%); by state legislators 77 times (6%); by U.S. senators 61 times (5%); and by mayors 11 times (1%).

Most-Checked Politicians By Office Held

A bar chart showing the distribution of fact-checks by office held.

For comparison, President Joe Biden’s claims were checked more than 100 times by national fact-checkers from PolitiFact, The Washington Post and others.

While these numbers focus on direct checking of the politicians themselves, fact-checkers also analyzed claims by other partisan sources, including deep-pocket political organizations running attack ads in many races.

There was more checking of Republicans/conservative politicians and political groups (553 claims, or 42%) than Democratic/progressive groups (382 claims, or 29%). If we look strictly at the 942 claims from claimants we identified as political, 59% were Republican/conservative and 41% were Democratic/progressive. 


Read the full report here.

Our Recommendations

Fact-checking is a challenging type of journalism. It requires speed, meticulous research and a thick skin. It also requires a willingness to call things as they are, instead of hiding behind the misleading niceties of both-siderism. And yet, over the past decade, dozens of state and local news organizations have adopted this new type of journalism. 

The 50 fact-checking programs we examined during last year’s midterm election invested time, energy and money to combat political falsehoods and push back against other types of misinformation. Even at a time of upheaval in the local news business, we have seen TV news stations, newspaper companies, and nonprofit newsrooms embrace this mission.

But all this work is not enough. 

Misinformation and disinformation spread far and fast, at a scale that is almost impossible for news media fact-checkers to match. If journalists aim to reestablish a common set of facts, we need to do more fact-checking.

Our recommendations for dramatically increasing local media’s capacity for fact-checking include: 

Invest in more fact-checking 

The challenge: Despite the diligent work of local fact-checking outlets in 25 states and the District of Columbia, only a relative handful of politicians and public officials were ever fact-checked. And in half the country, there was no active fact-checking at all.

The recommendation: It is clear that an investment in this vital journalism is sorely needed. Voters in “fact desert” states like Ohio and New Hampshire will be key to the 2024 elections. And those voters should be able to trust in local journalism to provide a check on the lies that politicians are sure to peddle in political ads, debates and other campaign events.

Even in states where local fact-checking efforts exist, they are severely outmatched by a tsunami of claims, as political organizations pump billions of dollars into campaign ads, and social media messages accelerate the spread of misinformation far and wide. The low numbers of claims checked locally in the 2022 Senate races in Arizona, Georgia, Nevada and Pennsylvania demonstrate that the journalists trying to keep up with the campaign cycle need additional staffing and financial resources.

One way to increase the volume of local fact-checking would be to incentivize projects like Gigafact and PolitiFact. These existing models can be replicated by other organizations and extended to additional states. The Gigafact partners in Arizona, Nevada and Wisconsin produced dozens of 140-word “fact briefs” in the run-up to the 2022 election. These structured fact-checks, which answer yes/no questions, have proved popular with audiences. Dee J. Hall, managing editor at Wisconsin Watch, which participated in the Gigafact pilot in 2022, reported that eight of the organization’s ten most popular stories in November were fact briefs.

The journalism education community can also help. During the 2022 election, PolitiFact worked with the journalism department at West Virginia University and the student newspaper at the University of Iowa to produce fact-checks for voters in their states. Expanding that model, potentially in collaboration with other national fact-checkers, could transform most of the barren “fact deserts” we’ve described in time for the 2024 general election campaign. 

Elevate fact-checking

The challenge: Fact-checking is still a niche form of reporting. It shares DNA with explanatory and investigative journalism. But it is rarely discussed at major news media conferences. There are few forums for fact-checkers at the state and local level to compare their efforts, learn from one another and focus on their distinctive reporting problems. 

The recommendation: As we continue increasing the volume of local fact-checking, audiences and potential funders need to view fact-checking with the same importance as investigative work. Investigative reporting has been a cornerstone of local news outlets’ identity and public service mission for decades. Fact-checking should be equally revered. Both are vital, closely related forms of journalism.

Some local news outlets already take this approach, with their investigative teams also producing fact-checking of claims. For example, 4 Investigates Fact Check at KOB-TV in New Mexico is an offshoot of its 4 Investigates team, and FactFinder 12 Fact Check at KWCH-TV in Kansas uses a similar model.

Fact-checkers also can elevate their work by explaining it more forcefully — on-air, online and even in person. This is an essential way to promote trust in their work. We found that 17 state and local fact-checking efforts do not provide any explanation of their process or methodology to their audiences. Offering this kind of basic guidance does not require creating and maintaining separate dedicated “about” or methodology pages. Instead, some fact-checkers, such as ConnectiFact and the Gazette Fact Checker in Iowa, embed explanations directly within their fact-checks. In this mobile era, that in-line approach might well be more important. Likewise, as TV continues to play an increasing role in fact-checking, broadcasters also need to help their viewers understand what they’re seeing.

Embrace technology and collaborate

The challenge: Several national fact-checkers in the United States work closely together with the Reporters’ Lab, as well as other academic researchers and independent developers, to test new approaches to their work. We’ve seen that same spirit of community in the International Fact-Checking Network at the Poynter Institute, which has fostered cross-border collaborations and technology initiatives. In contrast, few state and local news organizations in the United States have the capacity or technological know-how to experiment on their own. Fact-checking also has a low profile in journalism’s investigative and tech circles.

The recommendation: There is a critical need for more investment in technology to assist fact-checkers at the state and local level. As bad actors push misinformation on social media and politicians take advantage of new technologies to mislead voters, an equal effort must be made to boost the truth.

AI can be leveraged to better track the spread of misinformation, such as catching false talking points that recur and circulate around the country. A talking point tracker could help fact-checkers prioritize and respond to false claims that have already been fact-checked.

AI can also be leveraged to help with the debunking of false claims. Once a repeated talking point has been identified, a system using AI could then create the building blocks of a fact-check that a journalist could review and publish.
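As a rough illustration of the matching step such a tracker would need, here is a minimal sketch that compares an incoming claim against previously checked claims using TF-IDF cosine similarity. This is our own example, not an existing tool; a production tracker would more likely use semantic embeddings, and the claims and threshold here are invented.

```python
# Minimal sketch of a "talking point tracker": flag incoming claims that
# closely match claims fact-checkers have already rated. Uses TF-IDF
# cosine similarity; a production system would likely use embeddings.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

checked_claims = [  # invented examples of already fact-checked claims
    "The candidate voted to cut Social Security benefits.",
    "The new law would release dangerous criminals from jail.",
]

def find_matches(new_claim: str, threshold: float = 0.4):
    """Return (claim, score) pairs for checked claims similar to the new one."""
    vectorizer = TfidfVectorizer(stop_words="english")
    matrix = vectorizer.fit_transform(checked_claims + [new_claim])
    scores = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    return [
        (claim, round(float(score), 2))
        for claim, score in zip(checked_claims, scores)
        if score >= threshold
    ]

print(find_matches("This bill releases dangerous criminals from jail."))
```

Any claim scoring above the threshold could be routed to a fact-checker along with the earlier check, which is the prioritization step described above.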

But none of these ideas will get very far unless journalists are willing to collaborate. Collaboration can cut down on duplication and allow more effort to be spent on fact-checking new claims. The use of technology would also have a greater impact if more organizations are willing to swap data and make use of each other’s research.

Make fact-checking easier to find

The challenge: Fact-checking in the United States has grown significantly since 2017. But fact-checks are still easy to miss in cluttered digital news feeds. Existing technology can help fact-checkers raise their profiles. But some state and local fact-checkers don’t even have basic features that call attention to their reporting.

The recommendation: Nearly 180 fact-checking projects across the United States and around the world have embraced open-source systems designed to provide data that elevate their work in search results and on large social media and messaging services. State and local fact-checkers should adopt these systems as well.

The Reporters’ Lab joined with Google and Schema.org to develop a tagging system called ClaimReview. ClaimReview provides data that major digital platforms can use to recognize and suppress misinformation on their feeds. A second, related schema called MediaReview is generating similar data for visual misinformation. 

ClaimReview has helped feed a prominent collection of recent fact-checks on the front of the Google News page in half a dozen countries, including the U.S. But so far, most state and local fact-checking projects are not using ClaimReview. 
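For publishers, adopting ClaimReview largely amounts to embedding a small block of structured data in each fact-check page, typically inside a `<script type="application/ld+json">` tag. Here is a minimal sketch, in Python, of what that JSON-LD might look like; the claim, rating and URLs are invented for illustration, and real markup should follow the current schema.org and Google documentation.

```python
# Minimal sketch of ClaimReview markup serialized as JSON-LD; the claim,
# rating and URLs are invented for illustration.

import json

claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example.org/fact-checks/main-street-closure",
    "datePublished": "2023-03-15",
    "author": {"@type": "Organization", "name": "Example Fact Check"},
    "claimReviewed": "The state closed Main Street for a full year.",
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Jane Example"},
    },
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 2,
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "Mostly False",  # the human-readable verdict
    },
}

print(json.dumps(claim_review, indent=2))
```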

Meanwhile, the regional fact-checkers have even more foundational work to do. It is disappointing that more than a quarter of the active fact-checkers (13 of 50) have no dedicated page or tag where the public can find these stories. Inflexible publishing systems often make simple things hard. But all fact-checkers need to do more to showcase their work. Fact-checks have a long shelf life and enormous value to their audiences.


This project was a team effort. The report was written and led by Mark Stencel, co-director of the Duke Reporters’ Lab, and project manager Erica Ryan. Student researchers Sofia Bliss-Carrascosa and Belén Bricchi were significant contributors, as was Joel Luther, research and outreach coordinator for ClaimReview and MediaReview at the Reporters’ Lab.

Here’s how we decide which fact-checkers to include in the Reporters’ Lab database. The Lab continually collects new information about the fact-checkers it identifies, such as when they launched and how long they last. If you know of a fact-checking project that has been missed, please contact Mark Stencel, Erica Ryan or Joel Luther at the Reporters’ Lab.

Our thanks to Knight Foundation’s journalism program for supporting this research.

Disclosure: Stencel is an unpaid contributing editor to PolitiFact North Carolina.


1 https://nrcc.org/2022/08/31/fact-check-sykes-lies-to-oh-voters-in-first-tv-ad/

https://www.youtube.com/watch?v=pCQz6FCMMzo

https://congressionalleadershipfund.org/sykes-sided-with-criminals-over-public-safety/

2 https://host2.adimpact.com/admo/viewer/a9400662-bc20-4e34-9a44-42d478efa451/

https://dccc.org/dccc-releases-new-tv-ad-in-oh-13-wrong/

3 https://reporterslab.org/fact-deserts-leave-states-vulnerable-to-election-lies/

4 After an earlier report in November 2022, our Lab identified a few more election-year fact-checking efforts. That meant our total count for the year increased from 46 to 50. And the number of states that had fact-checking efforts in that period increased from 21 to 25.

 


Fact-checkers extend their global reach with 391 outlets, but growth has slowed

The number of fact-checkers around the world doubled over the past six years, with nearly 400 teams of journalists and researchers taking on political lies, hoaxes and other forms of misinformation in 105 countries.

The Duke Reporters’ Lab annual fact-checking census counted 391 fact-checking projects that were active in 2021. Of those, 378 are operating now.

That’s up from a revised count of 186 active sites in 2016 – the year when the Brexit vote and the U.S. presidential election elevated global concerns about the spread of inaccurate information and rumors, especially in digital media. Misleading posts about ethnic conflicts, wars, the climate and the pandemic only amplified those worries in the years since.

Since last year’s census, we have added 51 sites to our global fact-checking map and database. In those same 12 months, another seven fact-checkers closed down.

While this vital journalism now appears in at least 69 languages on six continents, the pace of growth in the international fact-checking community has slowed over the past several years.

The largest growth was in 2019, when 77 new fact-checking sites and organizations made their debut. Based on our updated counts since then, the number was 58 in 2020 and 22 last year. 


New Fact Checkers by Year
Duke Reporters’ Lab

(Note: The adjusted number of 2021 launches may increase over time as the Reporters’ Lab identifies other fact-checkers we have not yet discovered.)

These numbers may signal a worrisome trend, or they could mean that the growth of the past several years has saturated the market – or paused in the wake of the global pandemic. But we also expect our numbers for last year to eventually increase as we continue to identify other fact-checkers, as happens every year.

More than a third of the growth since 2019’s bumper crop came from existing fact-checking operations that added new outlets to expand their reach to new places and different audiences. That includes Agence France-Presse, the French international news service, which launched at least 17 new sites in that period. In Africa, Dubawa and PesaCheck opened nine new bureaus, while Asia’s Boom and Fact Crescendo opened five. In addition, Delfi and Pagella Politica in Europe and PolitiFact in North America each launched a new satellite, too.

Fact-checking has expanded quickly over the years in Latin America, but less so of late. Since 2019 we saw three launches in South America (one of which has folded) plus one more focused on Cuba. 


Active Fact-Checkers by Year
Duke Reporters’ Lab

The Reporters’ Lab is monitoring another trend: fact-checkers’ use of rating systems. These ratings are designed to succinctly summarize a fact-checker’s conclusions about political statements and other forms of potential misinformation. When we analyzed the use of these features in past reports, we found that about 80-90% of the fact-checkers we looked at relied on these meters and standardized labels to prominently convey their findings.

But that approach appears to be less common among newer fact-checkers. Our initial review of the fact-checkers that launched in 2020 found that less than half seemed to be using rating systems. And among the Class of 2021, only a third seemed to rely on predefined ratings. 

We also have seen established fact-checkers change their approach in handling ratings.

Examples of fact-checking meters from Público’s Prova dos Factos in Portugal, the Fact Investigation Platform’s Factometer in Armenia, OhmyFact from South Korea’s OhmyNews, and Nepal Fact Check from the Center for Media Research-Nepal.

The Norwegian fact-checking site Faktisk, for instance, launched in 2017 with a five-point, color-coded rating system that was similar to ones used by most of the fact-checkers we monitor: “helt sant” (for “absolutely true” in green) to “helt feil” (“completely false” in red). But during a recent redesign, Faktisk phased out its ratings. 

“The decision to move away from the traditional scale was hard and subject to a very long discussion and consideration within the team,” said editor-in-chief Kristoffer Egeberg in an email. “Many felt that a rigid system where conclusions had to ‘fit the glove’ became kind of a straitjacket, causing us to either drop claims that weren’t precise enough or too complex to fit into one fixed conclusion, or to instead of doing the fact-check – simply write a fact-story instead, where a rating was not needed.”

Egeberg also noted that sometimes “the color of the ratings became the main focus rather than the claim and conclusion itself, derailing the important discussion about the facts.”

We plan to examine this trend in the future and expect this discussion may emerge during the conversations at the annual Global Fact summit in Oslo, Norway, next week. 

The Duke Reporters’ Lab began keeping track of the international fact-checking community in 2014, when it organized a group of about 50 people who gathered in London for what became the first Global Fact meeting. This year about 10 times that many people – 500 journalists, technologists, truth advocates and academics – are expected to attend the ninth summit. The conferences are now organized by the International Fact-Checking Network, based at the Poynter Institute in St. Petersburg, Florida. This will be the group’s first large in-person meeting in three years.


Fact-Checkers by Continent
Duke Reporters’ Lab

Like their audiences, the fact-checkers are a multilingual community, and many of these sites publish their findings in multiple languages, either on the same site or, in some cases, on alternate sites. English is the most common, used on at least 166 sites, followed by Spanish (55), French (36), Arabic (14), Portuguese (13), Korean (13), German (12) and Hindi (11).

About 60% of the fact-checkers are affiliated with media organizations (226 out of 378). But there are other affiliations and business models too, including 24 with academic ties and 45 that are part of a larger nonprofit or non-governmental organization. Some of these fact-checkers have overlapping arrangements with multiple organizations. More than a fifth of the community (86 out of 378) operate independently.

About the census: 

Here’s how we decide which fact-checkers to include in the Reporters’ Lab database. The Lab continually collects new information about the fact-checkers it identifies, such as when they launched and how long they last. That’s why the updated numbers for earlier years in this report are higher than the counts the Lab included in earlier reports. If you have questions, updates or additions, please contact Mark Stencel, Erica Ryan or Joel Luther.

Related links: Previous fact-checking census reports

April 2014

January 2015

February 2016

February 2017

February 2018

June 2019

June 2020

June 2021


MediaReview: A next step in solving the misinformation crisis

When a 2019 video went viral after being edited to make House Speaker Nancy Pelosi look inebriated, it took 32 hours for one of Facebook’s independent fact-checking partners to rate the clip false. By then, the video had amassed 2.2 million views, 45,000 shares, and 23,000 comments – many of them calling her “drunk” or “a babbling mess.”

The year before, the Trump White House circulated a video that was edited to make CNN’s Jim Acosta appear to aggressively react to a mic-wielding intern during a presidential press conference.

A string of high-profile misleading videos like these in the run-up to the 2020 U.S. election stoked longstanding fears about skillfully manipulated videos, sometimes made with AI. The main worry then was how fast these doctored videos would become the next battleground in a global war against misinformation. But new research by the Duke Reporters’ Lab and a group of participating fact-checking organizations in 22 countries found that other, far less sophisticated forms of media manipulation were much more prevalent.

By using a unified tagging system called MediaReview, the Reporters’ Lab and 43 fact-checking partners collected and categorized more than 1,000 fact-checks based on manipulated media content. Those accumulated fact-checks revealed that:

  • While we began this process in 2019 expecting deepfakes and other sophisticated media manipulation tactics to be the most imminent threat, we’ve predominantly seen low-budget “cheap fakes.” The vast majority of media-based misinformation is rated “Missing Context,” or, as we’ve defined it, “presenting unaltered media in an inaccurate manner.” In total, fact-checkers have applied the Missing Context rating to 56% of the MediaReview entries they’ve created.
  • Most of the fact-checks in our dataset, 78%, come from content on Meta’s platforms Facebook and Instagram, likely driven by the company’s well-funded Third-Party Fact-Checking Program. These platforms are also more likely to label or remove fact-checked content. More than 80% of fact-checked posts on Instagram and Facebook are either labeled to add context or no longer on the platform. In contrast, more than 60% of fact-checked posts on YouTube and Twitter remain intact, without labeling to indicate their accuracy.
  • Without reliable tools for archiving manipulated material that is removed or deleted, it is challenging for fact-checkers to track trends and bad actors. Fact-checkers used a variety of tools, such as the Internet Archive’s Wayback Machine, to attempt to capture this ephemeral misinformation; but only 67% of submitted archive links were viewable on the chosen archive when accessed at a later date, while 33% were not.

The Reporters’ Lab research also demonstrated MediaReview’s potential — especially based on the willingness and enthusiastic participation of the fact-checking community. With the right incentives for participating fact-checkers, MediaReview provides efficient new ways to help intercept manipulated media content — in large part because so many variations of the same claims appear repeatedly around the world, as the pandemic has continuously demonstrated.

The Reporters’ Lab began developing the MediaReview tagging system around the time of the Pelosi video, when Google and Facebook separately asked the Duke team to explore possible tools to fight the looming media misinformation crisis.

MediaReview is a sibling to ClaimReview, an initiative the Reporters’ Lab has led since 2015 that sought to create infrastructure for fact-checkers to make their articles machine-readable and easily used by search engines, mobile apps and other projects. Called “one of the most successful ‘structured journalism’ projects ever launched,” the ClaimReview schema has proven immensely valuable. Adopted by 177 fact-checking organizations around the world, ClaimReview has tagged 136,744 articles, establishing a large and valuable corpus of fact-checks: tens of thousands of statements from politicians and social media accounts around the world, analyzed and rated by independent journalists.

But ClaimReview proved insufficient to address the new, specific challenges presented by misinformation spread through multimedia. Thus, in September 2019, the Duke Reporters’ Lab began working with the major search engines, social media services, fact-checkers and other interested stakeholders on an open process to develop MediaReview, a new sibling of ClaimReview that creates a standard for manipulated video and images. Throughout pre-launch testing phases, 43 fact-checking outlets have used MediaReview to tag 1,156 images and videos, again providing valuable, structured information about whether pieces of content are legitimate and how they may have been manipulated.

In an age of misinformation, MediaReview, like ClaimReview before it, offers something vital: real-time data on which pieces of media are truthful and which ones are not, as verified by the world’s fact-checking journalists. 

But the work of MediaReview is not done. New fact-checkers must be brought on board in order to reflect the diversity and global reach of the fact-checking community, the major search and social media services must incentivize the creation and proper use of MediaReview, and more of those tech platforms and other researchers need to learn about, and make full use of, the opportunities this new tagging system can provide.

An Open Process

MediaReview is the product of a two-year international effort to get input from the fact-checking community and other stakeholders. It was first adapted from a guide to manipulated video published by The Washington Post, which was initially presented at a Duke Tech & Check meeting in the spring of 2019. The Reporters’ Lab worked with Facebook, Google, YouTube, Schema.org, the International Fact-Checking Network, and The Washington Post to expand this guide to include a similar taxonomy for manipulated images. 

The global fact-checking community has been intimately involved in the process of developing MediaReview. Since the beginning of the process, the Reporters’ Lab has shared all working drafts with fact-checkers and has solicited feedback and comments at every step. We and our partners have also presented to the fact-checking community several times, including at the Trusted Media Summit in 2019, a fact-checkers’ community meeting in 2020, Global Fact 7 in 2020, Global Fact 8 in 2021 and several open “office hours” sessions with the sole intent of gathering feedback.

Throughout development and testing, the Reporters’ Lab held extensive technical discussions with Schema.org to properly validate the proposed structure and terminology of MediaReview, and solicited additional feedback from third-party organizations working in similar spaces, including the Partnership on AI, Witness, Meedan and Storyful.

Analysis of the First 1,156

As of February 1, 2022, fact-checkers from 43 outlets spanning 22 countries have now made 1,156 MediaReview entries.

Number of outlets creating MediaReview by country


Number of MediaReview entries created by outlet


Our biggest lesson in reviewing these entries: The most common way misinformation is conveyed through multimedia is not what we expected. We began this process in 2019 expecting deepfakes and other sophisticated media manipulation tactics to be an imminent threat, but we’ve predominantly seen low-budget “cheap fakes.” What we’ve seen consistently throughout testing is that the vast majority of media-based misinformation is rated “Missing Context,” or, as we’ve defined it, “presenting unaltered media in an inaccurate manner.” In total, fact-checkers have applied the Missing Context rating to 56% of the MediaReview entries they’ve created.

The “Original” rating has been the second most applied, accounting for 20% of the MediaReview entries created. As we’ve heard from fact-checkers through our open feedback process, a substantial portion of the media being fact-checked is not manipulated at all; rather, it consists of original videos of people making false claims. Going forward, we know we need to be clear about the use of the “Original” rating as we help more fact-checkers get started with MediaReview, and we need to continue to emphasize the use of ClaimReview to counter the false claims contained in these kinds of videos.
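To make these ratings concrete, here is a rough sketch of the structured data behind a single “Missing Context” entry. Field names follow the schema.org MediaReview proposal as we understand it; the content, URLs and the exact enumeration string are illustrative rather than definitive, since the vocabulary shifted during testing.

```python
# Rough sketch of a MediaReview entry for a video rated "Missing Context,"
# serialized as JSON-LD. Field names follow the schema.org MediaReview
# proposal as we understand it; all content and URLs are invented.

import json

media_review = {
    "@context": "https://schema.org",
    "@type": "MediaReview",
    "datePublished": "2022-01-20",
    "author": {"@type": "Organization", "name": "Example Fact Check"},
    # Machine-readable counterpart of the "Missing Context" rating.
    "mediaAuthenticityCategory": "DecontextualizedContent",
    "originalMediaContextDescription": "Clip predates the event it claims to show.",
    "itemReviewed": {
        "@type": "MediaReviewItem",
        "mediaItemAppearance": [{
            "@type": "VideoObjectSnapshot",
            "contentUrl": "https://example.com/posts/viral-video",
        }],
    },
}

print(json.dumps(media_review, indent=2))
```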

Throughout the testing process, the Duke Reporters’ Lab has monitored incoming MediaReview entries and provided feedback to fact-checkers where applicable. We’ve heard from fact-checkers that this feedback was valuable and helped clarify the rating system.

Our review of the media links checked by third-party fact-checkers shows that a vast majority of fact-checked media thus far exists on Facebook:

Share of links in the MediaReview dataset by platform.


Facebook’s well-funded Third-Party Fact-Checking Program likely contributes to this rate; fact-checkers are paid directly to check content on Facebook’s platforms, making that content more prevalent in our dataset.

We also reviewed the current status of links checked by fact-checkers and tagged with MediaReview. With different platforms having different policies on how they deal with misinformation, some of the original posts are intact, others have been removed by either the platform or the user, and some have a context label appended with additional fact-check information. By platform, Instagram is the most likely to append additional information, while YouTube is the most likely to present fact-checked content in its original, intact form, not annotated with any fact-checking information: 72.5% of the media checked from YouTube are still available in their original format on the platform.

Status of fact-checked media broken down by platform, showing the percentage of checked media either labeled with additional context, removed, or presented fully intact.


In addition, we noted that fact-checkers have often (roughly 25% of the time) entered an archival link in the “Media URL” field, in an attempt to preserve the video or image, since this ephemeral misinformation is often quickly deleted by either the platforms or the users. Notably, though, these existing archive systems are unreliable; only 67% of submitted archive links were viewable on the archive, while 33% were not. While we found that Perma.cc was the most reliable archiving system used by fact-checkers, it successfully presented only 80% of checked media, and its status as a paid archival tool leaves an opportunity to build a new system to preserve fact-checked media.
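An audit like the one behind these numbers is straightforward to automate: request each archived URL and record whether it resolves. Below is a minimal sketch with placeholder URLs; a fuller audit would also confirm that the archived page actually renders the checked media, not just that the server responds.

```python
# Minimal sketch of an archive-link audit: request each archived URL and
# record whether it resolves. The URLs below are placeholders.

import requests

archive_links = [
    "https://web.archive.org/web/20210101000000/https://example.com/video",
    "https://perma.cc/XXXX-XXXX",
]

def check_links(urls, timeout=10):
    results = {}
    for url in urls:
        try:
            resp = requests.get(url, timeout=timeout)
            results[url] = resp.ok  # True for any 2xx response
        except requests.RequestException:
            results[url] = False
    print(f"{sum(results.values())}/{len(results)} archive links viewable")
    return results

check_links(archive_links)
```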

Success rate of archival tools used by fact-checkers in properly displaying the fact-checked media.


Next Steps

Putting MediaReview to use: Fact-checkers have emphasized to us the need for social media companies and search engines to make use of these new signals. They’ve highlighted that usability testing would help ensure that MediaReview data is displayed prominently on the tech platforms.

Archiving the images and videos: As noted above, current archiving systems are insufficient to capture the media misinformation fact-checkers are reporting on. Currently, fact-checkers using MediaReview are limited to quoting or describing the video or image they checked and including the URL where they discovered it. There’s no easy, consistent workflow for preserving the content itself. Manipulated images and videos are often removed by social media platforms or deleted or altered by their owners, leaving no record of how they were manipulated or presented out of context. In addition, if the same video or image emerges again in the future, it can be difficult to determine if it has been previously fact-checked. A repository of this content — which could be saved automatically as part of each MediaReview submission — would allow for accessibility and long-term durability for archiving, research, and more rapid detection of misleading images and video. 

Making more: We continue to believe that fact-checkers need incentives to continue making this data. The more fact-checkers use these schemas, the more we increase our understanding of the patterns and spread of misinformation around the world — and the ability to intercept inaccurate and sometimes dangerous content. The effort required to produce ClaimReview or MediaReview is relatively low, but adds up cumulatively — especially for smaller teams with limited technological resources. 

While fact-checkers created the first 1,156 entries solely to help the community refine and test the schema, further use by fact-checkers must be encouraged by the tech platforms’ willingness to adopt and utilize the data. Currently, 31% of the links in our MediaReview dataset are still fully intact where they were first posted; they have not been removed or had any additional context added. Fact-checkers have displayed their eagerness to research manipulated media, publish detailed articles assessing its veracity, and make their assessments available to the platforms to help stem the tide of misinformation. Search engines and social media companies must now decide to use and display these signals.

Appendix: MediaReview Development Timeline

MediaReview is the product of a two-year international effort involving the Duke Reporters’ Lab, the fact-checking community, the tech platforms and other stakeholders. 

Mar 28, 2019

Phoebe Connelly and Nadine Ajaka of The Washington Post first presented their idea for a taxonomy classifying manipulated video at a Duke Tech & Check meeting. 

Sep 17, 2019

The Reporters’ Lab met with Facebook, Google, YouTube, Schema.org, the International Fact-Checking Network, and The Washington Post in New York to plan expanding the taxonomy to cover manipulated images as well.

Oct 17, 2019

The Reporters’ Lab emailed a first draft of the new taxonomy to all signatories of the IFCN’s Code of Principles and asked for comments.

Nov 26, 2019

After incorporating suggestions from the first draft document and generating a proposal for Schema.org, we began testing MediaReview on a selection of fact-checks of images and videos. Our internal testing helped refine the draft of the Schema proposal, and we shared an updated version with IFCN signatories on November 26.

Jan 30, 2020

The Duke Reporters’ Lab, IFCN and Google hosted a Fact-Checkers Community Meeting at the offices of The Washington Post. Forty-six people, representing 21 fact-checking outlets and 15 countries, attended. We presented slides about MediaReview, asked fact-checkers to test the creation process on their own, and again asked for feedback from those in attendance.

Apr 16, 2020

The Reporters’ Lab began a testing process with three of the most prominent fact-checkers in the United States: FactCheck.org, PolitiFact, and The Washington Post. We have publicly shared their test MediaReview entries, now totaling 421, throughout the testing process.

Jun 1, 2020

We wrote and circulated a document summarizing the remaining development issues with MediaReview, including new issues we had discovered through our first phase of testing. We also proposed new Media Types for “image macro” and “audio,” with new associated ratings, and circulated those in a document as well. We published links to both documents on the Reporters’ Lab site (We want your feedback on the MediaReview tagging system) and published a short explainer detailing the basics of MediaReview (What is MediaReview?).

Jun 23, 2020

We presented MediaReview again at Global Fact 7 in June 2020, detailing our efforts so far and asking for feedback on the newly proposed media types and ratings and on our Feedback and Discussion document. The YouTube video of that session has been viewed more than 500 times by fact-checkers around the globe, and dozens participated in the live chat.

Apr 1, 2021

We hosted another session on MediaReview for IFCN signatories on April 1, 2021, again seeking feedback and updating fact-checkers on our plans to further test the Schema proposal.

Jun 3, 2021

In June 2021, the Reporters’ Lab worked with Google to add MediaReview fields to the Fact Check Markup Tool and expand testing to a global userbase. We regularly monitored MediaReview and maintained regular communication with fact-checkers who were testing the new schema.

Nov 10, 2021

We held an open feedback session with fact-checkers on November 10, 2021, providing the community another chance to refine the schema. Overall, fact-checkers have told us that they’re pleased with the process of creating MediaReview and that its similarity to ClaimReview makes it easy to use. As of February 1, 2022, fact-checkers have made a total of 1,156 MediaReview entries. 

For more information about MediaReview, contact Joel Luther.


Reporters’ Lab Takes Part in Eighth ‘Global Fact’ Summit

The Duke Reporters’ Lab spent this year’s eighth Global Fact conference helping the world’s fact-checkers learn more about tagging systems that can extend the reach of their work; encouraging a sense of community among organizations around the globe; and discussing new research that offers potent insights into how fact-checkers do their jobs.

This year’s Global Fact took place virtually for the second time, following years of meeting in person all around the world, in cities such as London, Buenos Aires, Madrid, Rome, and Cape Town. More than 1,000 fact-checkers, academic researchers, industry experts, and representatives from technology companies attended the virtual conference.

Over three days, the Reporters’ Lab team participated in five conference sessions and hosted a daily virtual networking table.

  • Reporters’ Lab director and IFCN co-founder Bill Adair delivered opening remarks for the conference, focused on how fact-checkers around the world have closely collaborated in recent years.
  • Mark Stencel, co-director of the Reporters’ Lab, moderated the featured talk with Tom Rosenstiel, the Eleanor Merrill Visiting Professor on the Future of Journalism at the Philip Merrill College of Journalism at the University of Maryland and coauthor of The Elements of Journalism. Rosenstiel previously served as executive director of the American Press Institute. He discussed research into how the public responds to the core values of journalism and how fact-checkers might be able to build more trust with their audience.
  • Thomas Van Damme presented findings from his master’s thesis, “Global Trends in Fact-Checking: A Data-Driven Analysis of ClaimReview,” during a panel discussion moderated by Lucas Graves and featuring Joel Luther of the Reporters’ Lab and Karen Rebelo, a fact-checker from BOOM in India. Van Damme’s analysis reveals fascinating trends from five years of ClaimReview data and demonstrates ClaimReview’s usefulness for academic research.
  • Luther also prepared two pre-recorded webinars that were available throughout the conference.

In addition, the Reporters’ Lab is excited to reconnect with fact-checkers again at 8 a.m. Eastern on Wednesday, November 10, for a feedback session on MediaReview. We’re pleased to report that fact-checkers have now used MediaReview to tag their fact-checks of images and videos 841 times, and we’re eager to hear any additional feedback and continue the open development process we began in 2019 in close collaboration with the IFCN.


MediaReview Testing Expands to a Global Userbase

The Duke Reporters’ Lab is launching the next phase of development of MediaReview, a tagging system that fact-checkers can use to identify whether a video or image has been manipulated.

Conceived in late 2019, MediaReview is a sibling to ClaimReview, which allows fact-checkers to clearly label their articles for search engines and social media platforms. The Reporters’ Lab has led an open development process, consulting with tech platforms like Google, YouTube and Facebook, and with fact-checkers around the world.

Testing of MediaReview began in April 2020 with the Lab’s FactStream partners: PolitiFact, FactCheck.org and The Washington Post. Since then, fact-checkers from those three outlets have logged more than 300 examples of MediaReview for their fact-checks of images and videos.

We’re ready to expand testing to a global audience, and we’re pleased to announce that fact-checkers can now add MediaReview to their fact-checks through Google’s Fact Check Markup Tool, which many of the world’s fact-checkers already use to create ClaimReview. This will bring MediaReview testing to more fact-checkers around the world, the next step in an open process that will lead to a more refined final product.

ClaimReview was developed through a partnership of the Reporters’ Lab, Google, Jigsaw, and Schema.org. It provides a standard way for publishers of fact-checks to identify the claim being checked, the person or entity that made the claim, and the conclusion of the article. This standardization enables search engines and other platforms to highlight fact-checks, and can power automated products such as the FactStream and Squash apps being developed in the Reporters’ Lab.
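
For readers unfamiliar with the markup, here is a minimal sketch of a ClaimReview entry, built in Python and serialized as the JSON-LD that publishers embed in fact-check pages. The outlet, claim, and rating below are invented for illustration.

```python
# A minimal ClaimReview sketch serialized as JSON-LD. All names, URLs and
# rating values are invented for illustration.
import json

claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example.org/fact-checks/widget-claim",  # the fact-check article
    "author": {"@type": "Organization", "name": "Example Fact Check"},
    "datePublished": "2021-06-01",
    "claimReviewed": "The city spent $1 billion on widgets.",  # the claim checked
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Jane Politician"},  # who made it
    },
    "reviewRating": {
        "@type": "Rating",
        "alternateName": "False",  # the article's conclusion
    },
}

print(json.dumps(claim_review, indent=2))
```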

Likewise, MediaReview aims to standardize the way fact-checkers talk about manipulated media. The goal is twofold: to allow fact-checkers to provide information to the tech platforms that a piece of media has been manipulated, and to establish a common vocabulary to describe types of media manipulation. By communicating clearly in consistent ways, independent fact-checkers can play an important role in informing people around the world.
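
A MediaReview entry follows the same pattern, adding fields that describe the media itself and how it was manipulated. Because the schema was still in draft during this testing phase, the property names and the rating value in this sketch are approximations drawn from the proposal and may differ from the final specification.

```python
# An illustrative MediaReview entry in the same JSON-LD style. The schema was
# still a draft at this stage, so the property names ("mediaAuthenticityCategory",
# "MediaReviewItem", "mediaItemAppearance") and the rating value follow the
# proposal as we understood it and may differ from the final specification.
import json

media_review = {
    "@context": "https://schema.org",
    "@type": "MediaReview",
    "author": {"@type": "Organization", "name": "Example Fact Check"},
    "datePublished": "2021-06-15",
    # One of the draft manipulation ratings, e.g. for a doctored video:
    "mediaAuthenticityCategory": "TransformedContent",
    "itemReviewed": {
        "@type": "MediaReviewItem",
        "mediaItemAppearance": [
            {
                "@type": "VideoObject",
                # Where the checked media was found (invented URL):
                "contentUrl": "https://example.com/video/123",
            }
        ],
    },
}

print(json.dumps(media_review, indent=2))
```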

The Duke Reporters’ Lab has led the open process to develop MediaReview, and we are eager to help fact-checkers get started with testing it. Contact Joel Luther for questions or to set up a training session. International Fact-Checking Network signatories who have questions about the process can contact the IFCN.

For more information, see the new MediaReview section of our ClaimReview Project website.


Fact-checking census shows slower growth

Fact-checkers are now found in at least 102 countries – more than half the nations in the world. 

The latest census by the Duke Reporters’ Lab identified 341 active fact-checking projects, up 51 from last June’s report.

But after years of steady and sometimes rapid growth, there are signs that trend is slowing, even though misleading content and political lies have played a growing role in contentious elections and the global response to the coronavirus pandemic.

Our tally revealed a slowdown in the number of new fact-checkers, especially when we looked at the upward trajectory of projects since the Lab began its yearly survey and global fact-checking map seven years ago. 

Based on our adjusted tally, the number of fact-checking projects that launched since the most recent Reporters’ Lab census was less than a third of the number that started in the 12 months before that.

From July 2019 to June 2020, there were 61 new fact-checkers. In the year since then, there were 19.

Meanwhile, 21 fact-checkers shut down in that same two-year period beginning in June 2019. And 54 additions to the Duke database in that same period were fact-checkers that were already up and running prior to the 2019 census.

Looking at the count by calendar year also underscored the slowdown in the time of COVID. 

The Reporters’ Lab counted 36 fact-checking projects that launched in 2020. That was below the annual average of 53 for the preceding six calendar years – and less than half the number of startups that began fact-checking in 2019. The 2020 launches were also the lowest number of new fact-checkers we’ve counted since 2014. 

New Fact Checkers by Year
Source: Duke Reporters’ Lab

(Note: The adjusted number of 2020 launches may increase slightly over time as the Reporters’ Lab identifies other fact-checkers we have not yet discovered.)

The slowdown comes after a period of rapid expansion that began in 2016. That was the year when the Brexit vote in the United Kingdom and the presidential race in the United States raised public alarm about the impact of misinformation.

In response, major tech companies such as Facebook and Google elevated fact-checks on their platforms and provided grants, direct funding and other incentives for new and existing fact-checking organizations. (Disclosure: Google and Facebook fund some of the Duke lab’s research on technologies for fact-checkers.)

The 2018-2020 numbers presented below are adjusted from earlier census reports to include fact-checkers that were subsequently added to our database. 

Active Fact-Checkers by Year
Source: Duke Reporters’ Lab

Note: 2021 YTD includes one fact-checker that closed in 2021. 

Growth has been steady on every continent except North America. In the United States, where fact-checking first took off in the early 2010s, there are 61 active fact-checkers now. That’s down slightly from the 2020 election year, when there were 66. But the U.S. is still home to more fact-checking projects than any other country. Of the current U.S. fact-checkers, more than half (35 of 61) focus on state and local politics.

Fact-Checkers by Continent
Source: Duke Reporters’ Lab

Among other details we found in this year’s census:

  • More countries, more staying power: Based on our adjusted count, fact-checkers were active in at least 47 countries in 2014. That more than doubled to 102 now. And most of the fact-checkers that started in 2014 or earlier (71 out of 122) are still active today.


  • Fact-checking is more multilingual: The active fact-checkers produce reports in nearly 70 languages, from Albanian to Urdu. English is the most common, used on 146 different sites, followed by Spanish (53), French (33), Arabic (14), Portuguese (12), Korean (11) and German (10). Fact-checkers in multilingual countries often present their work in more than one language – either in translation on the same site, or on different sites tailored for specific language communities, including original reporting for those audiences.


  • More than media: More than half of the current fact-checkers (195 of 341) are affiliated with media organizations, including national news publishers and broadcasters, local news sources and digital-only outlets. But there are other models, too. At least 37 are affiliated with non-profit groups, think tanks and nongovernmental organizations, and 26 are affiliated with academic institutions. Some of the fact-checkers involve cross-organization partnerships and have multiple affiliations. But to be listed in our database, the fact-checking must be organized and produced in a journalistic fashion.


  • Turnover: In addition to the 341 current fact-checkers, the Reporters’ Lab database and map also include 112 inactive projects. From 2014 to 2020, an average of 15 fact-checking projects a year closed down. Limited funding and expiring grants are among the most common reasons fact-checkers shuttered their sites. But there also are short-run, election-year projects and partnerships that intentionally close down once the voting is over. Of all the inactive projects, 38 produced fact-checks for a year or less. The average lifespan of an inactive fact-checker is two years and three months; the active fact-checkers have been in business twice as long, an average of more than four and a half years.

The Reporters’ Lab process for selecting fact-checkers for its database is similar to the standards used by the International Fact-Checking Network, a project based at the Poynter Institute in St. Petersburg, Florida. IFCN currently involves 109 organizations that each agree to a code of principles. The Lab’s database includes all the IFCN signatories, but it also counts any related outlets, such as the state-level news partners of PolitiFact in the United States, the wide network of multilingual fact-checking sites that France’s AFP has built across its global bureau system, and the fact-checking teams Africa Check and PesaCheck have mobilized in countries across Africa.

Reporters’ Lab project manager Erica Ryan and student researchers Amelia Goldstein and Leah Boyd contributed to this year’s report.

About the census: Here’s how we decide which fact-checkers to include in the Reporters’ Lab database. The Lab continually collects new information about the fact-checkers it identifies, such as when they launched and how long they last. That’s why the updated numbers for earlier years in this report are higher than the counts the Lab included in earlier reports. If you have questions, updates or additions, please contact Mark Stencel or Joel Luther.

Image at top: The fact-checking collaborative Ecuador Verifica (ecuadorverifica.org) launched in January with a traffic-light metaphor to rate claims. The site was one of the 19 new fact-checking projects the Reporters’ Lab added to its database in the past year.

Related Links: Previous fact-checking census reports

April 2014

January 2015

February 2016

February 2017

February 2018

June 2019

June 2020


Fact-checking count tops 300 for the first time

The number of active fact-checkers around the world has topped 300, about 100 more than the Duke Reporters’ Lab counted this time a year ago.

Some of that growth is due to the 2020 election in the United States, where the Lab’s global database and map now count 58 fact-checking projects. That’s more than twice as many as any other country, and nearly a fifth of the current worldwide total: 304 in 84 countries.

But the U.S. is not driving the worldwide increase.

The last U.S. presidential election sounded an alert about the effects of misinformation, especially on social media. But those concerns weren’t just “made in America.” From the 2016 Brexit vote in the U.K. to this year’s coronavirus pandemic, events around the globe have led to new fact-checking projects that call out rumors, debunk hoaxes and help the public identify falsehoods. 

The current fact-checking tally is up 14 from the 290 the Lab reported in its annual fact-checking census in June.

Over the past four years, growth in the U.S. has been sluggish, at least compared with other parts of the world, where Facebook, WhatsApp and Google have provided grants and incentives to enlist fact-checkers’ help in thwarting misinformation on their platforms. (Disclosure: Facebook and Google also provided support for research at the Reporters’ Lab.)

By looking back at the dates when each fact-checker began publishing, we now see there were about 145 projects in 59 countries that were active at some point in 2016. Of those 145, about a third were based in the United States.

The global total more than doubled from 2016 to now. And the number outside the U.S. increased two and a half times, from 97 to 246.

During that same four-year span, there were relatively large increases elsewhere. Several countries in Asia saw growth spurts, including Indonesia (which went from 3 fact-checkers to 9), South Korea (3 to 11) and India (3 to 21).

In comparison, the U.S. count in that period is up from 48 to 58.

The comparison is also striking when counting the fact-checkers by continent. The number in South America doubled while the counts for Africa and Asia more than tripled. The North American count was up too — by a third. But the non-U.S. increase in North America was more in line with the pace elsewhere, nearly tripling from 5 to 14. 

Fact-checkers 2016-20
Source: Duke Reporters’ Lab

These global tallies leave out 19 other fact-checkers that launched since 2016 but are no longer active. Among those 19 were short-lived, election-focused initiatives, sometimes involving multiple news partners, in France, Norway, Mexico, Sweden, Nigeria, the Philippines, Argentina and the European Union.

Several factors seem to account for the slower growth in the U.S. For instance, many of the country’s big news media outlets have already done fact-checking for years, especially during national elections. So there is less room for fact-checking to grow at that level. 

USA Today was one of the few major media newcomers to the national fact-checking scene in the U.S. since 2016. The others were more niche, including The Daily Caller’s Check Your Fact, the Poynter Institute’s MediaWise Teen Fact-Checking Network and The Dispatch. In addition, the French news service AFP started a U.S.-based effort as part of its push to establish fact-checking teams in many of its dozens of international bureaus. The National Academies of Sciences, Engineering and Medicine also launched a fact-checking service called “Based on Science,” one of a number of science- and health-focused fact-checking projects around the world.

Of the 58 U.S. fact-checkers, 36 are focused on state and local politics, especially during regional elections. While some of these local media outlets have been at it for years, including some of PolitiFact’s longstanding state-level news partners, others work on their own, such as WISC-TV in Madison, Wisconsin, which began its News 3 Reality Check segments in 2004. There also are one-off election projects that come to an end as soon as the voting is over.

A wildcard in our Lab’s current U.S. count is the effort to increase local fact-checking across large national news chains. One such newcomer since the 2016 election is Tegna, a locally focused TV company with more than 50 stations across the country. It encourages its stations’ news teams to produce fact-checking reports as part of the company’s “Verify” initiative, though some stations do more regular fact-checking than others. Tegna also has a national fact-checking team that produces segments for use by its local stations. A few other media chains are mounting similar efforts, including some of the local stations owned by Nexstar Inc. and more than 260 newspapers and websites operated by USA Today’s owner, Gannett. Those are promising signs.

There’s still plenty of room for more local fact-checking in the U.S. At least 20 states have one or more regionally focused fact-checking projects already. The Reporters’ Lab is keeping a watchful eye out for new ventures in the other 30. 

Note about our methodology: Here’s how we decide which fact-checkers to include in the Reporters’ Lab database. The Lab continually collects new information about the fact-checkers it identifies, such as when they launched and how long they last. That’s why the updated numbers for 2016 used in this article are higher than the counts the Lab reported in its annual fact-checking census from February 2017. If you have questions or updates, please contact Mark Stencel or Joel Luther.

Related Links: Previous fact-checking census reports

April 2014

January 2015

February 2016

February 2017

February 2018 

June 2019

June 2020
