
Reporters’ Lab Takes Part in Eighth ‘Global Fact’ Summit

The Reporters’ Lab team participated in five conference sessions and hosted a daily virtual networking table at the conference, which drew more than 1,000 attendees.

By Joel Luther – November 8, 2021

The Duke Reporters’ Lab spent the eighth Global Fact conference helping the world’s fact-checkers learn more about tagging systems that can extend the reach of their work; encouraging a sense of community among organizations around the globe; and discussing new research that offers potent insights into how fact-checkers do their jobs.

This year’s Global Fact took place virtually for the second time, following years of meeting in person all around the world, in cities such as London, Buenos Aires, Madrid, Rome, and Cape Town. More than 1,000 fact-checkers, academic researchers, industry experts, and representatives from technology companies attended the virtual conference.

Over three days, the Reporters’ Lab team participated in five conference sessions and hosted a daily virtual networking table.

  • Reporters’ Lab director and IFCN co-founder Bill Adair delivered opening remarks for the conference, focused on how fact-checkers around the world have closely collaborated in recent years.
  • Mark Stencel, co-director of the Reporters’ Lab, moderated the featured talk with Tom Rosenstiel, the Eleanor Merrill Visiting Professor on the Future of Journalism at the Philip Merrill College of Journalism at the University of Maryland and coauthor of The Elements of Journalism. Rosenstiel previously served as executive director of the American Press Institute. He discussed research into how the public responds to the core values of journalism and how fact-checkers might be able to build more trust with their audience.
  • Thomas Van Damme presented findings from his master’s thesis, “Global Trends in Fact-Checking: A Data-Driven Analysis of ClaimReview,” during a panel discussion moderated by Lucas Graves and featuring Joel Luther of the Reporters’ Lab and Karen Rebelo, a fact-checker from BOOM in India. Van Damme’s analysis reveals fascinating trends from five years of ClaimReview data and demonstrates ClaimReview’s usefulness for academic research.
  • Luther also prepared two pre-recorded webinars that were available throughout the conference.

In addition, the Reporters’ Lab is excited to reconnect with fact-checkers at 8 a.m. Eastern on Wednesday, November 10, for a feedback session on MediaReview. We’re pleased to report that fact-checkers have now used MediaReview to tag their fact-checks of images and videos 841 times, and we’re eager to hear any additional feedback and continue the open development process we began in 2019 in close collaboration with the IFCN.


Keeping a sense of community in the IFCN

For Global Fact 8, a reminder of the IFCN's important role in bringing fact-checkers together.

By Bill Adair – October 20, 2021

My opening remarks for Global Fact 8 on Oct. 20, 2021,  delivered for the second consecutive year from Oslo, Norway.

Thanks, Baybars!

Welcome to Norway! 

(Pants on Fire!)

It’s great to be here once again among your fjords and gnomes and your great Norwegian meatballs! 

(Pants on Fire!)

What….I’m not in Norway?

Well, it turns out I’m still in Durham…again! 

And once again we are joined together through the magic of video and more importantly by our strong sense of community. That’s the theme of my remarks today.

Seven years ago, a bunch of us crammed into a classroom in London. I had organized the conference with Poynter because I had heard from several of you that there was a desire for us to come together. It was a magical experience that we shared at the London School of Economics. We were able to discuss our common experiences and challenges. 

As I noted in a past speech, one of our early supporters, Tom Glaisyer of the Democracy Fund, gave us some critical advice when I was planning the meeting with the folks at Poynter. Tom said, “Build a community, not an association.” His point was that we should be open and welcoming and that we shouldn’t erect barriers about who could take part in our group. That’s been an important principle in the IFCN and one that’s been possible with Poynter as our home.

You can see the community every week in our email listserv. Have you looked at some of those threads? Lately they’ve helped fact-checkers find the status of COVID passports in countries around the world and learn which countries allow indoor dining and which are still in lockdown. All of that is possible because of the wonderful way we help each other.

Global Fact keeps getting bigger and bigger. It was so big in Cape Town that we needed a drone to take our group photo. At this rate, for our next get-together, we’ll need to take the group photo from a satellite.

Tom’s advice has served us really well. By establishing the IFCN as a program within the Poynter Institute, a globally renowned journalism organization, we have not only built a community, we avoided the bureaucracy and frustration of creating a whole new organization.

We stood up the IFCN quickly, and it became a wonderfully global organization, with a staff and advisory board that represents a mix of fact-checkers from every continent — except for Antarctica (at least not yet!). 

Our community succeeded in creating a common Code of Principles that may well be the only ethical framework in journalism that includes a real verification and enforcement mechanism. 

The Poynter-based IFCN, with its many connections in journalism and tech, has raised millions of dollars for fact-checkers all over the world. 

And we have done all this without bloated overhead, new legal entities and insular meetings that would distract us from our real work — finding facts and dispelling bullshit. For most fact-checkers, running our own organizations or struggling for resources within our newsrooms is already time-consuming enough.

As we look to the future, some fact-checkers from around the world have offered ideas about how the IFCN can improve. I like many of their suggestions.

Let’s start with the money the IFCN distributes. The fundraising I mentioned is amazing — more than $3 million since March 2020. It’s pretty cool how that gets distributed – all of that money came from major tech companies in the United States, and 84% of it goes to fact-checkers OUTSIDE the US. 

But we can be even more transparent about all of that, just as IFCN’s principles demand transparency of its signatories.  We can also continue to expand the advisory board to be even more representative of our growing community.

Some other improvements:

We should demand more data and transparency from our funders in the tech community. Fact-checkers also can advocate to make sure that our large tech partners treat members of our community fairly. And we can work together more closely to find new sources of revenue to pay for our work, whether that’s through IFCN or other collaborations. 

One possible way is to arrange a system so fact-checkers can get paid for publishing fact-checks with ClaimReview, the tagging system that our Reporters’ Lab developed with Google and Jigsaw. (A bit of our own transparency – they supported our work on that and a similar product for images and video called MediaReview.) Our goal at Duke is to help fact-checkers expand their audiences and create new ways for you to get paid for your important work. 

Our community also needs more diverse funding sources, to avoid relying too heavily on any one company or sector. But we also need to be realistic and recognize the financial and legal limitations of the funders, and of our fact-checkers, which represent an incredibly wide range of business models. Some of you have good ideas about that. And we should be talking more about all of that. 

The IFCN and Global Fact provide essential venues for us to discuss these issues and make progress together – as do the regional fact-checking collaboratives and networks, from Latin America to Central Europe to Africa, and the numerous country-specific collaborations in Japan, Indonesia and elsewhere. What a dazzling movement we have built – together. 

If there’s a message in all this, it’s that all of us need to convene and talk more often. The pandemic has made that difficult. This is the second year we have had to meet virtually — and like most of you, I too am sick of talking to my laptop, as I am now.

For now, though, let’s be grateful for the community we have. It’s sunny here in Norway today.

(Pants on Fire!)

I’m looking forward to seeing you in person next year!

MediaReview Testing Expands to a Global Userbase

The Duke Reporters’ Lab is launching the next phase of development of MediaReview, a tagging system that fact-checkers can use to identify whether a video or image has been manipulated.

By Joel Luther – June 3, 2021

The Duke Reporters’ Lab is launching the next phase of development of MediaReview, a tagging system that fact-checkers can use to identify whether a video or image has been manipulated.

Conceived in late 2019, MediaReview is a sibling to ClaimReview, which allows fact-checkers to clearly label their articles for search engines and social media platforms. The Reporters’ Lab has led an open development process, consulting with tech platforms like Google, YouTube and Facebook, and with fact-checkers around the world.

Testing of MediaReview began in April 2020 with the Lab’s FactStream partners: PolitiFact, FactCheck.org and The Washington Post. Since then, fact-checkers from those three outlets have logged more than 300 examples of MediaReview for their fact-checks of images and videos.

We’re ready to expand testing to a global audience, and we’re pleased to announce that fact-checkers can now add MediaReview to their fact-checks through Google’s Fact Check Markup Tool, which many of the world’s fact-checkers already use to create ClaimReview. This will bring MediaReview testing to more fact-checkers around the world, the next step in the open process that will lead to a more refined final product.

ClaimReview was developed through a partnership of the Reporters’ Lab, Google, Jigsaw, and Schema.org. It provides a standard way for publishers of fact-checks to identify the claim being checked, the person or entity that made the claim, and the conclusion of the article. This standardization enables search engines and other platforms to highlight fact-checks, and can power automated products such as the FactStream and Squash apps being developed in the Reporters’ Lab.
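
For readers who have not seen the markup itself, here is a minimal sketch of a ClaimReview record, written as a Python dictionary that serializes to the JSON-LD publishers embed in their pages. The claim, names, dates and URLs are invented for illustration; in practice, many fact-checkers generate this markup with tools such as Google’s Fact Check Markup Tool rather than writing it by hand.

```python
import json

# A minimal, illustrative ClaimReview record. The claim, names, dates and
# URLs below are invented for this example, not taken from a real fact-check.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example.org/fact-checks/500-schools",          # the fact-check article
    "author": {"@type": "Organization", "name": "Example Fact-Check"},
    "datePublished": "2021-06-03",
    "claimReviewed": "The town built 500 new schools last year.",  # the claim being checked
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Jane Doe"},         # who made the claim
        "datePublished": "2021-05-28",
    },
    "reviewRating": {
        "@type": "Rating",
        "alternateName": "False",  # the article's conclusion, in the checker's own rating scale
    },
}

# Publishers typically embed this as JSON-LD in a <script type="application/ld+json"> tag.
print(json.dumps(claim_review, indent=2))
```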

Likewise, MediaReview aims to standardize the way fact-checkers talk about manipulated media. The goal is twofold: to allow fact-checkers to provide information to the tech platforms that a piece of media has been manipulated, and to establish a common vocabulary to describe types of media manipulation. By communicating clearly in consistent ways, independent fact-checkers can play an important role in informing people around the world.

The Duke Reporters’ Lab has led the open process to develop MediaReview, and we are eager to help fact-checkers get started with testing it. Contact Joel Luther for questions or to set up a training session. International Fact-Checking Network signatories who have questions about the process can contact the IFCN.

For more information, see the new MediaReview section of our ClaimReview Project website.



Fact-checking census shows slower growth

The number of new projects dipped, even as fact-checking reached more countries than ever

By Mark Stencel & Joel Luther – June 2, 2021

Fact-checkers are now found in at least 102 countries – more than half the nations in the world. 

The latest census by the Duke Reporters’ Lab identified 341 active fact-checking projects, up 51 from last June’s report.

But after years of steady and sometimes rapid growth, there are signs that trend is slowing, even though misleading content and political lies have played a growing role in contentious elections and the global response to the coronavirus pandemic.

Our tally revealed a slowdown in the number of new fact-checkers, especially when we looked at the upward trajectory of projects since the Lab began its yearly survey and global fact-checking map seven years ago. 

The number of fact-checking projects that launched since the most recent Reporters’ Lab census was less than a third of the number that started in the 12 months before that, based on our adjusted tally. 

From July 2019 to June 2020, there were 61 new fact-checkers. In the year since then, there were 19.

Meanwhile, 21 fact-checkers shut down in that same two-year period beginning in June 2019. And 54 additions to the Duke database in that same period were fact-checkers that were already up and running prior to the 2019 census.

Looking at the count by calendar year also underscored the slowdown in the time of COVID. 

The Reporters’ Lab counted 36 fact-checking projects that launched in 2020. That was below the annual average of 53 for the preceding six calendar years – and less than half the number of startups that began fact-checking in 2019. The 2020 launches were also the lowest number of new fact-checkers we’ve counted since 2014. 

Chart: New Fact-Checkers by Year (Duke Reporters’ Lab)

(Note: The adjusted number of 2020 launches may increase slightly over time as the Reporters’ Lab identifies other fact-checkers we have not yet discovered.)

The slowdown comes after a period of rapid expansion that began in 2016. That was the year when the Brexit vote in the United Kingdom and the presidential race in the United States raised public alarm about the impact of misinformation.

In response, major tech companies such as Facebook and Google elevated fact-checks on their platforms and provided grants, direct funding and other incentives for new and existing fact-checking organizations. (Disclosure: Google and Facebook fund some of the Duke lab’s research on technologies for fact-checkers.)

The 2018-2020 numbers presented below are adjusted from earlier census reports to include fact-checkers that were subsequently added to our database. 

Chart: Active Fact-Checkers by Year, 2021 Fact-Checking Census (Duke Reporters’ Lab)

Note: 2021 YTD includes one fact-checker that closed in 2021. 

Growth has been steady on almost every continent except North America. In the United States, where fact-checking first took off in the early 2010s, there are 61 active fact-checkers now. That’s down slightly from the 2020 election year, when there were 66. But the U.S. is still home to more fact-checking projects than any other country. Of the current U.S. fact-checkers, more than half (35 of 61) focus on state and local politics. 

Chart: Fact-Checkers by Continent (Duke Reporters’ Lab)

Among other details we found in this year’s census:

  • More countries, more staying power: Based on our adjusted count, fact-checkers were active in at least 47 countries in 2014. That more than doubled to 102 now. And most of the fact-checkers that started in 2014 or earlier (71 out of 122) are still active today.

 

  • Fact-checking is more multilingual: The active fact-checkers produce reports in nearly 70 languages, from Albanian to Urdu. English is the most common, used on 146 different sites, followed by Spanish (53), French (33), Arabic (14), Portuguese (12), Korean (11) and German (10). Fact-checkers in multilingual countries often present their work in more than one language – either in translation on the same site, or on different sites tailored for specific language communities, including original reporting for those audiences.

 

  • More than media: More than half of the current fact-checkers (195 of 341) are affiliated with media organizations, including national news publishers and broadcasters, local news sources and digital-only outlets. But there are other models, too. At least 37 are affiliated with non-profit groups, think tanks and nongovernmental organizations, and 26 are affiliated with academic institutions. Some of the fact-checkers involve cross-organization partnerships and have multiple affiliations. But to be listed in our database, the fact-checking must be organized and produced in a journalistic fashion.

 

  • Turnover: In addition to the 341 current fact-checkers, the Reporters’ Lab database and map also include 112 inactive projects. From 2014 to 2020, an average of 15 fact-checking projects a year closed down. Limited funding and expiring grants are among the most common reasons fact-checkers shuttered their sites. But there also are short-run, election-year projects and partnerships that intentionally close down once the voting is over. Of all the inactive projects, 38 produced fact-checks for a year or less. The average lifespan of an inactive fact-checker is two years and three months. The active fact-checkers have been in business twice as long – an average of more than four and a half years.

The Reporters’ Lab process for selecting fact-checkers for its database is similar to the standards used by the International Fact-Checking Network – a project based at the Poynter Institute in St. Petersburg, Florida. IFCN currently involves 109 organizations that each agree to a code of principles. The Lab’s database includes all the IFCN signatories, but it also counts any related outlets – such as the state-level news partners of PolitiFact in the United States, the wide network of multilingual fact-checking sites that France’s AFP has built across its global bureau system, and the fact-checking teams Africa Check and PesaCheck have mobilized in countries across Africa. 

Reporters’ Lab project manager Erica Ryan and student researchers Amelia Goldstein and Leah Boyd contributed to this year’s report.

About the census: Here’s how we decide which fact-checkers to include in the Reporters’ Lab database. The Lab continually collects new information about the fact-checkers it identifies, such as when they launched and how long they last. That’s why the updated numbers for earlier years in this report are higher than the counts the Lab included in earlier reports. If you have questions, updates or additions, please contact Mark Stencel or Joel Luther.

Image at top: The fact-checking collaborative Ecuador Verifica (ecuadorverifica.org) launched in January with a traffic-light metaphor to rate claims. The site was one of the 19 new fact-checking projects the Reporters’ Lab added to its database in the past year.

Related Links: Previous fact-checking census reports

April 2014

January 2015

February 2016

February 2017

February 2018

June 2019

June 2020


Fact-checking count tops 300 for the first time

The Reporters' Lab finds fact-checkers at work in 84 countries – but growth in the U.S. has slowed

By Mark Stencel & Joel Luther – October 13, 2020

The number of active fact-checkers around the world has topped 300 — about 100 more than the Duke Reporters’ Lab counted this time a year ago.

Some of that growth is due to the 2020 election in the United States, where the Lab’s global database and map now finds 58 fact-checking projects. That’s more than twice as many as any other country, and nearly a fifth of the current worldwide total: 304 in 84 countries. 

But the U.S. is not driving the worldwide increase.

The last U.S. presidential election sounded an alert about the effects of misinformation, especially on social media. But those concerns weren’t just “made in America.” From the 2016 Brexit vote in the U.K. to this year’s coronavirus pandemic, events around the globe have led to new fact-checking projects that call out rumors, debunk hoaxes and help the public identify falsehoods. 

The current fact-checking tally is up 14 from the 290 the Lab reported in its annual fact-checking census in June.

Over the past four years, growth in the U.S. has been sluggish — at least compared with other parts of the world, where Facebook, WhatsApp and Google have provided grants and incentives to enlist fact-checkers’ help in thwarting misinformation on their platforms. (Disclosure: Facebook and Google also provided support for research at the Reporters’ Lab.)

By looking back at the dates when each fact-checker began publishing, we now see there were about 145 projects in 59 countries that were active at some point in 2016. Of those 145, about a third were based in the United States.

The global total more than doubled from 2016 to now. And the number outside the U.S. increased two and a half times — from 97 to 246.

During that same four years, there were relatively big increases elsewhere. Several countries in Asia saw big growth spurts — including Indonesia (which went from 3 fact-checkers to 9), South Korea (3 to 11) and India (3 to 21).

In comparison, the U.S. count in that period is up from 48 to 58.

The comparison is also striking when counting the fact-checkers by continent. The number in South America doubled while the counts for Africa and Asia more than tripled. The North American count was up too — by a third. But the non-U.S. increase in North America was more in line with the pace elsewhere, nearly tripling from 5 to 14. 

Chart: Fact-Checkers, 2016-20 (Duke Reporters’ Lab)

These global tallies leave out 19 other fact-checkers that launched since 2016 but are no longer active. Among those 19 were short-lived, election-focused initiatives, sometimes involving multiple news partners, in France, Norway, Mexico, Sweden, Nigeria, the Philippines, Argentina and the European Union.

Several factors seem to account for the slower growth in the U.S. For instance, many of the country’s big news media outlets have already done fact-checking for years, especially during national elections. So there is less room for fact-checking to grow at that level. 

USA Today was one of the few major media newcomers to the national fact-checking scene in the U.S. since 2016. The others were more niche, including The Daily Caller’s Check Your Fact, the Poynter Institute’s MediaWise Teen Fact-Checking Network and The Dispatch. In addition, the French news service AFP started a U.S.-based team as part of its push to establish fact-checking operations in many of its dozens of international bureaus. The National Academies of Sciences, Engineering and Medicine also launched a fact-checking service called “Based on Science” — one of a number of science- and health-focused fact-checking projects around the world.

Of the 58 U.S. fact-checkers, 36 are focused on state and local politics, especially during regional elections. While some of these local media outlets have been at it for years, including some of PolitiFact’s longstanding state-level news partners, others work on their own, such as WISC-TV in Madison, Wisconsin, which began its News 3 Reality Check segments in 2004. There also are one-off election projects that come to an end as soon as the voting is over.

A wildcard in our Lab’s current U.S. count is the push to increase local fact-checking across large national news chains. One such newcomer since the 2016 election is Tegna, a locally focused TV company with more than 50 stations across the country. It encourages its stations’ news teams to produce fact-checking reports as part of the company’s “Verify” initiative — though some stations do more regular fact-checking than others. Tegna also has a national fact-checking team that produces segments for use by its local stations. A few other media chains are mounting similar efforts, including some of the local stations owned by Nexstar Inc. and more than 260 newspapers and websites operated by USA Today’s owner, Gannett. Those are promising signs. 

There’s still plenty of room for more local fact-checking in the U.S. At least 20 states have one or more regionally focused fact-checking projects already. The Reporters’ Lab is keeping a watchful eye out for new ventures in the other 30. 

Note about our methodology: Here’s how we decide which fact-checkers to include in the Reporters’ Lab database. The Lab continually collects new information about the fact-checkers it identifies, such as when they launched and how long they last. That’s why the updated numbers for 2016 used in this article are higher than the counts the Lab reported in its annual fact-checking census from February 2017. If you have questions or updates, please contact Mark Stencel or Joel Luther.

Related Links: Previous fact-checking census reports

April 2014

January 2015

February 2016

February 2017

February 2018 

June 2019

June 2020


‘It’s great to be here in Norway!’

For the kickoff of Global Fact 7, a celebration of the IFCN's community.

By Bill Adair – June 22, 2020

My opening remarks for Global Fact 7, delivered from Oslo, Norway*, on June 22, 2020.

I’m going to do something a little different this year. I’m going to fact-check my own opening remarks using this antique PolitiFact Pants on Fire button. You know Pants on Fire – it’s our rating for the most ridiculous falsehoods.

First, I want to say that it’s great to be here in Norway! (Pants on Fire!)

I love Norway because I’ve always loved the great plastic furniture I buy from your famous furniture store IKEA! (Pants on Fire!)

I am accompanied today by this Norwegian gnome, which I know is very much a symbol of your great country! (Pants on Fire!)

Okay….I’m actually here in Durham, North Carolina, but thanks to the magic of pixels and…Baybars!…I’m with you!

First, some news…

Baybars Orsek and Bill Adair during the broadcast for Global Fact 7.

As you probably know, every year before Global Fact, the Duke Reporters’ Lab conducts a census of fact-checking to count the world’s fact-checkers and we are out today with the number. This is the product of painstaking work by Mark Stencel, Joel Luther, Mimi Goldstein and Matthew Griffin. Our count this year is 290 fact-checking projects in 83 countries. That’s up from 188 in 60 countries a year ago.

I heard that and my first thought was that …. fact-checking keeps growing!

I should note that Mark Stencel has been working hard on this, staying up all night for the last few days to get it done. So I should reveal my secret: we pay him in coffee!

This morning, I’m going to talk briefly about community.

Back in 2014, when we started planning the first meeting of the world’s fact-checkers – a gathering small enough that we could all squeeze into a classroom at the London School of Economics – Tom Glaisyer of the Democracy Fund gave me some important advice. Build a community, Tom said, not an association. The way to help the fact-checking movement was to be inviting and encourage journalists to start fact-checking. We’ve done that, and this meeting, and our group, keeps getting larger. You could even say that … fact-checking keeps growing! 

And as a bonus, we also managed to establish the Code of Principles, which provides an important incentive for transparency and fairness in your fact-checking.

My favorite example of the IFCN’s spirit of community is the simplest: our email threads. They are often amazing! A fact-checker will write with a problem they are having and community members from all over the world will respond with suggestions and even help them do the work.

Did you see the amazing one a couple of months ago? Samba of Africa Check wrote about a video that claimed to show violence against Africans in China. He knew it was fake but was not sure where it was from, so he circulated the video by email. That led to a remarkable exchange.

A coordinator from Witness in the United States said the video had been posted on Reddit, suggesting it was from New York. Jacques Pezet of Liberation in France took the image and used Google Maps to find the New York intersection where it was filmed. And then Gordon Farrer, an Australian researcher, used Google Street View to identify the business – a dental office called Brace Yourself.

All of this showed up in Samba’s fact-check in Africa Check in Senegal.

Amazing! All the product of our community!

Another great example: the tremendous work by the IFCN bringing together the world’s fact-checkers to debunk falsehoods about COVID-19. The CoronaVirus Alliance has now collected more than 6,000 fact-checks. It, too, is a product of community, organized by Baybars and Cris Tardaguila.

And one more: MediaReview. You’re going to hear about it tomorrow. It grew out of some great work by the Washington Post and we’ve been working on it with Baybars and PolitiFact and FactCheck.org and fact-checkers from around the world that attended a meeting in January. It’s a new tagging system like ClaimReview that you’ll be able to use for videos and images. I’m more excited about MediaReview than anything I’ve done since PolitiFact because it could really have an impact in the battle against misinformation. 

Finally, I want to give some shoutouts to two marvelous people who embody this commitment to community. Peter Cunliffe-Jones has been an amazing builder who has done extraordinary things to bring fact-checking to Africa. And Laura Zommer has been tireless helping dozens of fact-checkers get started in Latin America.

Together, they show what’s wonderful about the IFCN: they believe in our important journalism and they have given their time and energy to help it grow.

Our community grows thanks to these wonderful leaders. I look forward to sharing a glass of wine with them — and you — next year in Oslo!

Bundle up!

*Actually, these remarks were delivered from my backyard in Durham, North Carolina.

 


Annual census finds nearly 300 fact-checking projects around the world

Growth is fueled by politics, protests and pandemic

By Mark Stencel & Joel Luther – June 22, 2020

With elections, unrest and a global pandemic generating a seemingly endless supply of falsehoods, the Duke Reporters’ Lab finds at least 290 fact-checking projects are busy debunking those threats in 83 countries.

That count is up from 188 active projects in more than 60 countries a year ago, when the Reporters’ Lab issued the annual census at the Global Fact Summit in South Africa. There has been so much growth that the number of active fact-checkers added in the past year alone is more than double the total the Lab counted when it began keeping track in 2014.

There has been plenty of news to keep those fact-checkers busy, including widespread protests in countries such as Chile and the United States. Events like these attract a broad range of new fact-checkers — some from well-established media companies, as well as industrious startups, good-government groups and journalism schools. 

Our global database and map show considerable growth in Asia, particularly India, where the Lab currently counts at least 20 fact-checkers. We also saw a spike in Chile that started with the nationwide unrest there last fall.

Fact-Checkers by Continent Since June 2019

Africa: 9 to 19
Asia: 35 to 75
Australia: 5 to 4
Europe: 61 to 85
North America: 60 to 69
South America: 18 to 38

But the coronavirus pandemic has been the story that has topped almost every fact-checking page in every country since February.

From protests to pandemic: Chile’s Factchecking.cl has given its mascot a mask.

At least five fact-checkers on the Lab’s map were already focused on public health and medical claims. One of the newest is The Healthy Indian Project, which launched last year. But the pandemic has turned almost every fact-checking operation into a team of health reporters. And the International Fact-Checking Network also has coordinated coverage through its #CoronaVirusFacts Alliance.

The pandemic has also turned IFCN’s 2020 Global Fact meeting into a virtual conference this week, instead of the in-person gathering originally planned for Oslo, Norway. And among the themes participants will be talking about are the institutional factors that have generated more interest and attention for fact-checkers. 

To combat increasing online misinformation, major digital platforms in the United States, including Facebook, WhatsApp, Google and YouTube, have provided incentives to fact-checkers, including direct contributions, competitive grants and help with technological infrastructure to increase the distribution of their work. (Disclosure: Facebook and Google separately help fund research and development projects at the Reporters’ Lab, and the Lab’s co-directors have served as judges for some grants.)

Many of these funding opportunities were specifically for signatories of the IFCN’s Code of Principles, a charter that requires independent assessors to regularly review the editorial and ethical practices of each fact-checker that wants its status verified. 

A growing number of fact-checkers are also part of national and regional partnerships. These short-term collaborations can evolve into longer-term partnerships, as we’ve seen with Brazil’s Comprova, Colombia’s RedCheq and Bolivia Verifica. They also can inspire participating organizations to continue fact-checking on their own. 

Over time, the Reporters’ Lab has tried to monitor these contributors and note when they have developed into fact-checkers that should be listed in their own right. That’s why our database shows considerable growth in South Korea — home to the long-standing SNU FactCheck partnership based at Seoul National University’s Institute of Communications Research.

As has been the case with each year’s census, some of the growth also comes from established fact-checkers that came to the attention of the Reporters’ Lab after last June’s census was published — offset by at least nine projects that closed down in the months that followed.

But the overall trend was still strong: 68 of the projects in the database launched since the start of 2019. And more than half of them (40 of the 68) opened for business after the 2019 census, including 11 so far in the first half of 2020. And most of them appear to have staying power. Of those 68, only four are no longer operating. And three of those were election-related collaborations that launched as intentionally short-term projects.

We also have tried to be more thorough about discerning among specific projects and outlets that are produced or distributed by different teams within the same or related organizations. The variety of strong fact-checking programs and web pages produced by variously affiliated French public broadcasters is a good example. (Here’s how we decide which fact-checkers to include in the Reporters’ Lab database.)

The increasing tally of fact-checkers, which continues a trend that started in 2014, is remarkable. While this is a challenging time for journalism in just about every country, public alarm about the effects of misinformation is driving demand for credible reporting and research — the work a growing list of fact-checkers are busy doing around the world.

The Reporters’ Lab is grateful for the contributions of student researchers Amelia Goldstein and Matthew Griffin; journalist and media/fact-checking trainer Shady Gebril; fact-checkers Enrique Núñez-Mussa of FactCheckingCL and EunRyung Chong of SNU FactCheck; and the staff of the International Fact-Checking Network. The Reporters’ Lab updates its fact-checking database throughout the year. If you have updates or information, please contact Mark Stencel and Joel Luther.

Related Links: Previous fact-checking census reports

April 2014

January 2015

February 2016

February 2017

February 2018 

June 2019


Pop-up fact-checking moves online: Lessons from our user experience testing

After it became clear pop-up fact-checking was too difficult to display on a TV screen, we moved to the web.

By Jessica Mahone – June 11, 2020

We initially wanted to build pop-up fact-checking for a TV screen. But for nearly a year, people have told us in surveys and in coffee shops that they like live fact-checking but they need more information than they can get on a TV.

The testing is a key part of our development of Squash, our groundbreaking live fact-checking product. We started by interviewing a handful of users of our FactStream app. We wanted to know how they found out about the app, how they find fact checks about things they hear on TV, and what they would need to trust live fact-checking. As we saw in our “Red Couch Experiments” in 2018, they were excited about the concept but they wanted more than a TV screen allowed. 

We supplemented those interviews with conversations in coffee shops – "guerrilla research" in user experience (UX) terms. And again, the people we spoke with were excited about the concept but wanted more information than a 1740×90 pixel display could accommodate.

The most common request was the ability to access the full published fact-check. Some wanted to know if more than one fact-checker had vetted the claim, and if so, did they all reach the same conclusion? Some just wanted to be able to pause the video. 

Since those things weren’t possible with a conventional TV display, we pivoted and began to imagine what live fact-checking would look like on the web. 

Bringing Pop-Up Fact-Checking to the Web

In an online whiteboard session, our Duke Tech & Check Cooperative team discussed many possibilities for bringing live fact-checking online. Then our UX team — students Javan Jiang and Dora Pekec and I — designed a new interface for live fact-checking and tested it in a series of simple open-ended preference surveys. 

In total, 100 people responded to these surveys, in addition to the eight interviews above and a large experiment with 1,500 participants we did late last year about whether users want ratings in on-screen displays (they do). 

A common theme emerged in the new research: Make live fact-checking as non-disruptive to the viewing experience as possible. More specifically, we found three things that users want and need from the live fact-checking experience.

  • Users prefer a fact-checking display beneath the video. In our initial survey, users could choose whether they preferred a display beside or beneath the video. About three-quarters of respondents said that a display beneath the video was less disruptive to their viewing, with several telling us that this placement was similar to existing video platforms such as YouTube. 
  •  Users need “persistent onboarding” to make use of the content they get from live fact-checking. A user guide or FAQ is not enough. Squash can’t yet provide real-time fact-checking. It is a system that matches claims made during a televised event to claims previously checked. But users need to be reminded that they are seeing a “related fact-check,” not necessarily a perfect match to the claim they just heard. “Persistent onboarding” means providing users with subtle reminders in the display. For example, when a user hovers over the label “Related Fact Check,” a small box could explain that this is not a real-time fact check but an already published fact check about a similar claim made in the past. This was one of the features users liked most because it kept them from having to find the information themselves.
  • Users prefer to have all the available information on the initial screen. Our first test allowed users to expand the display to see more information about the fact check, such as the publisher of the fact check and an explanation of what statement triggered the system to display a fact check. But users said that having to toggle the display to see this information was disruptive. 
Users told us they wanted more on-screen explanations, sometimes called “persistent onboarding.”

More to Learn

Though we’ve learned a lot, some big questions remain. We still don’t know what live fact-checking looks like under less-than-ideal conditions. For example, how would users react to a fact check when the spoken claim is true but the relevant fact check is about a claim that was false? 

And we need to figure out timing, particularly for multi-speaker events such as debates. When is the right time to display a fact-check after a politician has spoken? And what if the screen is now showing another politician?

And how can we appeal to audiences that are skeptical of fact-checking? One respondent specifically said he’d want to be able to turn off the display because “none of the fact-checkers are credible.” What strategies or content would help make such audiences more receptive to live fact-checking? 

As we wrestle with those questions, moving live fact-checking to the web still opens up new possibilities, such as the ability to pause content (we call that “DVR mode”), read fact-checks,  and return to the event. We are hopeful this shift in platform will ultimately bring automated fact-checking to larger audiences.


What is MediaReview?

FAQs on the new schema we're helping to develop for fact-checks of images and videos.

By Joel Luther – June 11, 2020

MediaReview is a schema – a tagging system that web publishers can use to identify different kinds of content. It was built specifically for fact-checkers to identify manipulated images and videos, and we think of it as a sibling to ClaimReview, the schema developed by the Reporters’ Lab that allows fact-checkers to identify their articles for search engines and social media platforms.

By tagging their articles with MediaReview, publishers are essentially telling the world, “this is a fact-check of an image or video that may have been manipulated.” The goal is twofold: to allow fact-checkers to provide information to the tech platforms that a piece of media has been manipulated, and to establish a common vocabulary to describe types of media manipulation.
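
As a rough sketch of what such a tag could look like, here is an example in the same JSON-LD style as ClaimReview, again written as a Python dictionary. Because MediaReview was still under development when this was written, the property names ("mediaAuthenticityCategory", "originalMediaContextDescription") and the rating label shown below are assumptions modeled on the draft proposal, and the article and video URLs are invented.

```python
import json

# An illustrative sketch only: MediaReview was still a draft at the time, so
# the property names and rating label below are assumptions modeled on the
# proposal, not a final specification. The URLs and description are invented.
media_review = {
    "@context": "https://schema.org",
    "@type": "MediaReview",
    "url": "https://example.org/fact-checks/edited-video",   # the fact-check article
    "author": {"@type": "Organization", "name": "Example Fact-Check"},
    "datePublished": "2020-06-11",
    # The piece of media being checked.
    "itemReviewed": {
        "@type": "VideoObject",
        "contentUrl": "https://example.org/videos/clip.mp4",
    },
    # A label drawn from a shared vocabulary of manipulation types.
    "mediaAuthenticityCategory": "EditedOrCroppedContent",
    # A plain-language note on the original, unaltered context.
    "originalMediaContextDescription": "The full video shows the question was answered moments later.",
}

print(json.dumps(media_review, indent=2))
```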

We hope these fact-checks will provide the tech companies with valuable new signals about misinformation. We recognize that the platforms are independent from the journalists doing the fact-checking, and it is entirely up to the companies whether, and how, they use those signals. Still, we’re encouraged by the interest of the tech companies in this important journalism. By communicating clearly with them in consistent ways, independent fact-checkers can play an important role in informing people around the world.

Who created MediaReview?

The idea for a taxonomy to describe media manipulation was first proposed at our 2019 Tech & Check conference by Phoebe Connelly and Nadine Ajaka of the Washington Post. Their work eventually became The Fact Checker’s Guide to Manipulated Video, which heavily inspired the first MediaReview proposal.

The development of MediaReview has been an open process. A core group of representatives from the Reporters’ Lab, the tech companies, and the Washington Post led the development, issuing open calls for feedback throughout the process. We’ve worked closely with the International Fact-Checking Network to ensure that fact-checkers operating around the world have been able to provide feedback. 

You can still access the first terminology proposal and the first structured data proposal, as well as comments offered on those documents.

What is the current status of MediaReview?

MediaReview is currently in pending status on Schema.org, the organization that oversees the tagging vocabulary publishers use, which means the schema is still under development. 

The Duke Reporters’ Lab is testing the current version of MediaReview with several key fact-checkers in the United States: FactCheck.org, PolitiFact and The Washington Post.

You can see screenshots of our current MediaReview form, including working labels and definitions here: Claim Only, Video, Image.

We’re also sharing test MediaReview data as it’s entered by fact-checkers. You can access a spreadsheet of fact-checks tagged with MediaReview here.

How can I offer feedback?

Through our testing with fact-checkers and with an ever-expanding group of misinformation experts, we’ve identified a number of outstanding issues that we’re soliciting feedback on. Please comment on the linked Google Doc with your thoughts and suggestions.

We’re also proposing new Media Types and Ratings to address some of the outstanding issues, and we’re seeking feedback on those as well.


We want your feedback on the MediaReview tagging system

The new tagging system will allow fact-checkers to alert tech platforms about false videos and fake images.

By Bill Adair – June 9, 2020

Last fall, we launched an ambitious effort to develop a new tagging system for fact-checks of fake videos and images. The idea was to take the same approach that fact-checkers use when they check claims by politicians and political groups – a tagging system called ClaimReview – and build something of a sequel. We called it MediaReview.

For the past nine months, Joel Luther, Erica Ryan and I have been talking with fact-checkers, representatives of the tech companies and other leaders in the battle against misinformation. Our ever-expanding group has come up with a great proposal and would love your feedback.

Like ClaimReview, MediaReview is a schema – a tagging system that web publishers can use to identify different kinds of content. By tagging their articles, publishers are essentially telling the world, “This is a fact-check of this particular claim by this politician.” That can be a valuable signal to tech companies, which can decide if they want to add labels to the original content, demote its standing in a feed, or do nothing. It’s up to them.

(Note: Google and Facebook have supported the work of The Reporters’ Lab and have given us grants to develop MediaReview.)

ClaimReview, which we developed with Google and Schema.org five years ago, has been a great success. It is used by more than half of the world’s fact-checkers and has been used to tag more than 50,000 articles. Those articles get highlighted in Google News and in search results on Google and YouTube.

We’re hopeful that MediaReview will be equally successful. By responding quickly to fake videos and bogus images, fact-checkers can provide the tech platforms with vital information about false content that might be going viral. The platforms can then decide if they want to take action.

The details are critical. We’ve based MediaReview on a taxonomy developed by the Washington Post. We’re still discussing the names of the labels, so feel free to make suggestions about them – or anything else.

You can get a deeper understanding of MediaReview in this article in NiemanLab.

You can see screenshots of our current MediaReview form, including working labels and definitions here: Claim Only, Video, Image.

You can see our distillation of the current issues and add your comments here.
