Fact-Checking News
Pop-up fact-checking moves online: Lessons from our user experience testing

After it became clear pop-up fact-checking was too difficult to display on a TV, we've moved to the web.
By Jessica Mahone - June 11, 2020
We initially wanted to build pop-up fact-checking for a TV screen. But for nearly a year, people have told us in surveys and in coffee shops that they like live fact-checking but they need more information than they can get on a TV.
The testing is a key part of our development of Squash, our groundbreaking live fact-checking product. We started by interviewing a handful of users of our FactStream app. We wanted to know how they found out about the app, how they find fact checks about things they hear on TV, and what they would need to trust live fact-checking. As we saw in our “Red Couch Experiments” in 2018, they were excited about the concept but they wanted more than a TV screen allowed.
We supplemented those interviews with conversations in coffee shops – “guerrilla research” in user experience (UX) terms. And again, the people we spoke with were excited about the concept but wanted more information than a 1740×90 pixel display could accommodate.
The most common request was the ability to access the full published fact-check. Some wanted to know if more than one fact-checker had vetted the claim, and if so, did they all reach the same conclusion? Some just wanted to be able to pause the video.
Since those things weren’t possible with a conventional TV display, we pivoted and began to imagine what live fact-checking would look like on the web.
Bringing Pop-Up Fact-Checking to the Web
In an online whiteboard session, our Duke Tech & Check Cooperative team discussed many possibilities for bringing live fact-checking online. Our UX team (students Javan Jiang and Dora Pekec, along with me) then designed a new interface for live fact-checking and tested it in a series of simple, open-ended preference surveys.
In total, 100 people responded to these surveys. That input came on top of the eight interviews described above and a large experiment with 1,500 participants we ran late last year on whether users want ratings in on-screen displays (they do).
A common theme emerged in the new research: Make live fact-checking as non-disruptive to the viewing experience as possible. More specifically, we found three things that users want and need from the live fact-checking experience.
- Users prefer a fact-checking display beneath the video. In our initial survey, users could choose whether they preferred a display beside or beneath the video. About three-quarters of respondents said that a display beneath the video was less disruptive to their viewing, with several telling us that this placement was similar to existing video platforms such as YouTube.
- Users need “persistent onboarding” to make use of the content they get from live fact-checking. A user guide or FAQ is not enough. Squash can’t yet provide real-time fact-checking; it is a system that matches claims made during a televised event to claims that have already been checked (a minimal sketch of that matching step appears after this list). So users need to be reminded that they are seeing a “related fact-check,” not necessarily a perfect match to the claim they just heard. “Persistent onboarding” means providing users with subtle reminders in the display. For example, when a user hovers over the label “Related Fact Check,” a small box could explain that this is not a real-time fact check but an already published fact check about a similar claim made in the past. This was one of the features users liked most because it kept them from having to find the information themselves.
- Users prefer to have all available information on the initial screen. Our first test allowed users to expand the display to see more information about the fact check, such as the publisher and an explanation of which statement triggered the system to display it. But users said that having to toggle the display to see this information was disruptive.
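Because Squash works by retrieval rather than real-time verdicts, the core step is pairing a transcribed claim with the closest previously published fact-check. The article does not describe Squash’s internals, so the sketch below is purely illustrative: a minimal Python matcher using TF-IDF cosine similarity, with invented claims, URLs and a hypothetical similarity threshold.

```python
# Illustrative only: this assumes a simple TF-IDF / cosine-similarity matcher to show
# the general idea of pairing a spoken claim with previously published fact-checks.
# The claims, URLs, function name and threshold are invented, not Squash's actual code.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A toy store of previously published fact-checks.
PUBLISHED_FACT_CHECKS = [
    {"claim": "The unemployment rate is the lowest in 50 years",
     "url": "https://example.org/fact-checks/unemployment-rate"},
    {"claim": "Crime has doubled over the last decade",
     "url": "https://example.org/fact-checks/crime-rate"},
]

vectorizer = TfidfVectorizer(stop_words="english")
claim_matrix = vectorizer.fit_transform([fc["claim"] for fc in PUBLISHED_FACT_CHECKS])

def find_related_fact_check(spoken_claim: str, threshold: float = 0.35):
    """Return the closest published fact-check, or None if nothing is similar enough."""
    scores = cosine_similarity(vectorizer.transform([spoken_claim]), claim_matrix)[0]
    best = scores.argmax()
    return PUBLISHED_FACT_CHECKS[best] if scores[best] >= threshold else None

match = find_related_fact_check("He claimed unemployment is at a 50-year low")
if match:
    # The display would label this a "Related Fact Check," not a real-time verdict.
    print("Related Fact Check:", match["url"])
```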

More to Learn
Though we’ve learned a lot, some big questions remain. We still don’t know what live fact-checking looks like under less-than-ideal conditions. For example, how would users react to a fact check when the spoken claim is true but the relevant fact check is about a claim that was false?
And we need to figure out timing, particularly for multi-speaker events such as debates. When is the right time to display a fact-check after a politician has spoken? And what if the screen is now showing another politician?
And how can we appeal to audiences that are skeptical of fact-checking? One respondent specifically said he’d want to be able to turn off the display because “none of the fact-checkers are credible.” What strategies or content would help make such audiences more receptive to live fact-checking?
As we wrestle with those questions, moving live fact-checking to the web still opens up new possibilities, such as the ability to pause content (we call that “DVR mode”), read fact-checks, and return to the event. We are hopeful this shift in platform will ultimately bring automated fact-checking to larger audiences.
What is MediaReview?

FAQs on the new schema we're helping to develop for fact-checks of images and videos.
By Joel Luther - June 11, 2020
MediaReview is a schema – a tagging system that web publishers can use to identify different kinds of content. It was built specifically for fact-checkers to identify manipulated images and videos, and we think of it as a sibling to ClaimReview, the schema developed by the Reporters’ Lab that allows fact-checkers to identify their articles for search engines and social media platforms.
By tagging their articles with MediaReview, publishers are essentially telling the world, “This is a fact-check of an image or video that may have been manipulated.” The goal is twofold: to allow fact-checkers to provide information to the tech platforms that a piece of media has been manipulated, and to establish a common vocabulary to describe types of media manipulation.
We hope these fact-checks will provide the tech companies with valuable new signals about misinformation. We recognize that the platforms are independent from the journalists doing the fact-checking, and it is entirely up to them whether, and how, they use the signals. Still, we’re encouraged by the tech companies’ interest in this important journalism. By communicating clearly with them in consistent ways, independent fact-checkers can play an important role in informing people around the world.
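For publishers who want to see what that tagging might look like in practice, here is a rough sketch of a MediaReview record serialized as JSON-LD. Because MediaReview is still under development, the property names below (such as mediaAuthenticityCategory and originalMediaContextDescription) follow the draft proposals loosely and should be read as assumptions rather than the final vocabulary; the publisher, URLs and label are invented.

```python
# A hypothetical MediaReview record, assembled in Python and serialized as JSON-LD.
# The schema is still "pending," so these property names are assumptions drawn from
# the draft proposals, not a final specification; all values are invented.
import json

media_review = {
    "@context": "https://schema.org",
    "@type": "MediaReview",
    "url": "https://example.org/fact-checks/slowed-speech-video",
    "datePublished": "2020-06-11",
    "author": {"@type": "Organization", "name": "Example Fact-Checkers"},
    # One of the working manipulation labels, e.g. "missing context" or "transformed."
    "mediaAuthenticityCategory": "TransformedContent",
    "originalMediaContextDescription": "Clip was slowed down to make the speaker sound impaired",
    "itemReviewed": {
        "@type": "VideoObject",
        "contentUrl": "https://example.com/viral-clip.mp4",
    },
}

# A publisher would embed this in the article as a <script type="application/ld+json"> block.
print(json.dumps(media_review, indent=2))
```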
Who created MediaReview?
The idea for a taxonomy to describe media manipulation was first proposed at our 2019 Tech & Check conference by Phoebe Connelly and Nadine Ajaka of the Washington Post. Their work eventually became The Fact Checker’s Guide to Manipulated Video, which heavily inspired the first MediaReview proposal.
The development of MediaReview has been an open process. A core group of representatives from the Reporters’ Lab, the tech companies and the Washington Post led the work, issuing open calls for feedback throughout. We’ve worked closely with the International Fact-Checking Network to ensure that fact-checkers operating around the world have been able to provide feedback.
You can still access the first terminology proposal and the first structured data proposal, as well as comments offered on those documents.
What is the current status of MediaReview?
MediaReview is currently in “pending” status on Schema.org, the organization that oversees the tagging vocabulary publishers use, which means the schema is still under development.
The Duke Reporters’ Lab is testing the current version of MediaReview with several key fact-checkers in the United States: FactCheck.org, PolitiFact and The Washington Post.
You can see screenshots of our current MediaReview form, including working labels and definitions here: Claim Only, Video, Image.
We’re also sharing test MediaReview data as it’s entered by fact-checkers. You can access a spreadsheet of fact-checks tagged with MediaReview here.
How can I offer feedback?
Through our testing with fact-checkers and with an ever-expanding group of misinformation experts, we’ve identified a number of outstanding issues that we’re soliciting feedback on. Please comment on the linked Google Doc with your thoughts and suggestions.
We’re also proposing new Media Types and Ratings to address some of the outstanding issues, and we’re seeking feedback on those as well.
We want your feedback on the MediaReview tagging system

The new tagging system will allow fact-checkers to alert tech platforms about false videos and fake images.
By Bill Adair - June 9, 2020
Last fall, we launched an ambitious effort to develop a new tagging system for fact-checks of fake videos and images. The idea was to take the same approach that fact-checkers use when they check claims by politicians and political groups, a system called ClaimReview, and build something of a sequel. We called it MediaReview.
For the past nine months, Joel Luther, Erica Ryan and I have been talking with fact-checkers, representatives of the tech companies and other leaders in the battle against misinformation. Our ever-expanding group has come up with a great proposal and would love your feedback.
Like ClaimReview, MediaReview is a schema – a tagging system that web publishers can use to identify different kinds of content. By tagging their articles, the publishers are essentially telling the world, “This is a fact-check on this politician on this particular claim.” That can be a valuable signal to tech companies, which can decide whether to add labels to the original content, demote its standing in a feed, or do nothing. It’s up to them.
(Note: Google and Facebook have supported the work of The Reporters’ Lab and have given us grants to develop MediaReview.)
ClaimReview, which we developed with Google and Schema.org five years ago, has been a great success. It is used by more than half of the world’s fact-checkers and has been used to tag more than 50,000 articles. Those articles get highlighted in Google News and in search results on Google and YouTube.
We’re hopeful that MediaReview will be equally successful. By responding quickly to fake videos and bogus images, fact-checkers can provide the tech platforms with vital information about false content that might be going viral. The platforms can then decide if they want to take action.
The details are critical. We’ve based MediaReview on a taxonomy developed by the Washington Post. We’re still discussing the names of the labels, so feel free to make suggestions about the labels – or anything.
You can get a deeper understanding of MediaReview in this article in NiemanLab.
You can see screenshots of our current MediaReview form, including working labels and definitions here: Claim Only, Video, Image.
You can see our distillation of the current issues and add your comments here.
Update: 237 fact-checkers in nearly 80 countries … and counting

So far the Reporters' Lab list is up 26% over last year's annual tally.
By Mark Stencel and Joel Luther - April 3, 2020
Fact-checking has expanded to 78 countries, where the Duke Reporters’ Lab counts at least 237 organizations that actively verify the statements of public figures, track political promises and combat misinformation.
So far, that’s a 26% increase in the 10 months since the Reporters’ Lab published its 2019 fact-checking census. That was on the eve of last summer’s annual Global Fact summit in South Africa, when our international database and map included 188 active fact-checkers in more than 60 countries.
We know that’s an undercount because we’re still counting. But here’s where we stand by continent:
Africa: 17
Asia: 53
Australia: 4
Europe: 68
North America: 69
South America: 26
About 20 fact-checkers listed in the database launched since last summer’s census. One of the newest launched just last week: FACTA, a spinoff of longtime Italian fact-checker Pagella Politica that will focus broadly on online hoaxes and disinformation.
The Lab’s next annual census will be published this summer, when the International Fact-Checking Network hosts an online version of Global Fact. On Wednesday, the network postponed the in-person summit in Norway, scheduled for June, because of the coronavirus pandemic.
Several factors are driving the growth of fact-checking.
One is the increasing spread of misinformation on large digital media platforms, some of which are turning to fact-checkers for help — directly and indirectly. That includes a Facebook partnership that enlists participating “third-party” fact-checkers to help respond to some categories of misleading information flagged by its users. Another example is ClaimReview, an open-source tagging system the Reporters’ Lab helped develop that makes it easier for Google and other platforms to spotlight relevant fact-checks and contradict falsehoods. The Reporters’ Lab is developing a related new tagging system, MediaReview, that will help flag manufactured and misleading uses of images, including videos and photos. (Disclosure: Facebook and Google are among the funders of the Lab, which develops and deploys technology to help fact-checkers. The Lab collaborated with Schema.org and Google to establish the ClaimReview framework and encourage its adoption.)
Another factor in the growth of fact-checking is the increasing role of collaboration. That includes fact-checking partnerships that involve competing news outlets and media groups that have banded together to share fact-checks or jointly cover political claims, especially during elections. It also includes growing collaboration within large media companies. Examples of those internal partnerships range from Agence France-Presse, the French news service that has established regional fact-checking sites with dedicated reporters in dozens of its bureaus around the world, to U.S.-based TEGNA, whose local TV stations produce and share “Verify” fact-checking segments across more than four dozen outlets.
Sharing content and processes is a positive thing — though it means it’s more difficult for our Lab to keep count. These multi-outlet fact-checking collaborations make it complicated for us to determine who exactly produces what, or to keep track of the individual outlets where readers, viewers and listeners can find this work. We’ll be clarifying our selection process to address that.
We’ll have more to say about the trends and trajectory of fact-checking in our annual census when the Global Fact summit convenes online. Working with a student researcher, Reporters’ Lab director Bill Adair began tallying fact-checking projects ahead of the first Global Fact summit in 2014. That gathering of about 50 people in London ultimately led, a year later, to the formation of the International Fact-Checking Network, which is based at the Poynter Institute, a media studies and training center in St. Petersburg, Florida.
The IFCN summit itself has become a measure of fact-checking’s growth. Before IFCN decided to turn this year’s in-person conference into an online event, more than 400 people had confirmed their participation. That would have been about eight times the size of the original London meeting in 2014.
IFCN director Baybars Örsek told fact-checkers Wednesday that the virtual summit will be scheduled in the coming weeks. Watch for our annual fact-checking census then.
Reporters’ Lab developing MediaReview, a new tool to combat fake videos and images

Standardizing how fact-checkers tag false videos and images should help search engines and social media companies identify misinformation more quickly.
By Catherine Clabby - January 27, 2020
Misleading, maliciously edited and other fake videos are on the rise around the world.
To help, the Duke Reporters’ Lab is leading a drive to create MediaReview, a new tagging system that will enable fact-checkers to be more consistent when they debunk false videos and images. It should help search engines and social media companies identify fakes more quickly and take prompt action to slow or stop them.
MediaReview is a schema similar to ClaimReview, a tagging system developed by the Reporters’ Lab, Google and Jigsaw that enables fact-checkers to better identify their articles for search engines and social media. MediaReview is based on a video-labeling vocabulary that Washington Post journalists created to describe misleading videos.

The new tagging system will allow fact-checkers to quickly label fake and manipulated videos and images with standardized tags such as “missing context,” “transformed,” “edited,” etc.
Bill Adair and Joel Luther are leading this project at the Reporters’ Lab. You can read about their work in a recent NiemanLab article and in their writing describing the project and why it’s needed:
MediaReview: Translating the video and visual fact-check terminology to Schema.org structured data
MediaReview case study: Gosar and Biden
Here’s more detail on the Washington Post’s pioneering system for labeling misinformation-bearing videos:
Introducing The Fact Checker’s guide to manipulated video
Last thing. If you don’t know much about the importance of ClaimReview, this should catch you up:
Lab launches global effort to expand ClaimReview.
U.S. fact-checkers gear up for 2020 campaign

Of the 226 fact-checking projects in the latest Reporters’ Lab global count, 50 are in the U.S. -- and most are locally focused.
By Mark Stencel and Joel Luther - November 25, 2019
With the U.S. election now less than a year away, at least four dozen American fact-checking projects plan to keep tabs on claims by candidates and their supporters – and a majority of those fact-checkers won’t be focused on the presidential campaign.
The 50 active U.S. fact-checking projects are included in the latest Reporters’ Lab tally of global fact-checking, which now shows 226 sites in 73 countries. More details about the global growth below.
Of the 50 U.S. projects, about a third (16) are nationally focused. That includes independent fact-checkers such as FactCheck.org, PolitiFact and Snopes, as well as major news media efforts, including the Associated Press, The Washington Post, CNN and The New York Times. There also are a handful of fact-checkers that are less politically focused. They concentrate on global misinformation or specific topic areas, from science to gossip.
At least 31 others are state and local fact-checkers spread across 20 states. Of those 31, 11 are PolitiFact’s state-level media partners. A new addition to that group is WRAL-TV in North Carolina — a commercial TV station that took over the PolitiFact franchise in its state from The News & Observer, a McClatchy-owned newspaper based in Raleigh. Beyond North Carolina, PolitiFact has active local affiliates in California, Florida, Illinois, Missouri, New York, Texas, Vermont, Virginia, West Virginia and Wisconsin.
The News & Observer has not abandoned fact-checking. It launched a new statewide initiative of its own — this time without PolitiFact’s trademarked Truth-O-Meter or a similar rating system for the statements it checks. “We’ll provide a highly informed assessment about the relative truth of the claims, rather than a static rating or ranking,” The N&O’s editors said in an article announcing its new project.
Among the 20 U.S. state and local fact-checkers that are not PolitiFact partners, at least 13 use some kind of rating system.
Of all the state and local fact-checkers, 11 are affiliated with TV stations — like WRAL, which had its own fact-checking service before it joined forces with PolitiFact this month. Another 11 are affiliated with newspapers or magazines. Five are local digital media startups and two are public radio stations. There are also a handful of projects based in academic journalism programs.
One example of a local digital startup is Mississippi Today, a non-profit state news service that launched a fact-checking page for last year’s election. It is among the projects we have added to our database over the past month.
We should note that some of these fact-checkers hibernate between election cycles. Seasonal fact-checkers with long track records over multiple election cycles remain on our active list, and some have done this kind of reporting for years. For instance, WISC-TV in Madison, Wisconsin, has been fact-checking since 2004 — three years before PolitiFact, The Washington Post and AP got into the business.
One of the hardest fact-checking efforts for us to quantify is run by corporate media giant TEGNA Inc., which operates nearly 50 stations across the country. Its “Verify” segments began as a pilot project at WFAA-TV in the Dallas area in 2016. Now each station produces its own versions for its local TV and online audience. The topics are usually suggested by viewers, with local reporters often fact-checking political statements or debunking local hoaxes and rumors.
A reporter at WCNC-TV in Charlotte, North Carolina, also produces national segments that are distributed for use by any of the company’s other stations. We’ve added TEGNA’s “Verify” to our database as a single entry, but we may also add individual stations as we determine which ones do the kind of fact-checking we are trying to count. (Here’s how we decide which fact-checkers to include.)
A Global Movement
As for the global picture, the Reporters’ Lab is now up to 226 active fact-checking projects around the world — up from 210 in October, when our count went over 200 for the first time. That is more than five times the number we first counted in 2014. It’s also more than double a retroactive count for that same year, a number that was based on the actual start dates of all the fact-checking projects we’ve added to the database over the past five years (see footnote to our most recent annual census for details).
The growth of Agence France-Presse’s work as part of Facebook’s third-party fact-checking partnership is a big factor. After adding a slew of AFP bureaus with dedicated fact-checkers to our database last month, we added many more — including Argentina, Brazil, Colombia, Mexico, Poland, Lebanon, Singapore, Spain, Thailand and Uruguay. We now count 22 individual AFP bureaus, all started since 2018.
Other recent additions to the database involved several established fact-checkers, including PesaCheck, which launched in Kenya in 2016. Since then it’s added bureaus in Tanzania in 2017 and Uganda in 2018 — both of which are now in our database. We added Da Begad, a volunteer effort based in Egypt that has focused on social media hoaxes and misinformation since 2013. And there’s a relative newcomer too: Re:Check, a Latvian project that’s affiliated with a non-profit investigative center called Re:Baltica. It launched over the summer.
Peru’s OjoBiónico is back on our active list. It resumed fact-checking last year after a two-year hiatus. OjoBiónico is a section of OjoPúblico, a digital news service that focuses on investigative reporting.
We already have other fact-checkers we plan to add to our database over the coming weeks. If there’s a fact-checker you know about that we need to update or add to our map, please contact Joel Luther at the Reporters’ Lab.
Reporters’ Lab fact-checking tally tops 200

With AFP's expansion and new election-focused projects, our ongoing global survey now includes 210 active fact-checkers.
By Mark Stencel and Joel Luther - October 21, 2019
The Reporters’ Lab added 21 fact-checkers to our database of reporting projects that regularly debunk political misinformation and viral hoaxes, pushing our global count over 200.
The database now lists 210 active fact-checkers in 68 countries. That nearly quintupled the number the Reporters’ Lab first counted in 2014. It also more than doubled a retroactive count for that same year – a number that was based on the actual start dates of all the fact-checking projects we’ve added to the database over the past five years (see footnote to our most recent annual census).
The rapid expansion of Agence France-Presse’s fact-checking in its news bureaus since 2018 was a big factor in reaching this milestone — including AFP’s dedicated editors in Hong Kong who coordinate fact-checkers there and across Asia. AFP attributes the growth to the support it receives from Facebook’s third-party fact-checking program. In addition to the Hong Kong bureau, our database now lists AFP fact-checkers in Australia, Canada, India, Indonesia, Kenya, Malaysia, Nigeria, Pakistan, Philippines, South Africa and Sri Lanka. At least seven of those bureaus began fact-checking in 2019. [Update: We missed a few other AFP bureaus that do fact-checking, which we’ll be adding in our November update.]
The database now lists several other recent additions that also launched in 2019, mainly to focus on upcoming elections. Bolivia Verifica launched in June, four months before this past weekend’s vote, which may be headed for a December runoff. Reverso in Argentina also launched in June, followed by Verificado Uruguay in July. The general elections in those two countries are this coming Sunday.
Other 2019 launches include Namibia FactCheck, GhanaFact and, in the United States, local TV station KCRG-TV in Cedar Rapids, Iowa. KCRG is a bit of a special case, since it’s hardly a newbie. The TV station was previously owned by a local newspaper, The Cedar Rapids Gazette. Even after the sale, the two newsrooms collaborated on fact-checking for several years, through last year’s U.S. midterm elections. But now they have gone their separate ways. Starting in March, the investigative reporting team at KCRG began doing its own fact-checking segments.
At least six other fact-checkers that launched in 2019 were already in our database before this month’s update, several of which were intentionally short-term projects that focused on specific elections. We’re checking on the status of those now. At least one, Global Edmonton’s Alberta Election Fact Check, is already on our inactive list. For that reason, we expect our count might not grow much more before the end of 2019 and might even drop slightly.
In addition to the projects that began in 2019, we also added three established fact-checkers to our database that were already in operation before this year: Local TV station KRIS-TV in Corpus Christi, Texas, has been on the fact-checking beat since 2017. The journalists who do fact-checking for Syria-focused Verify-Sy have worked from locations in Turkey, Europe and within that war-torn country since 2016. And Belgium’s Knack magazine has provided a fact-checking feature to its readers since 2012.
We weren’t sure we would cross the 200 fact-checkers milestone in October, since we also moved seven dormant projects to our separate count of inactive fact-checkers this month. Our count in September was 195 before we made this month’s updates.
If there’s a fact-checker you know about that we need to update or add to our database, please contact Joel Luther at the Reporters’ Lab. (Here’s how we decide which fact-checkers to include.)
From Toronto to New Delhi, fact-checkers find reinforcements

New additions to the Reporters' Lab fact-checking database push global count to 195.
By Mark Stencel and Joel Luther - September 16, 2019
The Duke Reporters’ Lab is adding seven fact-checkers from three continents to our global database. That puts our ongoing count of reporting projects that regularly debunk political misinformation and viral hoaxes close to 200.
With this month’s additions, the Lab’s database now counts 195 projects in 62 countries, including every project the International Fact-Checking Network has verified as signatories of its code of principles.
One new addition uses a name that’s inspired many others in the fact-checking community: the polygraph machine, also known as the lie detector. DELFI’s Melo Detektorius (“Lie Detector”) launched last November. It’s the fact-checker for the Lithuanian outlet of a commercial media company that operates digital news channels in the Baltic states and across Eastern Europe.
Many others have used variations of the name before, including the Colombian news site La Silla Vacía’s Detector de Mentiras and the Danish Broadcasting Corporation’s weekly political fact-checking TV program Detektor. There are versions of the polygraph too, such as Polígrafo in Portugal and El Poligrafo, a fact-checker for the print edition of the Chilean newspaper El Mercurio. At least three inactive entries in our database used similar names.
The fact-checkers at Spondeo Media in Mexico City avoided the wording, but apparently liked the idea. Instead, they deploy a cartoon polygraph machine with emoji-like facial expressions to rate the accuracy of statements.
Two news sites associated with the TV Today Network in New Delhi and its corporate parent India Today are also recent additions to our database. In addition to the work that appears on India Today Fact Check, the company’s fact-checkers produce reports for the Hindi-language news channel Aaj Tak and the Bangla-language news and opinion portal DailyO. When claims circulate in multiple languages, fact-checks are translated and published across platforms.
“Broadly, the guiding principle for deciding the language of our fact-check story is the language in which the claim was made,” explained Balkrishna, who leads the Fact Check Team at the India Today Group. “If the claim is Hindi, we would write the fact check story in Hindi first. If the same claim appears in more than one language, we translate the stories and publish it on the respective websites.”
While it’s relatively common for fact-checkers in some countries to present their work in multiple languages on one site, it’s less common for one media company to produce fact-checks for multiple outlets in multiple languages.
As we approach a Canadian national election slated for Oct. 21, we are adding two fact-checkers from that part of the world. One is Décrypteurs from CBC/Radio-Canada in Montréal. It launched in May to focus on digital misinformation, particularly significant claims and posts that are flagged by its audience. But the format is not entirely new to the network, where reporter Jeff Yates had produced occasional fact-checks under the label “inspecteur viral.”
The Walrus magazine in Toronto is also focusing on digital misinformation on its fact-checking site, which launched in October 2018.
We have added two other well-established fact-checkers that have a similar focus. The first is the Thai News Agency’s Sure and Share Center in Bangkok. The Thai News Agency is the journalism arm of the Mass Communication Organization of Thailand, a publicly traded state enterprise that was founded in 1952 and privatized in 2004.
The other is Fatabyyano, an independent fact-checker based in Amman, Jordan. It covers a wide range of misinformation and hoaxes throughout the Arab world, including nearly two dozen countries in the Middle East and North and East Africa. Applied Science Private University and the Zedni Education Network are among its supporters.
We learned from an article by former Reporters’ Lab student researcher Daniela Flamini that Fatabyyano’s name is a reference to a holy command from the Quran meaning “to investigate.” She wrote about that site and other fact-checking projects in the Arab world for the Poynter Institute’s International Fact-Checking Network.
Several of the sites Flamini mentioned are among a list of others we plan to add to our database when we post another of these updates in October.
Reporters’ Lab Launches Global Effort to Expand the Use of ClaimReview

At Global Fact 6 in Cape Town, the Lab launched an effort to help standardize the tagging of fact-checks.
By Joel Luther - July 17, 2019
The Duke Reporters’ Lab has launched a global effort to expand the use of ClaimReview, a standardized method of identifying fact-check articles for search engines and apps.
Funded by a grant from the Google News Initiative, The ClaimReview Project provides training and instructional materials about the use of ClaimReview for fact-checkers around the world.

ClaimReview was developed through a partnership of the Reporters’ Lab, Google, Jigsaw, and Schema.org. It provides a standard way for publishers of fact-checks to identify the claim being checked, the person or entity that made the claim, and the conclusion of the article. This standardization enables search engines and other platforms to highlight fact-checks, and can power automated products such as the FactStream and Squash apps being developed in the Reporters’ Lab.
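As a concrete illustration of those three pieces of information, here is what a ClaimReview record can look like when serialized as JSON-LD. The Python below simply assembles and prints the markup; the property names are standard Schema.org ClaimReview fields, while the publisher, claim and rating values are invented for the example.

```python
# An illustrative ClaimReview record built in Python and serialized as JSON-LD.
# Field names are standard Schema.org ClaimReview properties; the publisher,
# claim and rating values are invented for this example.
import json

claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example.org/fact-checks/jobs-claim",
    "datePublished": "2019-07-17",
    "author": {"@type": "Organization", "name": "Example Fact-Checkers"},
    # The claim being checked ...
    "claimReviewed": "The state added 500,000 jobs last year",
    # ... the person or entity that made it ...
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Jane Politician"},
        "datePublished": "2019-07-01",
    },
    # ... and the article's conclusion, expressed as a rating.
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 2,
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "Mostly False",
    },
}

# Publishers can embed this as a <script type="application/ld+json"> block, or let
# Google's Fact Check Markup Tool generate the markup from a web form instead.
print(json.dumps(claim_review, indent=2))
```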
“ClaimReview is the secret sauce of the future,” said Bill Adair, director of the Duke Reporters’ Lab. “It enables us to build apps and automate fact-checking in new and powerful ways.”
Slightly less than half of the 188 organizations included in our fact-checking database use ClaimReview.

At the Global Fact 6 conference in Cape Town, the Lab led two sessions designed to recruit and train new users. During a featured talk titled The Future of ClaimReview, the Lab introduced Google’s Fact Check Markup Tool, which makes it easier for journalists to create ClaimReview. They no longer have to embed code in their articles and can instead create ClaimReview by submitting a simple web form.
In an Intro to ClaimReview workshop later in the day, the Lab provided step-by-step assistance to fact-checkers using the tool for the first time.
The Lab also launched a website with a user guide and best practices, and will continue to work to expand the number of publishers using the tool.
A broken promise about a tattoo and the need to fact-check everyone

"When we put together the IFCN code of principles three years ago, we said that fact-checkers 'do not concentrate their fact-checking on any one side.'"
By Bill Adair - June 19, 2019
My opening remarks from Global Fact 6, Cape Town, South Africa, on June 19, 2019.
It’s wonderful to be here and see so many familiar faces. It’s particularly cool to see our new team from the IFCN, not just Baybars and Cris, but also Daniela Flamini, one of our journalism students from Duke who graduated last month and is now working for the IFCN.
And it warms my heart to see my old friend Stephen Buckley here. When Stephen was dean of the faculty at Poynter, the two of us organized the first Global Fact meeting in London in 2014. That wasn’t easy. We had difficulty raising enough money. But Stephen was determined to make it happen, so he found some money from a few different accounts at Poynter. Global Fact – and our important journalistic movement – would not have happened if it weren’t for him.
I’m impressed by this turnout – more than 250 attendees this year! I confess that when I saw the headline on Daniela’s story last week that said this was “the largest fact-checking event in history”… I wanted a fact-check. But I did one, and as PolitiFact would say, I rate that statement True!
I want to start today with a quick reminder of the importance of holding people accountable for what they say — in this case…me.
You will recall that last year at Global Fact, I promised that I would get a tattoo. And after some discussion, I decided it would be a tattoo of my beloved Truth-O-Meter. But a year went by and a funny thing happened: I decided I didn’t want a tattoo.
Now, as fact-checkers, we all know the importance of holding people accountable for what they say. We did that at PolitiFact with the Obameter and other campaign promise meters. PolitiFact has a special meter for a broken promise that usually features the politician with a big frown. We have fun choosing that photo, which has the person looking really miserable.
So I’ve created one to rate myself on the tattoo promise: The Bill-O-Meter. Promise broken!
My message today to open Global Fact is also about accountability. It’s about the need to make sure we fact-check all players in our political discourse.
Julianna Rennie and I recently wrote a piece for Poynter that looked at a new trend in the United States we call “embedded fact-checking.” It’s the growing practice of reporters including fact-checks in their news articles, when they drop in a paragraph or two that exposes a falsehood. For example, they may write that someone “falsely claimed that vaccines cause autism.”
We were glad to find a growing trend of embedded fact-checking in news and analysis articles in the New York Times, the Washington Post, and the AP over the past four years. But we also found the subject was nearly always the same: Donald Trump. It was wonderful to see the trend, but it was lopsided.
Trump is a prime target for fact-checking because his volume of falsehoods is unprecedented in American history — and probably in world history, too. Journalists rightly should question everything he says. And you may have similar figures in your own countries who deserve similar scrutiny.
But we shouldn’t focus so much on Trump that we neglect other politicians and other parties. That’s true not just in the United States but everywhere. Indeed, when we put together the IFCN code of principles three years ago, we said that fact-checkers “do not concentrate their fact-checking on any one side.”
In the United States and around the world, we need to make sure that we check all the important players in the political discourse, whether it is for news stories or our fact-checking sites.
So my message for you today is a simple one: check everybody. Hold everyone accountable.
Even me.