Fact-checking keeps on growing.

Update: 237 fact-checkers in nearly 80 countries … and counting

So far, the Reporters' Lab list is up 26% over last year's annual tally.

By Mark Stencel & Joel Luther – April 3, 2020

Fact-checking has expanded to 78 countries, where the Duke Reporters’ Lab counts at least 237 organizations that actively verify the statements of public figures, track political promises and combat misinformation.

So far, that’s a 26% increase in the 10 months since the Reporters’ Lab published its 2019 fact-checking census. That census came on the eve of last summer’s annual Global Fact summit in South Africa, when our international database and map included 188 active fact-checkers in more than 60 countries.

We know that’s an undercount because we’re still counting. But here’s where we stand by continent:

Africa: 17
Asia: 53
Australia: 4
Europe: 68
North America: 69
South America: 26

About 20 fact-checkers listed in the database launched since last summer’s census. One of the newest launched just last week: FACTA, a spinoff of longtime Italian fact-checker Pagella Politica that will focus broadly on online hoaxes and disinformation.

The Lab’s next annual census will be published this summer, when the International Fact-Checking Network hosts an online version of Global Fact. On Wednesday, the network postponed the in-person summit in Norway, scheduled for June, because of the coronavirus pandemic.

Several factors are driving the growth of fact-checking. 

One is the increasing spread of misinformation on large digital media platforms, some of which are turning to fact-checkers for help — directly and indirectly. That includes a Facebook partnership that enlists participating “third-party” fact-checkers to help respond to some categories of misleading information flagged by its users. Another example is ClaimReview, an open-source tagging system the Reporters’ Lab helped develop that makes it easier for Google and other platforms to spotlight relevant fact-checks and contradict falsehoods. The Reporters’ Lab is developing a related new tagging system, MediaReview, that will help flag manufactured and misleading use of images, including video and photos. (Disclosure: Facebook and Google are among the funders of the Lab, which develops and deploys technology to help fact-checkers. The Lab collaborated with Schema.org and Google to establish the ClaimReview framework and encourage its adoption.)
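ClaimReview works by having fact-checkers publish a small structured record alongside each fact-check, which platforms can then read automatically. The sketch below builds such a record in Python; the claim, names and URL are invented for illustration, and in practice the record is usually embedded in the article page as schema.org JSON-LD.

```python
import json

# A minimal schema.org ClaimReview record, roughly as a fact-checker might
# embed it in a page. Every name, claim and URL below is a made-up example.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example-factchecker.org/checks/12345",  # hypothetical
    "datePublished": "2020-02-04",
    "claimReviewed": "The new budget triples education spending.",
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Example Politician"},
    },
    "author": {"@type": "Organization", "name": "Example Fact-Checker"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": "1",
        "bestRating": "5",
        "alternateName": "False",  # the human-readable verdict
    },
}

print(json.dumps(claim_review, indent=2))
```

Because the fields are standardized, a search engine that finds this blob knows which statement was checked, who checked it and what the verdict was, without parsing the article's prose.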

Another factor in the growth of fact-checking is the increasing role of collaboration. That includes fact-checking partnerships that involve competing news outlets and media groups that have banded together to share fact-checks or jointly cover political claims, especially during elections. It also includes growing collaboration within large media companies. Examples of those internal partnerships range from Agence France-Presse, the French news service that has established regional fact-checking sites with dedicated reporters in dozens of its bureaus around the world, to U.S.-based TEGNA, whose local TV stations produce and share “Verify” fact-checking segments across more than four dozen outlets.

Sharing content and processes is a positive thing — though it means it’s more difficult for our Lab to keep count. These multi-outlet fact-checking collaborations make it complicated for us to determine who exactly produces what, or to keep track of the individual outlets where readers, viewers and listeners can find this work. We’ll be clarifying our selection process to address that.

We’ll have more to say about the trends and trajectory of fact-checking in our annual census when the Global Fact summit convenes online. Working with a student researcher, Reporters’ Lab director Bill Adair began tallying fact-checking projects for the first Global Fact summit in 2014. That gathering of about 50 people in London ultimately led a year later to the formation of the International Fact-Checking Network, which is based at the Poynter Institute, a media studies and training center in St. Petersburg, Florida.

The IFCN summit itself has become a measure of fact-checking’s growth. Before IFCN decided to turn this year’s in-person conference into an online event, more than 400 people had confirmed their participation. That would have been about eight times the size of the original London meeting in 2014.

IFCN director Baybars Örsek told fact-checkers Wednesday that the virtual summit will be scheduled in the coming weeks. Watch for our annual fact-checking census then.


Squash report card: Improvements during State of the Union … and how humans will make our AI smarter

We've had some encouraging improvements in the AI powering our experimental fact-checking technology. But to make Squash smarter, we're calling in a human.

By Bill Adair – February 23, 2020

Squash, the experimental pop-up fact-checking product of the Reporters’ Lab, is getting better.

Our live test during the State of the Union address on Feb. 4 showed significant improvement over our inaugural test last year. Squash popped up 14 relevant fact-checks on the screen, up from just six last year.

That improvement matches a general trend we’ve seen in our testing. We’ve had a higher rate of relevant matches when we use Squash on videos of debates and speeches.

But we still have a long way to go. This month’s State of the Union speech also had 20 non-relevant matches, which means Squash displayed fact-checks that weren’t related to what the president said. If you’d been watching at that moment, you probably would have thought, “What is Squash thinking?”

We’re now going to try two ways to make Squash smarter: a new subject tagging system that will be based on a wonderfully addictive game developed by our lead technologist Chris Guess; and a new interface that will bring humans into the live decision-making. Squash will recommend fact-checks to display, but an editor will make the final judgment.

Some background in case you’re new to our project: Squash, part of the Lab’s Tech & Check Cooperative, is a revolutionary new product that displays fact-checks on a video screen during a debate or political speech. Squash “hears” what politicians say, converts their speech to text and then searches a database of previously published fact-checks for one that’s related. When Squash finds one, it displays a summary on the screen.

For our latest tests, we’ve been using Elasticsearch, a tool for building search engines that we’ve made smarter with two filters: ClaimBuster, an algorithm that identifies factual claims, and a large set of common synonyms. ClaimBuster helps Squash avoid wasting time and effort on sentences that aren’t factual claims, and the synonyms help it make better matches.
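The matching step can be pictured with a much-simplified sketch: expand the transcribed sentence with known synonyms, skip sentences that don't look like factual claims, and score stored fact-checks by word overlap. This is a toy stand-in for Elasticsearch and ClaimBuster with invented data, not the Lab's actual code.

```python
# Toy stand-in for Squash's matching step. The synonym table and the
# fact-check summaries below are invented examples.
SYNONYMS = {"jobs": {"employment"}, "economy": {"economic"}}
FACT_CHECKS = [
    "Employment grew by 500,000 last year, not 1 million.",
    "The wall is not already built along the entire border.",
]

def expand(words):
    """Add known synonyms to a set of query words."""
    out = set(words)
    for w in words:
        out |= SYNONYMS.get(w, set())
    return out

def looks_like_claim(sentence):
    """Crude gate: factual claims tend to contain numbers.
    (ClaimBuster is far more sophisticated than this.)"""
    return any(ch.isdigit() for ch in sentence)

def best_match(sentence):
    """Return the stored fact-check that best overlaps the sentence, if any."""
    if not looks_like_claim(sentence):
        return None  # don't waste a search on non-claims
    query = expand(set(sentence.lower().strip(".").split()))
    scored = [(len(query & set(fc.lower().split())), fc) for fc in FACT_CHECKS]
    score, fc = max(scored)
    return fc if score >= 2 else None  # require a minimal overlap

print(best_match("We created 1 million new jobs."))
```

The synonym expansion is what lets "jobs" in a speech find a fact-check written with "employment"; the claim gate is what keeps applause lines and greetings from triggering searches at all.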

Guess, assisted by project manager Erica Ryan and student developers Jack Proudfoot and Sanha Lim, will soon be testing a new way of matching that uses natural language processing based on the subject of the fact-check. We believe that we’ll get more relevant matches if the matching is based on subjects rather than just the words in the politicians’ claims.

But to make that possible, we have to put subject tags on thousands of fact-checks in our ClaimReview database. So Guess has created a game called Caucus that displays a fact-check on your phone and then asks you to assign subject tags to it. The game is oddly addictive. Every time you submit one, you want to do another…and another. Guess has a leaderboard so we can keep track of who is tagging the most fact-checks. We’re testing the game with our students and staff, but hope to make it public soon.

We’ve also decided that Squash needs a little human help. Guess, working with our student developer Matt O’Boyle, is building an interface for human editors to control which matches actually pop up on users’ screens.

The new interface would let them review the fact-check that Squash recommends and decide whether to let it pop up on the screen, which should help us filter out most of the unrelated matches.

That should eliminate the slightly embarrassing problem when Squash makes a match that is comically bad. (My favorite: one from last year’s State of the Union when Squash matched the president’s line about men walking on the moon with a fact-check on how long it takes to get a permit to build a road.)

Assuming the new interface works relatively well, we’ll try to do a public demo of Squash this summer. 

Slowly but steadily, we are making progress. Watch for more improvements soon.


Reporters’ Lab developing MediaReview, a new tool to combat fake videos and images

Standardizing how fact-checkers tag false videos and images should help search engines and social media companies identify misinformation more quickly.

By Catherine Clabby – January 27, 2020

Misleading, maliciously edited and other fake videos are on the rise around the world. 

To help, the Duke Reporters’ Lab is leading a drive to create MediaReview, a new tagging system that will enable fact-checkers to be more consistent when they debunk false videos and images. It should help search engines and social media companies identify fakes more quickly and take prompt action to slow or stop them.

MediaReview is a schema similar to ClaimReview, a tagging system developed by the Reporters’ Lab, Google and Jigsaw that enables fact-checkers to better identify their articles for search engines and social media. MediaReview is based on a video-labeling vocabulary that Washington Post journalists created to describe misleading videos. 

The spread of misinformation by video and images is growing worldwide. One recent example is a PragerU video about the devastating wildfires in Australia that overstates the role of arsonists and downplays links to climate change, according to FactCheck.org.

The new tagging system will allow fact-checkers to quickly label fake and manipulated videos and images with standardized tags such as “missing context,” “transformed,” “edited,” etc. 
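The value of a shared vocabulary is that labels become machine-checkable. As a sketch, a few lines of code can reject any label outside the agreed set before a record is published. The vocabulary below includes only the labels named in this article, and the property name is an assumption; the real MediaReview schema under development defines a fuller, more precise set.

```python
# Sketch of a controlled vocabulary for media ratings. Only labels mentioned
# in the article are listed; the actual MediaReview schema defines more.
MEDIA_LABELS = {"missing context", "transformed", "edited"}

def tag_media(url, label):
    """Return a tagged record, rejecting labels outside the shared vocabulary."""
    if label not in MEDIA_LABELS:
        raise ValueError(f"unknown label: {label!r}")
    return {
        "@type": "MediaReview",
        "mediaUrl": url,  # hypothetical field name for this sketch
        "mediaAuthenticityCategory": label,
    }

record = tag_media("https://example.org/clip.mp4", "missing context")
print(record["mediaAuthenticityCategory"])
```

With free-text labels, one fact-checker's "out of context" and another's "missing context" look like different verdicts to a platform; a fixed vocabulary makes them the same signal.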

Bill Adair and Joel Luther are leading this project at the Reporters’ Lab. You can read about their work in a recent Nieman Lab article and in their writing describing the project and why it’s needed:

MediaReview wants to turn the mishmash vocabulary around manipulated photos and video into something structured 

MediaReview: Translating the video and visual fact-check terminology to Schema.org structured data

MediaReview case study: Gosar and Biden

Here’s more detail on the Washington Post’s pioneering system for labeling misinformation-bearing videos:

Introducing The Fact Checker’s guide to manipulated video

Last thing. If you don’t know much about the importance of ClaimReview, this should catch you up:

Lab launches global effort to expand ClaimReview.


Fact-checking Database

U.S. fact-checkers gear up for 2020 campaign

Of the 226 fact-checking projects in the latest Reporters’ Lab global count, 50 are in the U.S. — and most are locally focused.

By Mark Stencel & Joel Luther – November 25, 2019

With the U.S. election now less than a year away, at least four dozen American fact-checking projects plan to keep tabs on claims by candidates and their supporters – and a majority of those fact-checkers won’t be focused on the presidential campaign.

The 50 active U.S. fact-checking projects are included in the latest Reporters’ Lab tally of global fact-checking, which now shows 226 sites in 73 countries. More details about the global growth below.

Of the 50 U.S. projects, about a third (16) are nationally focused. That includes independent fact-checkers such as FactCheck.org, PolitiFact and Snopes, as well as major news media efforts, including the Associated Press, The Washington Post, CNN and The New York Times. There also are a handful of fact-checkers that are less politically focused. They concentrate on global misinformation or specific topic areas, from science to gossip.

At least 31 others are state and locally minded fact-checkers spread across 20 states. Of that 31, 11 are PolitiFact’s state-level media partners. A new addition to that group is WRAL-TV in North Carolina — a commercial TV station that took over the PolitiFact franchise in its state from The News & Observer, a McClatchy-owned newspaper based in Raleigh. Beyond North Carolina, PolitiFact has active local affiliates in California, Florida, Illinois, Missouri, New York, Texas, Vermont, Virginia, West Virginia and Wisconsin.

The News & Observer has not abandoned fact-checking. It launched a new statewide initiative of its own — this time without PolitiFact’s trademarked Truth-O-Meter or a similar rating system for the statements it checks. “We’ll provide a highly informed assessment about the relative truth of the claims, rather than a static rating or ranking,” The N&O’s editors said in an article announcing its new project.

Among the 20 U.S. state and local fact-checkers that are not PolitiFact partners, at least 13 use some kind of rating system.

Of all the state and local fact-checkers, 11 are affiliated with TV stations — like WRAL, which had its own fact-checking service before it joined forces with PolitiFact this month. Another 11 are affiliated with newspapers or magazines. Five are local digital media startups and two are public radio stations. There are also a handful of projects based in academic journalism programs. 

One example of a local digital startup is Mississippi Today, a non-profit state news service that launched a fact-checking page for last year’s election. It is among the projects we have added to our database over the past month.

We should note that some of these fact-checkers hibernate between election cycles. Seasonal fact-checkers with long track records over multiple election cycles remain active in our database. For instance, WISC-TV in Madison, Wisconsin, has been fact-checking since 2004 — three years before PolitiFact, The Washington Post and AP got into the business.

One of the hardest fact-checking efforts for us to quantify is run by corporate media giant TEGNA Inc., which operates nearly 50 stations across the country. Its “Verify” segments began as a pilot project at WFAA-TV in the Dallas area in 2016. Now each station produces its own versions for its local TV and online audience. The topics are usually suggested by viewers, with local reporters often fact-checking political statements or debunking local hoaxes and rumors.

A reporter at WCNC-TV in Charlotte, North Carolina, also produces national segments that are distributed for use by any of the company’s other stations. We’ve added TEGNA’s “Verify” to our database as a single entry, but we may also add individual stations as we determine which ones do the kind of fact-checking we are trying to count. (Here’s how we decide which fact-checkers to include.)

A Global Movement

As for the global picture, the Reporters’ Lab is now up to 226 active fact-checking projects around the world — up from 210 in October, when our count went over 200 for the first time. That is more than five times the number we first counted in 2014. It’s also more than double a retroactive count for that same year — a number that was based on the actual start dates of all the fact-checking projects we’ve added to the database over the past five years (see footnote to our most recent annual census for details).

The growth of Agence France-Presse’s work as part of Facebook’s third-party fact-checking partnership is a big factor. After adding a slew of AFP bureaus with dedicated fact-checkers to our database last month, we added many more — including bureaus in Argentina, Brazil, Colombia, Mexico, Poland, Lebanon, Singapore, Spain, Thailand and Uruguay. We now count 22 individual AFP bureaus, all started since 2018.

Other recent additions to the database involved several established fact-checkers, including PesaCheck, which launched in Kenya in 2016. Since then it’s added bureaus in Tanzania in 2017 and Uganda in 2018 — both of which are now in our database. We added Da Begad, a volunteer effort based in Egypt that has focused on social media hoaxes and misinformation since 2013. And there’s a relative newcomer too: Re:Check, a Latvian project that’s affiliated with a non-profit investigative center called Re:Baltica. It launched over the summer. 

Peru’s OjoBiónico is back on our active list. It resumed fact-checking last year after a two-year hiatus. OjoBiónico is a section of OjoPúblico, a digital news service that focuses on investigative reporting.

We already have other fact-checkers we plan to add to our database over the coming weeks. If there’s a fact-checker you know about that we need to update or add to our map, please contact Joel Luther at the Reporters’ Lab.


Beyond the Red Couch: Bringing UX Testing to Squash

As automated fact-checking gains ground, it's time to learn how to make pop-up content crystal clear on video screens.

By Andrew Donohue – October 28, 2019

Fact-checkers have a problem.

They want to use technology to hold politicians accountable by getting fact-checks in front of the public as quickly as possible. But they don’t yet know the best ways to make their content understood. At the Duke Reporters’ Lab, that’s where Jessica Mahone comes in.

Jessica Mahone is designing tests to help Duke Reporters’ Lab researchers figure out how to clearly share fact-checks live during broadcasts. Photo by Andrew Donohue

The Lab is developing Squash, a tool built to bring live fact-checking of politicians to TV. Mahone, a social scientist, was brought on board to design experiments and conduct user experience (UX) tests for Squash. 

UX design is the discipline focused on making new products easy to use. A clear UX design means that a product is intuitive and new users get it without a steep learning curve. 

“If people can’t understand your product or find it hard to use, then you are doomed from the start. With Squash, this means that we want people to comprehend the information and be able to quickly determine whether a claim is true or not,” Mahone said.

For Squash, fact-check content that pops up on screens needs to be instantly understood since it will only be visible for a few seconds. So what’s the best way?

Bill Adair, the director of the Duke Tech & Check Cooperative, organized some preliminary testing last year that he dubbed the red couch experiments. The tests revealed more research was needed to understand the best way to inform viewers. 

“I originally thought that all it would take is a Truth-O-Meter popping up on screen,” Adair said. “Turns out it’s much more complicated than that.”

Sixteen people watched videos of Barack Obama and Donald Trump delivering State of the Union speeches while fact-checks of some of what they said appeared on the screen. Ratings were true, false or something in between. Blink, a company specializing in UX testing, found that participants loved the concept of real-time fact-checking and would welcome it on TV broadcasts. But the design of the pop-up fact-checks often confused them.

It’s not just the quality of content that counts. Viewers must understand what they see very quickly. Squash may one day share fact-checks during live events, including State of the Union addresses.

Some viewers didn’t understand the fact-check ratings such as true or false when they were displayed. Others assumed the presidents’ statements must be true if no fact-check was shown. That’s a problem because Squash doesn’t fact-check all claims in speeches. It displays previously published fact-checks for only the claims that match Squash’s finicky search algorithm.

The red couch experiments were “a very basic test of the concept,” Mahone said. “What they found mainly is that there was a need to do more diving in and digging into some questions about how people respond to this. Because it’s actually quite complex.”

Mahone has developed a new round of tests scheduled to begin this week. These tests will use Amazon Mechanical Turk, an online platform that relies on people who sign up to be paid research subjects.

“One thing that came out of the initial testing was that people don’t like to see a rating of a fact-check,” Mahone said. “I was a little skeptical of that. Most of the social science research says that people do prefer things like that because it makes it a lot easier for them to make decisions.”

In this next phase, Mahone will recruit about 500 subjects. A third will see a summary of a fact-check with a PolitiFact TRUE icon. Another third will see a summary with just the label TRUE. The rest will see just a summary text of a fact-check.

Each viewer will rank how interested they are in using an automated fact-checking tool after viewing the different displays. Mahone will compare the results.

After finding out if including ratings works, Mahone and three undergraduate students, Dora Pekec, Javan Jiang and Jia Dua, will look at the bigger picture of Squash’s user experience. They will use a company to find about 20 people to talk to, ideally individuals who consistently watch TV news and are familiar with fact-checking.

Participants will be asked what features they would want in real-time fact-checking.

“The whole idea is to ask people ‘Hey, if you had access to a tool that could tell you if what someone on TV is saying is true or false, what would you want to see in that tool?’ ” Mahone said. “We want to figure out what people want and need out of Squash.”

Figuring out how to make Squash intuitive is critical to its success, according to Chris Guess, the Lab’s lead technologist. Part of the challenge is that Squash is something new and viewers have no experience with similar products.

“These days, people do a lot more than just watch a debate. They’re cooking dinner, playing on their phone, watching over the kids,” Guess said. “We want people to be able to tune in, see what’s going on, check out the automated fact-checks and then be able to tune out without missing anything.”

Reporters’ Lab researchers hope to have Squash up and running for the homestretch of the 2020 presidential campaign. Adair, Knight Professor of the Practice of Journalism and Public Policy at Duke, has begun reaching out to television executives to gauge their interest in an automated fact-checking tool. 

“TV networks are interested, but they want to wait and see a product that is more developed,” Adair said.

 


AFP Hong Kong

Reporters’ Lab fact-checking tally tops 200

With AFP's expansion and new election-focused projects, our ongoing global survey now includes 210 active fact-checkers.

By Mark Stencel & Joel Luther – October 21, 2019

The Reporters’ Lab added 21 fact-checkers to our database of reporting projects that regularly debunk political misinformation and viral hoaxes, pushing our global count over 200.

The database now lists 210 active fact-checkers in 68 countries. That nearly quintupled the number the Reporters’ Lab first counted in 2014. It also more than doubled a retroactive count for that same year – a number that was based on the actual start dates of all the fact-checking projects we’ve added to the database over the past five years (see footnote to our most recent annual census).

The rapid expansion of Agence France-Presse’s fact-checking in its news bureaus since 2018 was a big factor in reaching this milestone — including AFP’s dedicated editors in Hong Kong who coordinate fact-checkers there and across Asia. AFP attributes the growth to the support it receives from Facebook’s third-party fact-checking program. In addition to the Hong Kong bureau, our database now lists AFP fact-checkers in Australia, Canada, India, Indonesia, Kenya, Malaysia, Nigeria, Pakistan, Philippines, South Africa and Sri Lanka. At least seven of those bureaus began fact-checking in 2019. [Update: We missed a few other AFP bureaus that do fact-checking, which we’ll be adding in our November update.]

The database now lists several other recent additions that also launched in 2019, mainly to focus on upcoming elections. Bolivia Verifica launched in June, four months before this past weekend’s vote, which may be headed for a December runoff. Reverso in Argentina also launched in June, followed by Verificado Uruguay in July. The general elections in those two countries are this coming Sunday.

Other 2019 launches include Namibia FactCheck, GhanaFact and, in the United States, local TV station KCRG-TV in Cedar Rapids, Iowa. KCRG is a bit of a special case, since it’s hardly a newbie. The TV station was previously owned by a local newspaper, The Cedar Rapids Gazette. Even after the sale, the two newsrooms collaborated on fact-checking for several years through last year’s U.S. midterm elections. But now they have gone separate ways. Starting in March, the investigative reporting team at KCRG began doing its own fact-checking segments.

At least six other fact-checkers that launched in 2019 were already in our database before this month’s update, several of which were intentionally short-term projects that focused on specific elections. We’re checking on the status of those now. At least one, Global Edmonton’s Alberta Election Fact Check, is already on our inactive list. For that reason, we expect our count might not grow much more before the end of 2019 and might even drop slightly.

In addition to the projects that began in 2019, we also added three established fact-checkers to our database that were already in operation before this year: Local TV station KRIS-TV in Corpus Christi, Texas, has been on the fact-checking beat since 2017. The journalists who do fact-checking for Syria-focused Verify-Sy have worked from locations in Turkey, Europe and within that war-torn country since 2016. And Belgium’s Knack magazine has provided a fact-checking feature to its readers since 2012.

We weren’t sure we would cross the 200 fact-checkers milestone in October, since we also moved seven dormant projects to our separate count of inactive fact-checkers this month. Our count in September was 195 before we made this month’s updates.

If there’s a fact-checker you know about that we need to update or add to our database, please contact Joel Luther at the Reporters’ Lab. (Here’s how we decide which fact-checkers to include.)


DELFI Melo Detektorius

From Toronto to New Delhi, fact-checkers find reinforcements

New additions to the Reporters' Lab fact-checking database push global count to 195.

By Mark Stencel & Joel Luther – September 16, 2019

The Duke Reporters’ Lab is adding seven fact-checkers from three continents to our global database. That puts our ongoing count of reporting projects that regularly debunk political misinformation and viral hoaxes close to 200.

With this month’s additions, the Lab’s database now counts 195 projects in 62 countries, including every project the International Fact-Checking Network has verified as signatories of its code of principles.

One new addition uses a name that’s inspired many others in the fact-checking community: the polygraph machine, also known as the lie detector. DELFI’s Melo Detektorius (“Lie Detector”) launched last November. It’s the fact-checker for the Lithuanian outlet of a commercial media company that operates digital news channels in the Baltic states and across Eastern Europe.

Many others have used variations of the name before, including the Colombian news site La Silla Vacía’s Detector de Mentiras and the Danish Broadcasting Corporation’s weekly political fact-checking TV program Detektor. There are versions of polygraph too, such as Polígrafo in Portugal and El Polígrafo, a fact-checker for the print edition of the Chilean newspaper El Mercurio. At least three inactive entries in our database used similar names.

The fact-checkers at Spondeo Media in Mexico City avoided the wording, but apparently liked the idea. Instead, they deploy a cartoon polygraph machine with emoji-like facial expressions to rate the accuracy of statements.

Two news sites associated with the TV Today Network in New Delhi and its corporate parent India Today are also recent additions to our database. In addition to the work that appears on India Today Fact Check, the company’s fact-checkers produce reports for the Hindi-language news channel Aaj Tak and the Bangla-language news and opinion portal DailyO. When claims circulate in multiple languages, fact-checks are translated and published across platforms.

“Broadly, the guiding principle for deciding the language of our fact-check story is the language in which the claim was made,” explained Balkrishna, who leads the Fact Check Team at the India Today Group. “If the claim is Hindi, we would write the fact check story in Hindi first. If the same claim appears in more than one language, we translate the stories and publish it on the respective websites.”

While it’s relatively common for fact-checkers in some countries to present their work in multiple languages on one site, it’s less common for one media company to produce fact-checks for multiple outlets in multiple languages.

As we approach a Canadian national election slated for Oct. 21, we are adding two fact-checkers from that part of the world. One is Décrypteurs from CBC/Radio-Canada in Montréal. It launched in May to focus on digital misinformation, particularly significant claims and posts that are flagged by its audience. But the format is not entirely new to the network, where reporter Jeff Yates had produced occasional fact-checks under the label “inspecteur viral.”

The Walrus magazine in Toronto is also focusing on digital misinformation on its fact-checking site, which launched in October 2018.

We have added two other well-established fact-checkers that have a similar focus. The first is the Thai News Agency’s Sure and Share Center in Bangkok. The Thai News Agency is the journalism arm of Mass Communication Organization of Thailand, a publicly traded state enterprise that was founded in 1952 and privatized in 2004.

The other is Fatabyyano, an independent fact-checker based in Amman, Jordan. It covers a wide range of misinformation and hoaxes throughout the Arab world, including nearly two dozen countries in the Middle East and North and East Africa. Applied Science Private University and the Zedni Education Network are among its supporters.

We learned that Fatabyyano’s name is a reference to a holy command from the Quran meaning “to investigate” from an article by former Reporters’ Lab student researcher Daniela Flamini. She wrote about that site and other fact-checking projects in the Arab world for the Poynter Institute’s International Fact-Checking Network.

Several of the sites Flamini mentioned are among a list of others we plan to add to our database when we post another of these updates in October.


Using artificial intelligence to expand fact-checking

Reporters' Lab projects are harnessing machine learning to assist fact-checking journalism

By Andrew Donohue – September 16, 2019

As news organizations adapt to the digital age, they’re turning to artificial intelligence to help human journalists produce the content consumers need. This is especially true in fact-checking.

Because politicians often repeat claims – even after they have been debunked – AI can help hold the politicians accountable by quickly finding relevant fact-checks. This technology can also search through vast amounts of content for fact-checkable claims, saving journalists time. 

“Fact-checking is uniquely suited to the use of AI,” said Bill Adair, director of the Duke Reporters’ Lab. 

The Reporters’ Lab uses AI to build tools like Squash, a system under development that fact-checks video of politicians as they speak. The goal is to display related fact-checks on viewers’ screens in a matter of seconds.

Squash listens to what politicians say and transcribes their words, making them searchable text. It then compares that text to previously published fact-checks to look for matches.
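The matching step can be illustrated with a minimal sketch. The Lab has not published Squash's actual matching code, so the scorer below is a hypothetical stand-in: it uses simple word overlap (Jaccard similarity) to pair a transcribed sentence with the closest previously published fact-check.

```python
def jaccard(a: str, b: str) -> float:
    """Word-overlap similarity between two sentences (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

def best_match(transcript_sentence: str, fact_checks: list[dict],
               threshold: float = 0.5):
    """Return the published fact-check whose claim text best matches
    the transcribed sentence, or None if nothing clears the threshold."""
    scored = [(jaccard(transcript_sentence, fc["claim"]), fc) for fc in fact_checks]
    score, fc = max(scored, key=lambda pair: pair[0])
    return fc if score >= threshold else None

# Hypothetical mini-archive of published fact-checks
archive = [
    {"claim": "crime is at an all time high", "rating": "False"},
    {"claim": "unemployment is at a fifty year low", "rating": "Mostly True"},
]
match = best_match("crime is at an all time high in our cities", archive)
```

This word-overlap approach also shows why matching is hard: a politician who rephrases a debunked claim in new words will slip past it, which is exactly the limitation the Lab describes below.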

“We’ve made some huge advancements in the past three years,” Adair said. “Squash has improved in accuracy since we demoed it at the State of the Union back in February.”

A screenshot of Squash, a fully automated fact-checking tool under development at The Reporters’ Lab.

This fall, the Squash team is refining its claim-matching technology. Its performance is inconsistent because people can make similar claims using different language.

Reporters’ Lab researchers hope to use more advanced machine learning techniques to help Squash become smarter at recognizing similar meaning even when the words don’t match. That will take time.

 “We’re dependent on technological processes improving,” Adair said. “Voice-to-text and matching algorithms are two big things we’re reliant on and those are continually improving, but still have a long way to go.” 

The Reporters’ Lab is also running user experience testing with Squash this fall to learn more about the most effective ways to display fact-checks on screens. Media researcher Jessica Mahone recently joined the lab to help develop a more effective user experience.   

Squash could be the first step toward a future where instant fact-checking is broadly available on broadcast TV, cable news and even web browsers, all thanks to the power of AI. Eventually, viewers of live political speeches and debates everywhere could benefit from Squash.

All of this is part of a larger movement within journalism to take advantage of AI’s possibilities. Outlets such as the Associated Press publish sports stories and earnings reports written entirely by computers. Xinhua, China’s state news agency, is experimenting with news broadcasts delivered by virtual anchors.

The Reporters’ Lab is one of the leading organizations in the world applying AI to fact-checking, along with outlets such as Full Fact in England and Chequeado in Argentina. The Lab’s Tech & Check Alerts, for instance, use AI to find and share checkable claims with fact-checking journalists around the country, so they do not have to spend time looking themselves. The Alerts have often surfaced claims that journalists went on to fact-check.

It works like this: bots developed by Duke student researchers scrape Twitter posts and CNN transcripts daily to start the hunt for checkable claims. That content is fed to the ClaimBuster algorithm, developed at the University of Texas at Arlington, which identifies potentially promising claims for fact-checkers.
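The pipeline's shape can be sketched in a few lines: collect sentences, score each for "check-worthiness," and keep the top candidates for human review. The scorer below is a toy stand-in for the real ClaimBuster model, which is trained on labeled political sentences; the heuristic and the threshold here are invented for illustration.

```python
import re

def toy_claim_score(sentence: str) -> float:
    """Hypothetical heuristic: sentences containing numbers or
    comparative words look more like checkable factual claims."""
    score = 0.0
    if re.search(r"\d", sentence):
        score += 0.5
    if re.search(r"\b(most|highest|lowest|more|less|percent)\b", sentence.lower()):
        score += 0.3
    return min(score, 1.0)

def checkable_claims(transcript: str, threshold: float = 0.4) -> list[str]:
    """Split a transcript into sentences and keep likely claims."""
    sentences = re.split(r"(?<=[.!?])\s+", transcript.strip())
    return [s for s in sentences if toy_claim_score(s) >= threshold]

transcript = ("Thank you all for coming. Wages grew 3 percent last year. "
              "We will keep fighting for you.")
claims = checkable_claims(transcript)
```

The real system's value is the same filtering step at scale: hours of transcripts go in, and a short list of scoreable claims comes out for journalists to review.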

“Reading transcripts and watching TV looking for factual claims takes humans hours, but ClaimBuster can do it in seconds,” Adair said.

The Reporters’ Lab just last week debuted a new alert, The Best of the Bot, intended to flag the best of what the bots dig up. 

“We needed Best of the Bot because our Alerts had become so successful in finding claims that fact-checkers didn’t even have time to read them,” Adair said. “I think of it as a back-to-the-future approach. We now need a human to read the great work of the bot.”


Reporters’ Lab Launches Global Effort to Expand the Use of ClaimReview

At Global Fact 6 in Cape Town, the Lab launched an effort to help standardize the tagging of fact-checks.

By Joel Luther – July 17, 2019

The Duke Reporters’ Lab has launched a global effort to expand the use of ClaimReview, a standardized method of identifying fact-check articles for search engines and apps.

Funded by a grant from the Google News Initiative, The ClaimReview Project provides training and instructional materials about the use of ClaimReview for fact-checkers around the world. 

Bill Adair at the Global Fact 6 conference

ClaimReview was developed through a partnership of the Reporters’ Lab, Google, Jigsaw, and Schema.org. It provides a standard way for publishers of fact-checks to identify the claim being checked, the person or entity that made the claim, and the conclusion of the article. This standardization enables search engines and other platforms to highlight fact-checks, and can power automated products such as the FactStream and Squash apps being developed in the Reporters’ Lab.
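Those three fields map directly onto the schema.org/ClaimReview vocabulary. Below is a minimal sketch of the structured markup a publisher might attach to a fact-check; the field names follow the public schema, but the organization, claimant, URL and verdict are invented for illustration.

```python
import json

# Minimal ClaimReview markup for a hypothetical fact-check.
# Property names follow the schema.org/ClaimReview vocabulary.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example-factchecker.org/checks/budget-claim",
    "author": {"@type": "Organization", "name": "Example Fact-Checker"},
    "datePublished": "2019-07-17",
    "claimReviewed": "The city budget doubled last year.",   # the claim being checked
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Jane Politician"},  # who made it
    },
    "reviewRating": {
        "@type": "Rating",
        "alternateName": "False",  # the article's textual verdict
    },
}

markup = json.dumps(claim_review, indent=2)
```

Because every publisher encodes the claim, the claimant, and the verdict the same way, a search engine or an app like FactStream can read thousands of such records without parsing each outlet's article format.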

“ClaimReview is the secret sauce of the future,” said Bill Adair, director of the Duke Reporters’ Lab. “It enables us to build apps and automate fact-checking in new and powerful ways.”

Slightly less than half of the 188 organizations included in our fact-checking database use ClaimReview.

Joel Luther at a Global Fact workshop

At the Global Fact 6 conference in Cape Town, the Lab led two sessions designed to recruit and train new users. During a featured talk titled “The Future of ClaimReview,” the Lab introduced Google’s Fact Check Markup Tool, which makes it easier for journalists to create ClaimReview. They no longer have to embed code in their articles and can instead create ClaimReview by submitting a simple web form.

In an Intro to ClaimReview workshop later in the day, the Lab provided step-by-step assistance to fact-checkers using the tool for the first time. 

The Lab also launched a website with a user guide and best practices, and will continue working to expand the number of publishers using the tool.

A broken promise about a tattoo and the need to fact-check everyone

"When we put together the IFCN code of principles three years ago, we said that fact-checkers 'do not concentrate their fact-checking on any one side.'"

By Bill Adair – June 19, 2019

My opening remarks from Global Fact 6, Cape Town, South Africa, on June 19, 2019.

It’s wonderful to be here and see so many familiar faces. It’s particularly cool to see our new team from the IFCN, not just Baybars and Cris, but also Daniela Flamini, one of our journalism students from Duke who graduated last month and is now working for the IFCN.

And it warms my heart to see my old friend Stephen Buckley here. When Stephen was dean of the faculty at Poynter, the two of us organized the first Global Fact meeting in London in 2014. That wasn’t easy. We had difficulty raising enough money. But Stephen was determined to make it happen, so he found some money from a few different accounts at Poynter.  Global Fact – and our important journalistic movement – would not have happened if it weren’t for him.

I’m impressed by this turnout – more than 250 attendees this year! I confess that when I saw the headline on Daniela’s story last week that said this was “the largest fact-checking event in history”… I wanted a fact-check. But I did one, and as PolitiFact would say, I rate that statement True!

I want to start today with a quick reminder of the importance of holding people accountable for what they say — in this case…me.

You will recall that last year at Global Fact, I promised that I would get a tattoo. And after some discussion, I decided it would be a tattoo of my beloved Truth-O-Meter. But a year went by and a funny thing happened: I decided I didn’t want a tattoo.

Now, as fact-checkers, we all know the importance of holding people accountable for what they say. We did that at PolitiFact with the Obameter and other campaign promise meters. PolitiFact has a special meter for a broken promise that usually features the politician with a big frown. We have fun choosing that photo, which has the person looking really miserable.

So I’ve created one to rate myself on the tattoo promise: The Bill-O-Meter. Promise broken!

My message today to open Global Fact is also about accountability. It’s about the need to make sure we fact-check all players in our political discourse.

Julianna Rennie and I recently wrote a piece for Poynter that looked at a new trend in the United States we call “embedded fact-checking.” It’s the growing practice of reporters including fact-checks in their news articles, when they drop in a paragraph or two that exposes a falsehood. For example, they may write that someone “falsely claimed that vaccines cause autism.”

We were glad to find a growing trend of embedded fact-checking in news and analysis articles in the New York Times, the Washington Post, and the AP over the past four years. But we also found the subject was nearly always the same: Donald Trump. It was wonderful to see the trend, but it was lopsided.

Trump is a prime target for fact-checking because his volume of falsehoods is unprecedented in American history — and probably in world history, too. Journalists rightly should question everything he says. And you may have similar figures in your own countries who deserve similar scrutiny.

But we shouldn’t focus so much on Trump that we neglect other politicians and other parties. That’s true not just in the United States but everywhere. Indeed, when we put together the IFCN code of principles three years ago, we said that fact-checkers “do not concentrate their fact-checking on any one side.”

In the United States and around the world, we need to make sure that we check all the important players in the political discourse, whether it is for news stories or our fact-checking sites.

So my message for you today is a simple one: check everybody. Hold everyone accountable.

Even me.
