
At Global Fact 4: churros, courage and the need to expose propagandists

The next challenge for the Global Fact community: calling out governments and political actors that pretend to be fact-checkers.

By Bill Adair – July 6, 2017

My opening remarks at Global Fact 4, the fourth annual meeting of the world’s fact-checkers, organized by the International Fact-Checking Network and the Reporters’ Lab, held July 5-7, 2017 in Madrid, Spain.

It’s wonderful to be here in Madrid. I’ve been enjoying the city the last two days, which has made me think of a giant warehouse store we have in the United States called Costco.

Costco is where you go when you want to buy 10 pounds of American cheese or a 6-pound tub of potato salad. Costco also makes a delicious fried pastry called a “churro.” And because everything in Costco is big, the churros are about three feet long.

When I got to Madrid I was really glad to see that you have churros here, too! It’s wonderful to see that Costco is spreading its great cuisine around the world!

I’m pleased to be here with my colleagues from the Duke Reporters’ Lab — Mark Stencel, Rebecca Iannucci and Riley Griffin. We also have our Share the Facts team here – Chris Guess and Erica Ryan. We’ll be sampling the churros throughout the week!

It’s been an amazing year for fact-checking. In the U.K., Full Fact and Channel 4 mobilized for Brexit and last month’s parliamentary elections. In France, the First Draft coalition showed the power of collaborations during the elections there. In the United States, the new president and his administration drove record traffic to sites such as FactCheck.org, PolitiFact and the Washington Post Fact Checker — and that has continued since the election, a time when sites typically have lower traffic. The impeachments and political scandals in Brazil and South Korea also meant big audiences for fact-checkers in those countries. And we expect the upcoming elections in Germany, Norway and elsewhere will generate many opportunities for fact-checkers in those countries as well, just as we’ve seen in Turkey and Iran. The popular demand for fact-checking has never been stronger.

Fact-checking is now so well known that it is part of pop culture. Comedians cite our work to give their jokes credibility. On Saturday Night Live last fall, Australian actress Margot Robbie “fact-checked” her opening monologue when she was the guest host.

Some news organizations not only have their own dedicated fact-checking teams, they’re also incorporating fact-checks in their news stories, calling out falsehoods at the moment they are uttered. This is a marvelous development because it helps to debunk falsehoods before they can take root.

We’ve also seen tremendous progress in automation to spread fact-checking to new audiences. There are promising projects underway at Full Fact in Britain, at the University of Texas at Arlington and in our own lab at Duke, among many others. We’ll be talking a lot about these projects this week.

Perhaps the most important development in the past year is one that we started at last year’s Global Fact conference in Buenos Aires – the Code of Principles. We came up with some excellent principles that set standards for transparency and non-partisan work. As Alexios noted, Facebook is using the code to determine which organizations qualify to debunk fake news. I hope your site will abide by the code and become a signatory.

At Duke, Mark just finished our annual summer count of fact-checking sites. Mark and Alexios like to tease me that I can’t stop repeating this mantra: “Fact-checking keeps growing.”

But it’s become my mantra because it’s true: When we held our first Global Fact meeting in 2014 in London, our Reporters’ Lab database listed 48 fact-checking sites around the world. Our latest count shows 126 active projects in 49 countries.

I’m thrilled to see fact-checking sprouting in countries such as South Korea and Germany and Brazil. And I continue to be amazed at the courage of our colleagues who check claims in Turkey and Iran, which are not very welcoming to our unique kind of journalism.

As our movement grows, we face new challenges. Now that our work is so well-known and an established form of journalism, governments and political actors are calling themselves fact-checkers, using our approach to produce propaganda. We need to speak out against this and make sure people know that government propagandists are not fact-checkers.

We also need to work harder to reach audiences that have been reluctant to accept our work. At Duke we published a study that showed a stark partisan divide in the United States. We found liberal publications loved fact-checking and often cited it; conservative sites criticized it and often belittled it. We need to focus on this problem and find new ways to reach reluctant audiences.

I’m confident we can accomplish these things. Individually and together we’ve overcome great hurdles in the past few years. I look forward to a productive meeting and a great year. And I’m confident:

Fact-checking will keep growing.


At Tech & Check, some new ideas to automate fact-checking

Journalists and technologists met at Duke to dream up new ways that technology can help fact-checkers.

By Bill Adair – April 4, 2016

Last week, journalists and technologists gathered at Duke to dream up new ways that automation could help fact-checking.

The first Tech & Check conference, sponsored by the Duke Reporters’ Lab and Poynter’s International Fact-Checking Network, brought together about 50 journalists, students and computer scientists. The goal was to showcase existing projects and inspire new ones.

At Tech & Check, groups of students, journalists and technologists dreamed up new ideas to automate fact-checking.

The participants included representatives of Google, IBM, NBC News, PolitiFact, Full Fact, FactCheck.org and WRAL-TV. From the academic side, we had faculty and Ph.D. students from Duke, the University of North Carolina, the University of Texas at Arlington, Indiana University and the University of Michigan.

The first day featured presentations about existing projects that automate some aspect of fact-checking; the second day, attendees formed groups to conceive new projects.

The presentations showcased a wide variety of tools and research projects. Will Moy of the British site Full Fact did a demo of his claim monitoring tool, which tracks the frequency of talking points and shows how often politicians repeat a phrase over time. Naeemul Hassan of the University of Texas at Arlington showed ClaimBuster, a project I’ve worked on, which can ingest huge amounts of text and identify factual claims that journalists might want to check.
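To make the claim-monitoring idea concrete: this is not Full Fact’s implementation, just a minimal sketch, with an invented phrase and invented dated transcripts, of how one might count how often a phrase shows up over time.

```python
from collections import Counter
from datetime import date

# Hypothetical corpus of (date, transcript text) pairs; a real monitoring tool
# would pull these from speeches, debates or parliamentary records.
transcripts = [
    (date(2016, 3, 1), "We will build a stronger economy for working families."),
    (date(2016, 3, 8), "A stronger economy means more jobs and higher wages."),
    (date(2016, 4, 15), "Our plan delivers a stronger economy."),
]

def phrase_frequency_by_month(transcripts, phrase):
    """Count how many transcripts in each (year, month) mention the phrase."""
    counts = Counter()
    for day, text in transcripts:
        if phrase.lower() in text.lower():
            counts[(day.year, day.month)] += 1
    return dict(counts)

print(phrase_frequency_by_month(transcripts, "stronger economy"))
# {(2016, 3): 2, (2016, 4): 1}
```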

IBM’s Ben Fletcher showed one of the company’s new projects known as Watson Angles, a tool that extracts information from Web articles and distills it into a summary that includes key players and a timeline of events. Giovanni Luca Ciampaglia, a researcher at Indiana University, showed a project that uses Wikipedia to fact-check claims.

On the second day, we focused on the future. The attendees broke into groups, and each group had 75 minutes to come up with three ideas for new tools or further research. The resulting ideas showed the many ways that automation can help fact-checking.

One promising idea was dubbed “Parrot Score,” a website that could build on the approach that Full Fact is exploring for claim monitoring. It would track the frequency of claims and then calculate a score for politicians who use canned phrases more often. Tyler Dukes, a data journalist from WRAL-TV in Raleigh, N.C., said Parrot Score could be a browser extension that showed the origin of a claim and then tracked it through the political ecosystem.
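Parrot Score was only a whiteboard idea, so there is no reference implementation. As a rough illustration, here is a hypothetical sketch that treats the score as the share of a politician’s statements that repeat one of a hard-coded set of canned phrases; the phrases and statements are invented.

```python
# Invented canned phrases standing in for whatever a real tool would track.
CANNED_PHRASES = {"hardworking families", "common-sense solutions"}

def parrot_score(statements):
    """Fraction of statements that contain at least one canned phrase."""
    if not statements:
        return 0.0
    repeats = sum(
        any(phrase in s.lower() for phrase in CANNED_PHRASES)
        for s in statements
    )
    return repeats / len(statements)

statements = [
    "We need common-sense solutions for hardworking families.",
    "My opponent voted against the budget.",
    "I will always fight for hardworking families.",
]
print(round(parrot_score(statements), 2))  # 0.67
```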

Despite the focus on the digital future of journalism, we used Sharpies and a lot of Post-It notes.

Two teams proposed variations of a “Check This First” button that would allow people to verify the accuracy of a URL before they post it on Facebook or in a chat. One team dubbed it “ChatBot.” Clicking it would bring up information that would help users determine if the article was reliable.
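The “Check This First” button was also just a concept. As a hedged sketch of the idea, the snippet below looks up a URL’s domain in a hard-coded table standing in for a shared database of fact-checks; the domains and notes are invented.

```python
from urllib.parse import urlparse

# Invented lookup table; a real version would query a shared fact-check database.
REVIEWED = {
    "example-hoax.com": "Multiple claims on this site have been rated false.",
    "example-news.org": "No fact-checks on record for this outlet.",
}

def check_this_first(url):
    """Return any known fact-checking notes for the URL's domain."""
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    return REVIEWED.get(domain, "No information available; proceed with care.")

print(check_this_first("https://www.example-hoax.com/miracle-cure"))
```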

Another team was assigned to focus on ways to improve public trust in fact-checkers. The team came up with several interesting ideas, including more transparency about the collective ratings for individual writers and editors as well as a game app that would simulate the process that journalists use to fact-check a claim. The app could improve trust by giving people an opportunity to form their own conclusions as well as demonstrating the difficult work that fact-checkers do.

Another team, which focused on tools for fact-checkers, came up with several interesting ideas. One would automatically detect when journalists were examining a claim they had checked before. Another would be something of a “sentence finisher” that, when a journalist began typing something such as “The unemployment rate last month…” would finish the sentence with the correct number.
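Neither tool existed beyond the whiteboard, but the first idea, spotting claims that have already been checked, can be sketched with something as simple as word-overlap similarity. The archive entries and threshold below are invented; a real tool would match against a fact-check database with a stronger text-matching model.

```python
def jaccard(a, b):
    """Word-overlap similarity between two strings (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb) if wa | wb else 0.0

# Invented archive of previously checked claims.
ARCHIVE = [
    "The unemployment rate last month was 4.9 percent.",
    "Crime has doubled in the last five years.",
]

def previously_checked(claim, threshold=0.5):
    """Return the most similar archived claim if it clears the threshold."""
    best = max(ARCHIVE, key=lambda c: jaccard(claim, c))
    return best if jaccard(claim, best) >= threshold else None

print(previously_checked("Last month the unemployment rate was 4.9 percent."))
```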

The conference left me quite optimistic about the potential for more collaboration between computer scientists and fact-checkers. Things that never seemed possible, such as checking claims against the massive Wikipedia database, are increasingly doable. And many technologists are interested in doing research and creating products to help fact-checking.


Study explores new questions about quality of global fact-checking

The University of Wisconsin study examined fact-checks from Africa, India, Mexico, the United States, Uruguay and the United Kingdom.

By Bill Adair – August 11, 2015

How long should fact-checks be? How should they attribute their sources — with links or a detailed list? Should they provide a thorough account of a fact-checker’s work or distill it into a short summary?

Those are just a few of the areas explored in a fascinating new study by Lucas Graves, a journalism professor at the University of Wisconsin. He presented a summary of his research last month at the 2015 Global Fact-Checking Summit in London.

Lucas Graves

The pilot project represents the first in-depth qualitative analysis of global fact-checking. It was funded by the Omidyar Network as part of its grant to the Poynter Institute to create a new fact-checking organization. The study, done in conjunction with the Reporters’ Lab, lays the groundwork for a more extensive analysis of additional sites in the future.

The findings reveal that fact-checking is still a new form of journalism with few established customs or practices. Some fact-checkers write long articles with lots of quotes to back up their work. Others distill their findings into short articles without any quotes. Graves did not take a position on which approach is best, but his research gives fact-checkers some valuable data to begin discussions about how to improve their journalism.

Graves and three research assistants examined 10 fact-checking articles from each of six sites chosen to reflect a wide range of global fact-checking: Africa Check, Full Fact in the United Kingdom, FactChecker.in in India, PolitiFact in the United States, El Sabueso in Mexico and UYCheck in Uruguay.


Graves and his researchers found a surprising range in the length of the fact-checking articles. UYCheck from Uruguay had the longest articles, with an average word count of 1,148, followed by Africa Check at 1,009 and PolitiFact at 983.

The shortest were from Full Fact, which averaged just 354 words. They reflected a very different approach by the British team. Rather than lay out the factual claims and back them up with extensive quotes the way most other sites do, the Full Fact approach is to distill them down to summaries.

Graves also found a wide range in the use of data visualization in the articles sampled for each site. For example, Africa Check had three data visualizations in its 10 articles, while the Indian site FactChecker.in had 11.

Graves found some sites used lots of data visualizations; others used relatively few.

The Latin American sites UYCheck and El Sabueso used the most infographics, while the other sites relied more on charts and tables.

Graves also found a wide range in the use of web links and quotes. Africa Check averaged the highest total of web links and quotes per story (18), followed by 12 for PolitiFact, while UYCheck and El Sabueso had the fewest (8 and 5, respectively). Full Fact had no quotes in the 10 articles Graves examined but used an average of 9 links per article.
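The study itself coded articles by hand; to make the arithmetic behind figures like these concrete, here is a minimal sketch with invented per-article records showing how per-site averages of words, links and quotes could be computed.

```python
from statistics import mean

# Invented records standing in for hand-coded article data.
articles = [
    {"site": "Africa Check", "words": 1020, "links": 10, "quotes": 7},
    {"site": "Africa Check", "words": 998,  "links": 12, "quotes": 6},
    {"site": "Full Fact",    "words": 340,  "links": 9,  "quotes": 0},
    {"site": "Full Fact",    "words": 368,  "links": 8,  "quotes": 0},
]

def site_averages(articles):
    """Average words, links and quotes per article, grouped by site."""
    sites = {}
    for a in articles:
        sites.setdefault(a["site"], []).append(a)
    return {
        site: {k: round(mean(a[k] for a in rows), 1) for k in ("words", "links", "quotes")}
        for site, rows in sites.items()
    }

for site, stats in site_averages(articles).items():
    print(site, stats)
```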

Graves and his researchers also examined how fact-checkers use links and quotes — whether they were used to provide political context about the claim being checked, to explain the subject being analyzed or to provide evidence about whether the claim was accurate. They found some sites, such as Africa Check and PolitiFact, used links more to provide context for the claim, while UYCheck and El Sabueso used them more for evidence in supporting a conclusion.

The analysis of quotes yielded some interesting results. PolitiFact used the most in the 10 articles — 38 quotes — with its largest share from evidentiary uses. Full Fact used the fewest (zero), followed by UYCheck (23) and El Sabueso (26).

The study also examined what Graves called “synthetic” sources — the different authoritative sources used to explain an issue and decide the accuracy of a claim. This part of the analysis distilled a final list of institutional sources for each fact-check, regardless of whether sources were directly quoted or linked to. Africa Check led the list with almost nine different authoritative sources considered on average, more than twice as many as FactChecker.in and UYCheck. Full Fact, UYCheck, and El Sabueso relied mainly on government agencies and data, while PolitiFact and Africa Check drew heavily on NGOs and academic experts in addition to official data.

The study raises some important questions for fact-checkers to discuss. Are we writing our fact-checks too long? Too short?

Are we using enough data visualizations to help readers? Should we take the time to create more infographics instead of simple charts and tables?

What do we need to do to give our fact-checks authority? Are links sufficient? Or should we also include quotes from experts?

Over the next few months, we’ll have plenty to discuss.


Voices from London: reflections on the Global Fact-Checking Summit

The fact-checkers of the world met at City University London to discuss the growth and challenges of their unique form of journalism.

By Bill Adair – July 28, 2015

One thing stood out at last week’s Global Fact-Checking Summit: the variety of the voices.

The conference, held at City University London, was in English, but the 60-plus participants had wonderful accents that showed the great diversity of fact-checking around the world: Irish, Russian, Spanish, Italian, German, Bosnian and Korean, among many others.

The second annual Global Fact-Checking Summit attracted more than 60 fact-checkers and academics to City University London.

Reflecting the growth of fact-checking, the group included representatives of new sites that have started in the past year or will be starting soon. The new fact-checkers included Enda and Orna Young from FactCheckNI in Northern Ireland; Dana Wagner and Jacob Schroeder of FactsCan in Canada; and Damakant Jayshi, who is starting a site in Nepal.

The most significant news from the conference, announced last Friday, was that Omidyar Network and the National Endowment for Democracy have provided funding to the Poynter Institute to become the home of international fact-checking. Poynter will organize future conferences like this one, create training programs and establish a website. The website will be welcomed by the fact-checkers who said they need a place to discuss common problems and share best practices.

We began the conference with a video montage that captured the wide range of fact-checking segments on TV.

I was especially impressed by the TV segments from El Objetivo, a program on La Sexta in Spain, and the program Virus on Rai, the public television network in Italy. (U.S. networks could learn some lessons from the creative Spanish and Italian networks, which spend more time on production and do better graphics than their U.S. counterparts do.)

Our keynote speaker was Adam Chodikoff, a senior producer at The Daily Show with Jon Stewart. One of Adam’s roles at the show is to be Stewart’s fact-checker, to ensure that even the best satire is grounded in fact.

“Chods,” as he is known at the show, played some funny clips and spiced them with comments about how he researches the segments. One of the clips was a Stewart interview with New York Mayor Bill de Blasio, when Stewart referred to a number that had been researched so well it was “Chods approved.”

Adam Chodikoff, a senior producer at The Daily Show with Jon Stewart, addresses the conference. (Photo Chods approved.)

Adam is not a journalist in the traditional sense, but he showed how serious he is about research and fact-checking by attending all of the sessions in the two-day conference.

The conference featured a wide range of presentations that showcased interesting work being done around the world: the commitment to research and development by Chequeado in Argentina; a new PolitiFact browser extension that will allow readers to request fact-checks of a phrase; and Pagella Politica’s efforts to earn revenue from the leftovers of its reporting.

One of the most popular sessions at the conference was the in-depth discussion about sustainability and revenue sources that Alexios Mantzarlis of Pagella Politica led on Friday. His interview with Ivana Cvetkovic Bajrovic of the National Endowment for Democracy provided great insights for fact-checkers seeking grants for their organizations. Laura Zommer from Chequeado and Mevan Babakar from Full Fact also provided some great tips on crowdfunding.

There were many other great sessions throughout the conference, and I think everybody agreed the two days went by too fast. But I came away with a common theme: As we build our community, we’ll get the best ideas from each other.

That brings me back to the voices. There were some great individual voices with some marvelous accents. But as a community, we’re getting louder.


Fact-Checking Census finds continued growth around the world

Our latest tally of fact-checking sites finds 30 new sites in places such as Turkey, Uruguay and South Korea.

By Bill Adair & Ishan Thakore – January 19, 2015

Fact-checking keeps growing around the world, with new sites in countries such as Turkey, Uruguay and South Korea.

The 2015 Fact-Checking Census from the Duke Reporters’ Lab found 89 fact-checking sites that have been active in the past few years and 64 that are active today. That’s up from a total of 59 sites, 44 of them active, when we did our last count in May 2014. (We include inactive sites in our total count because sites come and go with election cycles. Some news organizations and journalism NGOs only fact-check during election years.)

Many of the additional sites have started in the last seven months, including UYCheck in Uruguay and Dogruluk Payi in Turkey. Others are sites that we didn’t find when we did our first count.

You can see the complete list on the fact-checking page of the Reporters’ Lab website, where you can browse by continent and country.

As with our last tally, the largest concentrations of fact-checking are in Europe and North America. We found 38 sites in Europe (27 active), 30 in North America (22 active) and seven in South America (five active). There are two new sites in South Korea.

The Truth or False Poll in South Korea enlists readers to help with fact-checking.

The percentage of sites that use ratings continues to grow, up from about 70 percent in last year’s count to 80 percent today. Many rating systems use a true to false scale while others have devised more creative names. For example, ratings for the European site FactCheckEU include “Rather Daft” and “Insane Whopper.” Canada’s Baloney Meter rates statements from “No Baloney” to “Full of Baloney.”
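The census figures are straightforward tallies over the Lab’s database. As a hedged illustration with invented entries, here is how counts like the totals, the number of active sites and the share that use ratings could be computed.

```python
# Invented sample entries; the real database is maintained by the Reporters' Lab.
sites = [
    {"name": "Example Check A", "active": True,  "uses_ratings": True},
    {"name": "Example Check B", "active": True,  "uses_ratings": False},
    {"name": "Example Check C", "active": False, "uses_ratings": True},
]

total = len(sites)
active = sum(s["active"] for s in sites)
ratings_share = sum(s["uses_ratings"] for s in sites) / total

print(f"{total} total, {active} active, {ratings_share:.0%} use ratings")
# 3 total, 2 active, 67% use ratings
```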

We found that 56 of the 89 sites are affiliated with news organizations such as newspapers and television networks. The other 33 are sites dedicated to fact-checking, such as FactCheck.org in the United States and Full Fact in Great Britain.

Almost one-third of the sites (29 of the 89) track the campaign promises of elected officials. Some, such as the Rouhani Meter for Iran’s President Hassan Rouhani, only track campaign promises. Others, such as PolitiFact in the United States, do promise-tracking in addition to fact-checking.

For more information about the Reporters’ Lab database, contact Bill Adair at bill.adair@duke.edu.
