
At Tech & Check, some new ideas to automate fact-checking

Journalists and technologists met at Duke to dream up new ways that technology can help fact-checkers.

By Bill Adair – April 4, 2016

Last week, journalists and technologists gathered at Duke to dream up new ways that automation could help fact-checking.

The first Tech & Check conference, sponsored by the Duke Reporters’ Lab and Poynter’s International Fact-Checking Network, brought together about 50 journalists, students and computer scientists. The goal was to showcase existing projects and inspire new ones.

At Tech & Check, groups of students, journalists and technologists dreamed up new ideas to automate fact-checking.

The participants included representatives of Google, IBM, NBC News, PolitiFact, Full Fact, FactCheck.org and WRAL-TV. From the academic side, we had faculty and Ph.D. students from Duke, the University of North Carolina, the University of Texas at Arlington, Indiana University and the University of Michigan.

The first day featured presentations about existing projects that automate some aspect of fact-checking; the second day, attendees formed groups to conceive new projects.

The presentations showcased a wide variety of tools and research projects. Will Moy of the British site Full Fact did a demo of his claim monitoring tool, which tracks the frequency of talking points, showing how often politicians repeat a given phrase over time. Naeemul Hassan of the University of Texas at Arlington showed ClaimBuster, a project I’ve worked on, that can ingest huge amounts of text and identify factual claims that journalists might want to check.
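To give a flavor of the kind of automation ClaimBuster performs, here is a minimal sketch of claim detection. This is an illustrative heuristic only, not ClaimBuster’s actual model (which is trained on labeled sentences): it simply scores sentences higher when they contain numbers, percentages or comparative language, the kinds of signals that often mark a checkable factual claim.

```python
import re

# Words and patterns that often signal a checkable factual claim.
# This signal list is invented for illustration.
SIGNALS = re.compile(
    r"\b(\d[\d,.]*%?|percent|million|billion|more than|less than|increased|decreased)\b",
    re.IGNORECASE,
)

def check_worthiness(sentence: str) -> int:
    """Return a crude score: the number of factual-signal matches."""
    return len(SIGNALS.findall(sentence))

def rank_claims(text: str):
    """Split text into sentences and rank them by score, highest first."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    scored = [(check_worthiness(s), s) for s in sentences]
    return sorted(scored, key=lambda pair: pair[0], reverse=True)

speech = ("Unemployment fell to 5 percent last month. "
          "I love this great country. "
          "We created 2 million jobs, more than any other state.")
for score, sentence in rank_claims(speech):
    print(score, sentence)
```

A real system would rank thousands of sentences from transcripts this way, surfacing the highest-scoring ones for journalists to review.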

IBM’s Ben Fletcher showed one of the company’s new projects known as Watson Angles, a tool that extracts information from Web articles and distills it into a summary that includes key players and a timeline of events. Giovanni Luca Ciampaglia, a researcher at Indiana University, showed a project that uses Wikipedia to fact-check claims.

On the second day, we focused on the future. The attendees broke into groups to come up with new ideas for research. The groups had 75 minutes to create three ideas for tools or further research. The projects showed the many ways that automation can help fact-checking.

One promising idea was dubbed “Parrot Score,” a website that could build on the approach that Full Fact is exploring for claim monitoring. It would track the frequency of claims and then calculate a score for politicians who use canned phrases more often. Tyler Dukes, a data journalist from WRAL-TV in Raleigh, N.C., said Parrot Score could be a browser extension that showed the origin of a claim and then tracked it through the political ecosystem.
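The core of the Parrot Score idea can be sketched in a few lines: track each politician’s statements and measure how often they repeat a phrase they have used before. The speakers, phrases and scoring formula below are invented for illustration; an actual tool would need fuzzy phrase matching and a real corpus of statements.

```python
from collections import Counter

def parrot_score(statements: list[tuple[str, str]]) -> dict[str, float]:
    """statements is a list of (speaker, phrase) pairs.

    Returns, per speaker, the share of their statements that repeat a
    phrase they have used before (0.0 = never repeats; values near 1.0
    mean mostly canned lines).
    """
    seen: dict[str, Counter] = {}
    repeats: Counter = Counter()
    totals: Counter = Counter()
    for speaker, phrase in statements:
        key = phrase.lower().strip()
        counts = seen.setdefault(speaker, Counter())
        if counts[key] > 0:
            repeats[speaker] += 1   # this speaker has said this before
        counts[key] += 1
        totals[speaker] += 1
    return {s: repeats[s] / totals[s] for s in totals}

statements = [
    ("Sen. Smith", "We will bring back jobs"),
    ("Sen. Smith", "We will bring back jobs"),
    ("Sen. Smith", "Taxes are too high"),
    ("Rep. Jones", "Healthcare costs are rising"),
]
print(parrot_score(statements))  # Smith repeats 1 of 3 statements
```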

Despite the focus on the digital future of journalism, we used Sharpies and a lot of Post-It notes.

Two teams proposed variations of a “Check This First” button that would allow people to verify the accuracy of a URL before they post it on Facebook or in a chat. One team dubbed it “ChatBot.” Clicking it would bring up information that would help users determine if the article was reliable.

Another team was assigned to focus on ways to improve public trust in fact-checkers. The team came up with several interesting ideas, including more transparency about the collective ratings for individual writers and editors as well as a game app that would simulate the process that journalists use to fact-check a claim. The app could improve trust by giving people an opportunity to form their own conclusions as well as demonstrating the difficult work that fact-checkers do.

Another team focused on tools for fact-checkers. One idea would automatically detect when journalists were examining a claim they had checked before. Another was something of a “sentence finisher” that, when a journalist began typing something such as “The unemployment rate last month…” would complete the sentence with the correct number.
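The first of those ideas, detecting whether a claim has been checked before, can be sketched with simple word-overlap (Jaccard) similarity against an archive of past fact-checks. A production tool would need smarter matching (stemming, paraphrase detection, embeddings); the archive entries and threshold below are invented examples.

```python
def jaccard(a: str, b: str) -> float:
    """Word-overlap similarity between two claims (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    if not wa or not wb:
        return 0.0
    return len(wa & wb) / len(wa | wb)

def find_previous_check(new_claim: str, archive: list[str],
                        threshold: float = 0.5):
    """Return the best-matching archived claim, or None if nothing is close."""
    best = max(archive, key=lambda old: jaccard(new_claim, old), default=None)
    if best is not None and jaccard(new_claim, best) >= threshold:
        return best
    return None

archive = [
    "The unemployment rate last month was 5 percent",
    "Crime has doubled in the last decade",
]
print(find_previous_check(
    "The unemployment rate last month was 5 percent", archive))
```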

The conference left me quite optimistic about the potential for more collaboration between computer scientists and fact-checkers. Things that never seemed possible, such as checking claims against the massive Wikipedia database, are increasingly doable. And many technologists are interested in doing research and creating products to help fact-checking.


Study explores new questions about quality of global fact-checking

The University of Wisconsin study examined fact-checks from Africa, India, Mexico, the United States, Uruguay and the United Kingdom.

By Bill Adair – August 11, 2015

How long should fact-checks be? How should they attribute their sources — with links or a detailed list? Should they provide a thorough account of a fact-checker’s work or distill it into a short summary?

Those are just a few of the areas explored in a fascinating new study by Lucas Graves, a journalism professor at the University of Wisconsin. He presented a summary of his research last month at the 2015 Global Fact-Checking Summit in London.

Lucas Graves

The pilot project represents the first in-depth qualitative analysis of global fact-checking. It was funded by the Omidyar Network as part of its grant to the Poynter Institute to create a new fact-checking organization. The study, done in conjunction with the Reporters’ Lab, lays the groundwork for a more extensive analysis of additional sites in the future.

The findings reveal that fact-checking is still a new form of journalism with few established customs or practices. Some fact-checkers write long articles with lots of quotes to back up their work. Others distill their findings into short articles without any quotes. Graves did not take a position on which approach is best, but his research gives fact-checkers some valuable data to begin discussions about how to improve their journalism.

Graves and three research assistants examined 10 fact-checking articles from each of six different sites: Africa Check, Full Fact in the United Kingdom, FactChecker.in in India, PolitiFact in the United States, El Sabueso in Mexico and UYCheck in Uruguay. The sites were chosen to reflect a wide range of global fact-checking, as this table shows:

(Table comparing the six fact-checking sites in the study.)

Graves and his researchers found a surprising range in the length of the fact-checking articles. UYCheck from Uruguay had the longest articles, with an average word count of 1,148, followed by Africa Check at 1,009 and PolitiFact at 983.

The shortest were from Full Fact, which averaged just 354 words. They reflected a very different approach by the British team. Rather than lay out the factual claims and back them up with extensive quotes the way most other sites do, the Full Fact approach is to distill them down to summaries.

Graves also found a wide range of data visualization in the articles sampled for each site. For example, Africa Check had three data visualizations in its 10 articles, while there were 11 in the Indian site FactChecker.in.

Graves found some sites used lots of data visualizations; others used relatively few.

The Latin American sites UYCheck and El Sabueso used the most infographics, while the other sites relied more on charts and tables.

Graves also found a wide range in the use of web links and quotes. Africa Check averaged the highest total of web links and quotes per story (18), followed by 12 for PolitiFact, while UYCheck and El Sabueso had the fewest (8 and 5, respectively). Full Fact had no quotes in the 10 articles Graves examined but used an average of 9 links per article.

Graves and his researchers also examined how fact-checkers use links and quotes — whether they were used to provide political context about the claim being checked, to explain the subject being analyzed or to provide evidence about whether the claim was accurate. They found some sites, such as Africa Check and PolitiFact, used links more to provide context for the claim, while UYCheck and El Sabueso used them more for evidence in supporting a conclusion.

The analysis of quotes yielded some interesting results. PolitiFact used the most in the 10 articles — 38 quotes — with its largest share from evidentiary uses. Full Fact used the fewest (zero), followed by UYCheck (23) and El Sabueso (26).

The study also examined what Graves called “synthetic” sources — the different authoritative sources used to explain an issue and decide the accuracy of a claim. This part of the analysis distilled a final list of institutional sources for each fact-check, regardless of whether sources were directly quoted or linked to. Africa Check led the list with almost nine different authoritative sources considered on average, more than twice as many as FactChecker.in and UYCheck. Full Fact, UYCheck and El Sabueso relied mainly on government agencies and data, while PolitiFact and Africa Check drew heavily on NGOs and academic experts in addition to official data.

The study raises some important questions for fact-checkers to discuss. Are we writing our fact-checks too long? Too short?

Are we using enough data visualizations to help readers? Should we take the time to create more infographics instead of simple charts and tables?

What do we need to do to give our fact-checks authority? Are links sufficient? Or should we also include quotes from experts?

Over the next few months, we’ll have plenty to discuss.


Voices from London: reflections on the Global Fact-Checking Summit

The fact-checkers of the world met at City University London to discuss the growth and challenges of their unique form of journalism.

By Bill Adair – July 28, 2015

One thing stood out at last week’s Global Fact-Checking Summit: the variety of the voices.

The conference, held at City University London, was in English, but the 60-plus participants had wonderful accents that showed the great diversity of fact-checking around the world: Irish, Russian, Spanish, Italian, German, Bosnian and Korean, among many others.

The second annual Global Fact-Checking Summit attracted more than 60 fact-checkers and academics to City University London.

Reflecting the growth of fact-checking, the group included representatives of new sites that have started in the past year or will be starting soon. The new fact-checkers included Enda and Orna Young from FactCheckNI in Northern Ireland; Dana Wagner and Jacob Schroeder of FactsCan in Canada; and Damakant Jayshi, who is starting a site in Nepal.

The most significant news from the conference, announced last Friday, was that Omidyar Network and the National Endowment for Democracy have provided funding to the Poynter Institute to become the home of international fact-checking. Poynter will organize future conferences like this one, create training programs and establish a website. The website will be welcomed by the fact-checkers who said they need a place to discuss common problems and share best practices.

We began the conference with a video montage that captured the wide range of fact-checking segments on TV.

I was especially impressed by the TV segments from El Objetivo, a program on La Sexta in Spain, and the program Virus on Rai, the public television network in Italy. (U.S. networks could learn some lessons from the creative Spanish and Italian networks, which spend more time on production and do better graphics than their U.S. counterparts do.)

Our keynote speaker was Adam Chodikoff, a senior producer at The Daily Show with Jon Stewart. One of Adam’s roles at the show is to be Stewart’s fact-checker, to ensure that even the best satire is grounded in fact.

“Chods,” as he is known at the show, played some funny clips and spiced them with comments about how he researches the segments. One of the clips was a Stewart interview with New York Mayor Bill de Blasio, when Stewart referred to a number that had been researched so well it was “Chods approved.”

Adam Chodikoff, a senior producer at The Daily Show with Jon Stewart, addresses the conference. (Photo Chods approved.)

Adam is not a journalist in the traditional sense, but he showed how serious he is about research and fact-checking by attending all of the sessions in the two-day conference.

The conference featured a wide range of presentations that showcased interesting work being done around the world: the commitment to research and development by Chequeado in Argentina; a new PolitiFact browser extension that will allow readers to request fact-checks of a phrase; and Pagella Politica’s efforts to earn revenue from the leftovers of its reporting.

One of the most popular sessions at the conference was the in-depth discussion about sustainability and revenue sources that Alexios Mantzarlis of Pagella Politica led on Friday. His interview with Ivana Cvetkovic Bajrovic of the National Endowment for Democracy provided great insights for fact-checkers seeking grants for their organizations. Laura Zommer from Chequeado and Mevan Babakar from Full Fact also provided some great tips on crowdfunding.

There were many other great sessions throughout the conference, and I think everybody agreed the two days went by too fast. But I came away with a common theme: As we build our community, we’ll get the best ideas from each other.

That brings me back to the voices. There were some great individual voices with some marvelous accents. But as a community, we’re getting louder.


Fact-Checking Census finds continued growth around the world

Our latest tally of fact-checking sites finds 30 new sites in places such as Turkey, Uruguay and South Korea.

By Bill Adair & Ishan Thakore – January 19, 2015

Fact-checking keeps growing around the world, with new sites in countries such as Turkey, Uruguay and South Korea.

The 2015 Fact-Checking Census from the Duke Reporters’ Lab found 89 fact-checking sites that have been active in the past few years and 64 that are active today. That’s up from 59 total (44 active) when we did our last count in May 2014. (We include inactive sites in our total count because sites come and go with election cycles. Some news organizations and journalism NGOs only fact-check during election years.)

Many of the additional sites have started in the last seven months, including UYCheck in Uruguay and Dogruluk Payi in Turkey. Others are sites that we didn’t find when we did our first count.

You can see the complete list on the fact-checking page of the Reporters’ Lab website, where you can browse by continent and country.

As with our last tally, the largest concentrations of fact-checking are in Europe and North America. We found 38 sites in Europe (including 27 active), 30 in North America (22 active) and seven in South America (five active). There are two new sites in South Korea.

The Truth or False Poll in South Korea enlists readers to help with fact-checking.

The percentage of sites that use ratings continues to grow, up from about 70 percent in last year’s count to 80 percent today. Many rating systems use a true to false scale while others have devised more creative names. For example, ratings for the European site FactCheckEU include “Rather Daft” and “Insane Whopper.” Canada’s Baloney Meter rates statements from “No Baloney” to “Full of Baloney.”

We found that 56 of the 89 sites are affiliated with news organizations such as newspapers and television networks. The other 33 are sites dedicated to fact-checking, such as FactCheck.org in the United States and Full Fact in Great Britain.

Almost one-third of the sites (29 of the 89) track the campaign promises of elected officials. Some, such as the Rouhani Meter for Iran’s President Hassan Rouhani, only track campaign promises. Others, such as PolitiFact in the United States, do promise-tracking in addition to fact-checking.

For more information about the Reporters’ Lab database, contact Bill Adair at bill.adair@duke.edu.
