At Tech & Check, some new ideas to automate fact-checking

Journalists and technologists met at Duke to dream up new ways that technology can help fact-checkers.

By Bill Adair – April 4, 2016

Last week, journalists and technologists gathered at Duke to dream up new ways that automation could help fact-checking.

The first Tech & Check conference, sponsored by the Duke Reporters’ Lab and Poynter’s International Fact-Checking Network, brought together about 50 journalists, students and computer scientists. The goal was to showcase existing projects and inspire new ones.

At Tech & Check, groups of students, journalists and technologists dreamed up new ideas to automate fact-checking.

The participants included representatives of Google, IBM, NBC News, PolitiFact, Full Fact, FactCheck.org and WRAL-TV. From the academic side, we had faculty and Ph.D. students from Duke, the University of North Carolina, the University of Texas-Arlington, Indiana University and the University of Michigan.

The first day featured presentations about existing projects that automate some aspect of fact-checking; the second day, attendees formed groups to conceive new projects.

The presentations showcased a wide variety of tools and research projects. Will Moy of the British site Full Fact did a demo of his claim monitoring tool, which tracks how often politicians repeat particular talking points over time. Naeemul Hassan of the University of Texas-Arlington showed ClaimBuster, a project I’ve worked on, that can ingest huge amounts of text and identify factual claims that journalists might want to check.
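The core idea behind tools like ClaimBuster can be illustrated with a toy heuristic. This sketch is not ClaimBuster's actual model (which uses machine learning); it simply flags sentences containing numbers, percentages or comparative language, cues that often signal a checkable factual claim.

```python
import re

# Toy heuristic, not ClaimBuster's real classifier: count claim-like cues
# (numbers, percentages, comparative phrases) in each sentence.
CHECKWORTHY_CUES = re.compile(
    r"\d|percent|%|million|billion|more than|less than|increased|decreased",
    re.IGNORECASE,
)

def score_sentences(text: str) -> list[tuple[str, int]]:
    """Split text into sentences and count claim-like cues in each."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [(s, len(CHECKWORTHY_CUES.findall(s))) for s in sentences if s]

transcript = (
    "Thank you all for coming. "
    "Unemployment fell by 2 percent last year. "
    "We believe in a brighter future."
)
# Rank sentences so the most claim-like surfaces first.
ranked = sorted(score_sentences(transcript), key=lambda p: p[1], reverse=True)
```

A fact-checker scanning a long debate transcript would look only at the top-ranked sentences rather than reading every line.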

IBM’s Ben Fletcher showed one of the company’s new projects known as Watson Angles, a tool that extracts information from Web articles and distills it into a summary that includes key players and a timeline of events. Giovanni Luca Ciampaglia, a researcher at Indiana University, showed a project that uses Wikipedia to fact-check claims.

On the second day, we focused on the future. The attendees broke into groups to come up with new ideas for research. The groups had 75 minutes to create three ideas for tools or further research. The projects showed the many ways that automation can help fact-checking.

One promising idea was dubbed “Parrot Score,” a website that could build on the approach that Full Fact is exploring for claim monitoring. It would track the frequency of claims and then calculate a score for politicians who use canned phrases more often. Tyler Dukes, a data journalist from WRAL-TV in Raleigh, N.C., said Parrot Score could be a browser extension that showed the origin of a claim and then tracked it through the political ecosystem.
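The scoring part of the hypothetical Parrot Score could be quite simple. The sketch below, with made-up speakers and phrases, scores each politician by the fraction of their recorded statements that repeat a phrase they have already used.

```python
from collections import Counter

# Hypothetical Parrot Score sketch with invented example data.
statements = [
    ("Sen. Smith", "we inherited the worst economy in history"),
    ("Sen. Smith", "we inherited the worst economy in history"),
    ("Sen. Smith", "jobs are coming back"),
    ("Rep. Jones", "jobs are coming back"),
]

def parrot_scores(statements):
    """Fraction of each speaker's statements that repeat an earlier phrase of theirs."""
    seen, repeats, totals = set(), Counter(), Counter()
    for speaker, phrase in statements:
        totals[speaker] += 1
        if (speaker, phrase) in seen:
            repeats[speaker] += 1
        seen.add((speaker, phrase))
    return {speaker: repeats[speaker] / totals[speaker] for speaker in totals}

scores = parrot_scores(statements)
```

A real system would also need to match near-identical phrasings rather than exact strings, which is where the harder NLP work lies.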

Despite the focus on the digital future of journalism, we used Sharpies and a lot of Post-It notes.

Two teams proposed variations of a “Check This First” button that would allow people to verify the accuracy of a URL before they post it on Facebook or in a chat. One team dubbed it “ChatBot.” Clicking it would bring up information that would help users determine if the article was reliable.

Another team was assigned to focus on ways to improve public trust in fact-checkers. The team came up with several interesting ideas, including more transparency about the collective ratings for individual writers and editors as well as a game app that would simulate the process that journalists use to fact-check a claim. The app could improve trust by giving people an opportunity to form their own conclusions as well as demonstrating the difficult work that fact-checkers do.

Another team, focused on tools for fact-checkers, came up with several ideas. One would automatically detect when journalists were examining a claim they had checked before. Another would be something of a “sentence finisher” that, when a journalist began typing something such as “The unemployment rate last month…” would finish the sentence with the correct number.

The conference left me quite optimistic about the potential for more collaboration between computer scientists and fact-checkers. Things that never seemed possible, such as checking claims against the massive Wikipedia database, are increasingly doable. And many technologists are interested in doing research and creating products to help fact-checking.


Reporters’ Lab, IFCN to host conference about automated fact-checking

The March 31-April 1 conference will showcase new research to use computational power to help fact-checkers.

By Alexios Mantzarlis – January 21, 2016

The Reporters’ Lab and Poynter’s International Fact-Checking Network will host “Tech & Check”, the first conference to explore the promise and challenges of automated fact-checking.

Tech & Check, to be held March 31-April 1 at Duke University, will bring together experts from academia, journalism and the tech industry. The conference will include:

  1. Demos and presentations of current research that automates fact-checking
  2. Discussions about the institutional challenges of expanding the automated work
  3. Discussions on new areas for exploration, such as live fact-checking and automated annotation

Research in computational fact-checking has been underway for several years, but has picked up momentum with a flurry of new projects.

While automating fact-checking entirely is still the stuff of science fiction, parts of the fact-checking process such as gathering fact-checkable claims or matching them with articles already published seem ripe for automation. As natural language processing (NLP) and other artificial intelligence tools become more sophisticated, the potential applications for fact-checking will increase.
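One of the tasks mentioned above, matching a new claim against fact-checks already published, can be sketched with simple word overlap. Real claim-matching systems use much richer NLP; this toy version, with an invented archive, only illustrates the idea.

```python
# Toy claim-matching sketch using Jaccard (word-overlap) similarity.
# The archive entries are invented examples.
def jaccard(a: str, b: str) -> float:
    """Similarity between two claims as overlap of their word sets."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

published = [
    "crime has risen in every major city",
    "the trade deficit with china doubled last year",
]

def best_match(claim: str, archive, threshold: float = 0.5):
    """Return the most similar published fact-check, or None below the threshold."""
    score, item = max((jaccard(claim, item), item) for item in archive)
    return item if score >= threshold else None

match = best_match("crime has risen in every major city", published)
```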

Indeed, around the world several projects are exploring ways to make fact-checking faster and smarter through the use of technology. For example, at Duke University, an NSF-funded project uses computational power to help fact-checkers verify common claims about the voting records of members of Congress. The University of Texas-Arlington has developed a tool called ClaimBuster that can analyze long transcripts of debates and suggest sentences that could be fact-checked. At Indiana University, researchers have experimented with a tool that uses Wikipedia and knowledge networks to verify simple statements. Fact-checkers in France, Argentina, the U.K. and Italy are also doing work in this field.

The conference is made possible with support from, among others, the Park Foundation. More details will be published in the coming weeks.

Researchers and journalists interested in attending the conference should contact the International Fact-Checking Network at factchecknet@poynter.org.


Reporters’ Lab projects featured at Computation + Journalism conference

The Reporters' Lab projects on structured journalism and fact-checking were featured at the annual conference.

By Julia Donheiser – October 6, 2015

Two projects from the Duke Reporters’ Lab were featured at the 2015 Computation + Journalism Symposium, which was held over the weekend at Columbia University in New York.

The two-day conference included presentations about Structured Stories NYC, an experiment that involved three Duke students covering events in New York, and a separate project that is exploring new ways to automate fact-checking.

Structured Stories, which uses a unique structured journalism approach to local news, was the topic of a presentation by David Caswell, a fellow at the Reynolds Journalism Institute.

Caswell explained Structured Stories in a presentation titled “Editorial Aspects of Reporting into Structured Narratives.”

Structured Stories NYC is one of the boldest experiments of structured journalism because it dices the news into short events that can be reassembled in different ways by readers. The site is designed to put readers in charge by allowing them to adjust the depth of story coverage.

On the second day of the conference, Reporters’ Lab Director Bill Adair and Naeemul Hassan, a Ph.D. student in computer science at the University of Texas-Arlington, made a presentation that Adair said was “a call to arms” to automate fact-checking. It was based on a paper called “The Quest to Automate Fact-Checking” that they co-authored with Chengkai Li and Mark Tremayne of the University of Texas-Arlington, Jun Yang of Duke, James Hamilton of Stanford University and Cong Yu of Google.

At the conference, Naeemul Hassan explained how the UT-Arlington computer scientists used machine learning to determine the attributes of a factual claim.

Adair spoke about the need for more research to achieve the “holy grail” of fully automated, instant fact-checking. Hassan gave a presentation about ClaimBuster, a tool that analyzes text and predicts which sentences are factual claims that fact-checkers might want to examine.

The Reporters’ Lab is working with computer scientists and researchers from UT-Arlington, Stanford and Google on the multi-year project to explore how computational power can assist fact-checkers.
