Duke lab gives fact-checkers, researchers new tools to thwart misinformation

Powerful global databases offer new insights into political claims and hoaxes, and help debunk manipulated images and video

By Erica Ryan – December 15, 2023

Researchers and journalists covering the battle against misinformation have a powerful new tool in their arsenal — a groundbreaking collection of fact-checking data that covers tens of thousands of harmful political claims and online hoaxes.

Fact-Check Insights, a comprehensive global database from the Duke Reporters’ Lab that launches this week, contains structured data on more than 180,000 claims by political figures and social media accounts that have been analyzed and rated by independent fact-checkers. The project was created with support from the Google News Initiative.

The Fact-Check Insights database is powered by ClaimReview — which has been called the world’s most successful structured journalism project — and its sibling MediaReview. The twin tagging systems allow fact-checkers to enter standardized data about their fact-checks, such as the statement being fact-checked, the speaker, the date, and the rating.
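To make the standardized fields concrete, here is a minimal sketch of what a ClaimReview record can look like as schema.org structured data. The claim, speaker, outlet, and URL below are invented for illustration; they are not from the actual dataset.

```python
import json

# A minimal, illustrative ClaimReview record using schema.org vocabulary.
# The claim, speaker, outlet, and URL are invented examples.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "datePublished": "2023-11-01",
    "url": "https://example-factchecker.org/fact-checks/example-claim",
    "claimReviewed": "The state budget doubled last year.",
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Example Politician"},
        "datePublished": "2023-10-28",
    },
    "author": {"@type": "Organization", "name": "Example Fact-Checker"},
    "reviewRating": {
        "@type": "Rating",
        "alternateName": "False",  # the fact-checker's verdict
    },
}

# Serialized as JSON-LD, this is the kind of markup search engines read.
markup = json.dumps(claim_review, indent=2)
print(markup)
```

Because every fact-checker fills in the same fields, records from dozens of outlets can be aggregated and queried uniformly.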

“ClaimReview and MediaReview are the secret sauce of fact-checking,” said Bill Adair, director of the Reporters’ Lab and the Knight professor of journalism and public policy at Duke. “This data will give researchers a much easier way to study how politicians lie, where false information spreads, and other vital topics so we can better combat misinformation.”

This rich, important dataset is updated daily, summarizing articles from dozens of fact-checkers around the world, including well-known organizations such as FactCheck.org, PesaCheck, Factly, Full Fact, Chequeado and Pagella Politica. It is ready-made for download in JSON and CSV formats. Access is free for researchers, journalists, technologists and others in the field, but registration is required.
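A researcher working with a CSV export might start with something like the following sketch. The column names and sample rows here are hypothetical; the actual Fact-Check Insights schema may differ.

```python
import csv
import io

# Stand-in for a downloaded CSV file; column names are illustrative only.
sample = io.StringIO(
    "claim,author,rating,date\n"
    "Example claim one,Speaker A,False,2023-11-01\n"
    "Example claim two,Speaker B,True,2023-11-02\n"
)

rows = list(csv.DictReader(sample))

# Tally fact-checks by rating, a typical first aggregation.
counts = {}
for row in rows:
    counts[row["rating"]] = counts.get(row["rating"], 0) + 1

print(counts)  # {'False': 1, 'True': 1}
```

The same records in the JSON export would carry the structured fields as nested objects rather than flat columns.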

Archiving tool for fact-checkers

Along with Fact-Check Insights, the Reporters’ Lab is launching MediaVault, a unique and unprecedented tool for fact-checkers who are working to debunk manipulated images and videos shared around the world.

MediaVault is a cutting-edge system that collects and stores images and videos that have been analyzed by reputable fact-checking organizations. The MediaVault archive allows fact-checkers to maintain a vital portion of their work, which would otherwise disappear when posts are removed from social media platforms. It also enables quicker research and identification of previously published images and videos in misleading social media posts.

“MediaVault is the first archival system that is specifically tailored to the needs of fact-checkers,” Adair said. “Our team developed this system after seeing too many posts go missing after being fact-checked. We realized fact-checkers needed a custom-made solution.”

MediaVault is free for use by fact-checkers, journalists and others working to debunk misinformation shared online, but registration is required. This project also received support from the Google News Initiative.

The Reporters’ Lab team behind the Fact-Check Insights and MediaVault projects includes lead technologist Christopher Guess, project manager Erica Ryan, ClaimReview/MediaReview manager Joel Luther, and Lab co-director Mark Stencel. The team was assisted by Duke University researcher Asa Royal, along with developer Justin Reese and designer Joanna Fonte of Bad Idea Factory.


MediaReview: A next step in solving the misinformation crisis

An update on what we’ve learned from 1,156 entries of MediaReview, our latest collaboration to combat misinformation.

By Joel Luther – June 2, 2022

When a 2019 video went viral after being edited to make House Speaker Nancy Pelosi look inebriated, it took 32 hours for one of Facebook’s independent fact-checking partners to rate the clip false. By then, the video had amassed 2.2 million views, 45,000 shares, and 23,000 comments – many of them calling her “drunk” or “a babbling mess.”

The year before, the Trump White House circulated a video that was edited to make CNN’s Jim Acosta appear to aggressively react to a mic-wielding intern during a presidential press conference.

A string of high-profile misleading videos like these in the run-up to the 2020 U.S. election stoked long-held fears about skillfully manipulated videos, sometimes made with AI. The main worry then was how fast these doctored videos would become the next battleground in a global war against misinformation. But new research by the Duke Reporters’ Lab and a group of participating fact-checking organizations in 22 countries found that other, far less sophisticated forms of media manipulation were much more prevalent.

By using a unified tagging system called MediaReview, the Reporters’ Lab and 43 fact-checking partners collected and categorized more than 1,000 fact-checks based on manipulated media content. Those accumulated fact-checks revealed that:

  • While we began this process in 2019 expecting deepfakes and other sophisticated media manipulation tactics to be the most imminent threat, we’ve predominantly seen low-budget “cheap fakes.” The vast majority of media-based misinformation is rated “Missing Context,” or, as we’ve defined it, “presenting unaltered media in an inaccurate manner.” In total, fact-checkers have applied the Missing Context rating to 56% of the MediaReview entries they’ve created.
  • Most of the fact-checks in our dataset, 78%, come from content on Meta’s platforms Facebook and Instagram, likely driven by the company’s well-funded Third-Party Fact-Checking Program. These platforms are also more likely to label or remove fact-checked content. More than 80% of fact-checked posts on Instagram and Facebook are either labeled to add context or no longer on the platform. In contrast, more than 60% of fact-checked posts on YouTube and Twitter remain intact, without labeling to indicate their accuracy.
  • Without reliable tools for archiving manipulated material that is removed or deleted, it is challenging for fact-checkers to track trends and bad actors. Fact-checkers used a variety of tools, such as the Internet Archive’s Wayback Machine, to attempt to capture this ephemeral misinformation, but only 67% of submitted archive links were still viewable on the chosen archive when accessed at a later date.

The Reporters’ Lab research also demonstrated MediaReview’s potential — especially based on the willingness and enthusiastic participation of the fact-checking community. With the right incentives for participating fact-checkers, MediaReview provides efficient new ways to help intercept manipulated media content — in large part because so many variations of the same claims appear repeatedly around the world, as the pandemic has continuously demonstrated.

The Reporters’ Lab began developing the MediaReview tagging system around the time of the Pelosi video, when Google and Facebook separately asked the Duke team to explore possible tools to fight the looming media misinformation crisis.

MediaReview is a sibling of ClaimReview, an initiative the Reporters’ Lab has led since 2015 that created infrastructure for fact-checkers to make their articles machine-readable and easily usable by search engines, mobile apps, and other projects. Called “one of the most successful ‘structured journalism’ projects ever launched,” the ClaimReview schema has proven immensely valuable. Adopted by 177 fact-checking organizations around the world, it has been used to tag 136,744 articles, establishing a large and valuable corpus of fact-checks: tens of thousands of statements from politicians and social media accounts around the world analyzed and rated by independent journalists.

But ClaimReview proved insufficient to address the new, specific challenges presented by misinformation spread through multimedia. Thus, in September 2019, the Duke Reporters’ Lab began working with the major search engines, social media services, fact-checkers and other interested stakeholders on an open process to develop MediaReview, a new sibling of ClaimReview that creates a standard for manipulated video and images. Throughout pre-launch testing phases, 43 fact-checking outlets have used MediaReview to tag 1,156 images and videos, again providing valuable, structured information about whether pieces of content are legitimate and how they may have been manipulated.
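A MediaReview entry extends the same structured-data approach to manipulated media. The sketch below uses field names from the schema.org MediaReview proposal, but the video, URLs, and verdict are invented, and the exact property set may differ from what fact-checkers ultimately publish.

```python
import json

# An illustrative MediaReview record. Field names follow the schema.org
# MediaReview proposal; the content and URLs are invented examples.
media_review = {
    "@context": "https://schema.org",
    "@type": "MediaReview",
    "datePublished": "2022-01-15",
    "url": "https://example-factchecker.org/fact-checks/example-video",
    # A "Missing Context" verdict: real footage presented misleadingly.
    "mediaAuthenticityCategory": "DecontextualizedContent",
    "itemReviewed": {
        "@type": "MediaReviewItem",
        "contentUrl": "https://example-platform.com/posts/12345",
    },
    "author": {"@type": "Organization", "name": "Example Fact-Checker"},
}

print(json.dumps(media_review, indent=2))
```

Where ClaimReview records a verdict on a statement, MediaReview records how a specific image or video was manipulated or miscontextualized.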

In an age of misinformation, MediaReview, like ClaimReview before it, offers something vital: real-time data on which pieces of media are truthful and which ones are not, as verified by the world’s fact-checking journalists. 

But the work of MediaReview is not done. New fact-checkers must be brought on board in order to reflect the diversity and global reach of the fact-checking community, the major search and social media services must incentivize the creation and proper use of MediaReview, and more of those tech platforms and other researchers need to learn about, and make full use of, the opportunities this new tagging system can provide.

An Open Process

MediaReview is the product of a two-year international effort to get input from the fact-checking community and other stakeholders. It was first adapted from a guide to manipulated video published by The Washington Post, which was initially presented at a Duke Tech & Check meeting in the spring of 2019. The Reporters’ Lab worked with Facebook, Google, YouTube, Schema.org, the International Fact-Checking Network, and The Washington Post to expand this guide to include a similar taxonomy for manipulated images. 

The global fact-checking community has been intimately involved in the process of developing MediaReview. Since the beginning of the process, the Reporters’ Lab has shared all working drafts with fact-checkers and has solicited feedback and comments at every step. We and our partners have also presented to the fact-checking community several times, including at the Trusted Media Summit in 2019, a fact-checkers’ community meeting in 2020, Global Fact 7 in 2020, Global Fact 8 in 2021 and several open “office hours” sessions with the sole intent of gathering feedback.

Throughout development and testing, the Reporters’ Lab held extensive technical discussions with Schema.org to properly validate the proposed structure and terminology of MediaReview, and solicited additional feedback from third-party organizations working in similar spaces, including the Partnership on AI, Witness, Meedan and Storyful.

Analysis of the First 1,156

As of February 1, 2022, fact-checkers from 43 outlets spanning 22 countries have now made 1,156 MediaReview entries.

Number of outlets creating MediaReview by country.

Number of MediaReview entries created by outlet.

Our biggest lesson in reviewing these entries: The way misinformation is conveyed most often through multimedia is not what we expected. We began this process in 2019 expecting deepfakes and other sophisticated media manipulation tactics to be an imminent threat, but we’ve predominantly seen low-budget “cheap fakes.” What we’ve seen consistently throughout testing is that the vast majority of media-based misinformation is rated “Missing Context,” or, as we’ve defined it, “presenting unaltered media in an inaccurate manner.” In total, fact-checkers have applied the Missing Context rating to 56% of the MediaReview entries they’ve created.

The “Original” rating has been the second most applied, accounting for 20% of the MediaReview entries created. As we’ve heard from fact-checkers through our open feedback process, a substantial portion of the media being fact-checked is not manipulated at all; rather, it consists of original videos of people making false claims. Going forward, we know we need to be clear about the use of the “Original” rating as we help more fact-checkers get started with MediaReview, and we need to continue to emphasize the use of ClaimReview to counter the false claims contained in these kinds of videos.

Throughout the testing process, the Duke Reporters’ Lab has monitored incoming MediaReview entries and provided feedback to fact-checkers where applicable. We’ve heard from fact-checkers that this feedback was valuable and helped clarify the rating system.

A review of media links checked by third-party fact-checkers shows that the vast majority of fact-checked media thus far exists on Facebook:

Share of links in the MediaReview dataset by platform.

Facebook’s well-funded Third-Party Fact-Checking Program likely contributes to this rate; fact-checkers are paid directly to check content on Facebook’s platforms, making that content more prevalent in our dataset.

We also reviewed the current status of links checked by fact-checkers and tagged with MediaReview. With different platforms having different policies on how they deal with misinformation, some of the original posts are intact, others have been removed by either the platform or the user, and some have a context label appended with additional fact-check information. By platform, Instagram is the most likely to append additional information, while YouTube is the most likely to present fact-checked content in its original, intact form, not annotated with any fact-checking information: 72.5% of the media checked from YouTube are still available in their original format on the platform.

Status of fact-checked media broken down by platform, showing the percentage of checked media either labeled with additional context, removed, or presented fully intact.

In addition, we noted that fact-checkers have often (roughly 25% of the time) entered an archival link in the “Media URL” field in an attempt to preserve the video or image, since this ephemeral misinformation is often quickly deleted by either the platforms or the users. Notably, though, these existing archive systems are unreliable; only 67% of submitted archive links were viewable on the archive, while 33% were not. While we found that Perma.cc was the most reliable existing archiving system used by fact-checkers, it successfully presented only 80% of checked media, and its status as a paid archival tool leaves an opportunity to build a new system to preserve fact-checked media.
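An audit of archive-link viewability like the one described above could be scripted roughly as follows. This is a simplified sketch: treating any successful HTTP response as "viewable" is an assumption, and a stricter audit would also confirm that the archived media itself renders.

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError


def archive_link_viewable(url: str, timeout: float = 10.0) -> bool:
    """Return True if the archived page responds successfully.

    A 200 response is only a rough proxy for 'viewable'; broken embeds
    on an otherwise live archive page would not be caught here.
    """
    try:
        request = Request(url, headers={"User-Agent": "archive-audit/0.1"})
        with urlopen(request, timeout=timeout) as response:
            return response.status == 200
    except (HTTPError, URLError, TimeoutError):
        return False


def viewable_share(urls: list[str]) -> float:
    """Fraction of submitted archive links that still resolve."""
    if not urls:
        return 0.0
    return sum(archive_link_viewable(u) for u in urls) / len(urls)
```

Running such a script periodically over the archive links submitted in MediaReview entries would produce reliability figures like the 67% reported above.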

Success rate of archival tools used by fact-checkers in properly displaying the fact-checked media.

Next Steps

Putting MediaReview to use: Fact-checkers have emphasized to us the need for social media companies and search engines to make use of these new signals. They’ve highlighted that usability testing would help ensure that MediaReview data is seen prominently on the tech platforms.

Archiving the images and videos: As noted above, current archiving systems are insufficient to capture the media misinformation fact-checkers are reporting on. Currently, fact-checkers using MediaReview are limited to quoting or describing the video or image they checked and including the URL where they discovered it. There’s no easy, consistent workflow for preserving the content itself. Manipulated images and videos are often removed by social media platforms or deleted or altered by their owners, leaving no record of how they were manipulated or presented out of context. In addition, if the same video or image emerges again in the future, it can be difficult to determine if it has been previously fact-checked. A repository of this content — which could be saved automatically as part of each MediaReview submission — would allow for accessibility and long-term durability for archiving, research, and more rapid detection of misleading images and video. 

Making more: We continue to believe that fact-checkers need incentives to continue making this data. The more fact-checkers use these schemas, the more we increase our understanding of the patterns and spread of misinformation around the world — and the ability to intercept inaccurate and sometimes dangerous content. The effort required to produce ClaimReview or MediaReview is relatively low, but adds up cumulatively — especially for smaller teams with limited technological resources. 

While fact-checkers created the first 1,156 entries solely to help the community refine and test the schema, further use by the fact-checkers must be encouraged by the tech platforms’ willingness to adopt and utilize the data. Currently, 31% of the links in our MediaReview dataset are still fully intact where they were first posted; they have not been removed or had any additional context added. Fact-checkers have displayed their eagerness to research manipulated media, publish detailed articles assessing their veracity, and make their assessments available to the platforms to help curb the tide of misinformation. Search engines and social media companies must now decide to use and display these signals.

Appendix: MediaReview Development Timeline

MediaReview is the product of a two-year international effort involving the Duke Reporters’ Lab, the fact-checking community, the tech platforms and other stakeholders. 

Mar 28, 2019

Phoebe Connelly and Nadine Ajaka of The Washington Post first presented their idea for a taxonomy classifying manipulated video at a Duke Tech & Check meeting. 

Sep 17, 2019

The Reporters’ Lab met with Facebook, Google, YouTube, Schema.org, the International Fact-Checking Network, and The Washington Post in New York to plan to expand this guide to include a similar taxonomy for manipulated images. 

Oct 17, 2019

The Reporters’ Lab emailed a first draft of the new taxonomy to all signatories of the IFCN’s Code of Principles and asked for comments.

Nov 26, 2019

After incorporating suggestions from the first draft document and generating a proposal for Schema.org, we began to test MediaReview for a selection of fact-checks of images and videos. Our internal testing helped refine the draft of the Schema proposal, and we shared an updated version with IFCN signatories on November 26.

Jan 30, 2020

The Duke Reporters’ Lab, IFCN and Google hosted a Fact-Checkers Community Meeting at the offices of The Washington Post. Forty-six people, representing 21 fact-checking outlets and 15 countries, attended. We presented slides about MediaReview, asked fact-checkers to test the creation process on their own, and again asked for feedback from those in attendance.

Apr 16, 2020

The Reporters’ Lab began a testing process with three of the most prominent fact-checkers in the United States: FactCheck.org, PolitiFact, and The Washington Post. We have publicly shared their test MediaReview entries, now totaling 421, throughout the testing process.

Jun 1, 2020

We wrote and circulated a document summarizing the remaining development issues with MediaReview, including new issues we had discovered through our first phase of testing. We also proposed new Media Types for “image macro” and “audio,” and new associated ratings, and circulated those in a document as well. We published links to both of these documents on the Reporters’ Lab site (We want your feedback on the MediaReview tagging system) and published a short explainer detailing the basics of MediaReview (What is MediaReview?).

Jun 23, 2020

We again presented on MediaReview at Global Fact 7 in June 2020, detailing our efforts so far and again asking for feedback on our new proposed media types and ratings and our Feedback and Discussion document. The YouTube video of that session has been viewed over 500 times, by fact-checkers around the globe, and dozens participated in the live chat. 

Apr 1, 2021

We hosted another session on MediaReview for IFCN signatories on April 1, 2021, again seeking feedback and updating fact-checkers on our plans to further test the Schema proposal.

Jun 3, 2021

In June 2021, the Reporters’ Lab worked with Google to add MediaReview fields to the Fact Check Markup Tool and expand testing to a global userbase. We regularly monitored MediaReview and maintained regular communication with fact-checkers who were testing the new schema.

Nov 10, 2021

We held an open feedback session with fact-checkers on November 10, 2021, providing the community another chance to refine the schema. Overall, fact-checkers have told us that they’re pleased with the process of creating MediaReview and that its similarity to ClaimReview makes it easy to use. As of February 1, 2022, fact-checkers have made a total of 1,156 MediaReview entries. 

For more information about MediaReview, contact Joel Luther.



Reporters’ Lab Takes Part in Eighth ‘Global Fact’ Summit

The Reporters’ Lab team participated in five conference sessions and hosted a daily virtual networking table at the conference with more than 1,000 attendees.

By Joel Luther – November 8, 2021

The Duke Reporters’ Lab spent this year’s eighth Global Fact conference helping the world’s fact-checkers learn more about tagging systems that can extend the reach of their work; encouraging a sense of community among organizations around the globe; and discussing new research that offers potent insights into how fact-checkers do their jobs.

This year’s Global Fact took place virtually for the second time, following years of meeting in person all around the world, in cities such as London, Buenos Aires, Madrid, Rome, and Cape Town. More than 1,000 fact-checkers, academic researchers, industry experts, and representatives from technology companies attended the virtual conference.

Over three days, the Reporters’ Lab team participated in five conference sessions and hosted a daily virtual networking table.

  • Reporters’ Lab director and IFCN co-founder Bill Adair delivered opening remarks for the conference, focused on how fact-checkers around the world have closely collaborated in recent years.
  • Mark Stencel, co-director of the Reporters’ Lab, moderated the featured talk with Tom Rosenstiel, the Eleanor Merrill Visiting Professor on the Future of Journalism at the Philip Merrill College of Journalism at the University of Maryland and coauthor of The Elements of Journalism. Rosenstiel previously served as executive director of the American Press Institute. He discussed research into how the public responds to the core values of journalism and how fact-checkers might be able to build more trust with their audience.
  • Thomas Van Damme presented findings from his master’s thesis, “Global Trends in Fact-Checking: A Data-Driven Analysis of ClaimReview,” during a panel discussion moderated by Lucas Graves and featuring Joel Luther of the Reporters’ Lab and Karen Rebelo, a fact-checker from BOOM in India. Van Damme’s analysis reveals fascinating trends from five years of ClaimReview data and demonstrates ClaimReview’s usefulness for academic research.
  • Luther also prepared two pre-recorded webinars that were available throughout the conference.

In addition, the Reporters’ Lab is excited to reconnect with fact-checkers again at 8 a.m. Eastern on Wednesday, November 10, for a feedback session on MediaReview. We’re pleased to report that fact-checkers have now used MediaReview to tag their fact-checks of images and videos 841 times, and we’re eager to hear any additional feedback and continue the open development process we began in 2019 in close collaboration with the IFCN.


What 2018 midterm campaign ads and Christmas cookies have in common

While reviewing thousands of political claims during midterm 2018 campaigns, it was hard to miss streams of look-alike messages

By Sydney McKinney & Alex Johnson – December 5, 2018

Same ad, different name, over and over again. Cookie-cutter ads, generic political ads used to promote or criticize multiple campaigns and candidates, were widely deployed during the 2018 North Carolina midterm elections.

As student journalists working on the North Carolina Fact-Checking Project, we spent months sifting through thousands of campaign ads looking for political claims to fact-check. It didn’t take long to notice that many were nearly identical.

Sophomore Sydney McKinney

The copy-cat ads we encountered typically targeted groups of candidates, such as state House candidates from one party, and added their names to the same attack ad. That allowed the opposing political party and their boosters to widely circulate messages about topics important to their base.

One reason for this is that state political campaigns have become increasingly centralized in recent years, often run by political caucuses rather than individual candidates, said Gary Pearce, co-publisher of Talking About Politics, a blog about North Carolina and national politics.

Congressional campaign committees in Washington, D.C. as well as North Carolina legislative caucus committees conduct voter research and use the data to pinpoint issues that matter most to target voters during election season, he said.  

“Consistency amplifies the message,” Pearce said. “It makes sense for the caucuses to take on a specific set of issues that are important in this election and will rile the voters up.”

The N.C. Democratic Party used the carbon-copy ads to denounce lots of GOP candidates at once.

The North Carolina Democratic Party employed this technique often this year, producing ads that claimed Republicans would eliminate insurance coverage for pre-existing medical conditions, ignore polluted drinking water, even tolerate corruption within the state Republican Party.

Political Action Committees, such as the conservative North Carolina Values Coalition, employed a different strategy, also based on focused messaging. They published a series of same-design ads endorsing 13 North Carolina House and Senate candidates. They cited the same reasoning every time: the candidates supported “pro-life, pro-religious liberty, and pro-family public policy.”

The N.C. Values Coalition PAC used look-alike ads to promote candidates in line with its priorities.

“We aim to use a language that appeals to our coalition members, and creates brand familiarity,” said Jim Quick, the group’s media and communications director. “We want to show that we are laser focused on certain issues through repetition.”

Angie Holan, editor of the national fact-checking website PolitiFact, said such ads remain an inexpensive way to disseminate information. Despite this age of targeted marketing on the web and elsewhere, the persistence of this sort of marketing could be linked to U.S. voters’ increasing partisanship, she said.

Sophomore Alex Johnson

“We’re not seeing a lot of crossover or, frankly, a lot of complexity or nuance in most of the public policy positions politicians are taking. So that makes it very easy to do cookie cutter ads,” Holan said.

Colin Campbell, a North Carolina political reporter and columnist with The Insider, recently argued that the cookie cutter ads “dreamed up by young staffers sitting in a Raleigh office” may have hurt candidates in both parties during the 2018 campaign season.

For Democrats to win rural districts and Republicans to win urban districts, candidates need to switch their focus to local issues that people from all parties care about, Campbell argued. He pointed to State Rep. Ken Goodman, a Democrat who this fall won re-election in District 66, west of Fayetteville.

Goodman’s ads focused on increasing the amount of lottery money that goes towards public education, not an issue on the national or statewide Democratic agenda, Campbell noted. The moderate Democrat won re-election in a rural district, which required him to gain wide support.

Which way will political campaigns lean in the presidential election year 2020? Unknown. But student journalists in the Duke Reporters’ Lab will be watching.


Lessons learned from fact-checking 2018 midterm campaigns

After monitoring political messaging, students see the need for accountability journalism more than ever

By Catherine Clabby – November 20, 2018

Five Duke undergraduates monitored thousands of political claims this semester during a heated midterm campaign season for the N.C. Fact-Checking Project.

That work helped expand nonpartisan political coverage in a politically divided state with lots of contested races for state and federal seats this fall. The effort resumes in January when the project turns its attention to a newly configured North Carolina General Assembly.

Three student journalists who tackled this work with fellow sophomores Alex Johnson and Sydney McKinney reflect on what they’ve learned so far.

Lizzie Bond

Lizzie Bond: After spending the summer working in two congressional offices on Capitol Hill, I began my work in the Reporters’ Lab and on the N.C. Fact-Checking Project with first-hand knowledge of how carefully elected officials and their staff craft statements in press releases and on social media. This practice derives from a fear of distorting the meaning or connotation of their words. And in this social media age where so many outlets are available for sharing information and for people to consume it, this fear runs deep.

Yet it took discovering one candidate for my perspective on the value of our work with the N.C. Fact-Checking Project to shift. That candidate, Peter Boykin, proved to be a much more complicated figure than any other politician whose social media we monitored. The Republican running to represent Greensboro’s District 58 in the General Assembly, Boykin is the founder of “Gays for Trump,” a former online pornography actor, a pro-Trump radio show host, and an already controversial, far-right online figure with tens of thousands of followers. Poring over Boykin’s nearly a dozen social media accounts, I came across everything from innocuous self-recorded music video covers to contentious content, like hostile characterizations of liberals and advocacy of conspiracy theories, such as one regarding the Las Vegas mass shooting that he pushed with little to no corroborating evidence.

When contrasting Boykin’s posts on both his personal and campaign social media accounts with the more cautious and mild statements from other North Carolina candidates, I realized that catching untruthful claims has a more ambitious goal than simply detecting and reporting falsehoods. By reminding politicians that they should be accountable to the facts in the first place, fact-checking strives to improve their commitment to truth-telling. The push away from truth and decency in our politics and toward sharp antagonism and even alternate realities becomes normalized when Republican leaders support candidates like Boykin as simply another GOP candidate. The N.C. Fact-Checking Project is helping to revive truth and decency in North Carolina’s politics and to challenge the conspiracy theories and pants-on-fire campaign claims that threaten the self-regulating, healthy political society we seek.

Ryan Williams

Ryan Williams: I came into the Reporters’ Lab with relatively little journalism experience. I spent the past summer working on social media outreach and strategy at a non-profit, where I drafted tweets and wrote the occasional blog post. But I had never tuned into the immense brevity of political messaging during an election season. The N.C. Fact-Checking Project showed me the importance of people who not only find the facts but report them in a nonpartisan, objective manner that is accessible to an average person.

Following the 2016 election, some people blamed journalists and pollsters for creating false expectations about who would win the presidency. I was one of those critics. In the two and a half months I spent fact-checking North Carolina’s midterm races, I learned how hard fact-checkers and reporters work. My fellow fact-checkers and I compiled a litany of checkable claims made by politicians this midterm cycle. Those claims, along with claims found by the automated claim-finding algorithm ClaimBuster, were raw material for many fact-checks of some of North Carolina’s hottest races. Those checks were made available for voters ahead of polling.

Now that Election Day has come and gone, I am more than grateful for this experience in fact-finding and truth-reporting. Not only was I able to hone my research skills, but I also gained a deeper understanding of the intricacies of political journalism. I can’t wait to see what claims come out of the next two years leading up to what could be the presidential race of my lifetime.

Jake Sheridan

Jake Sheridan: I’m a Carolina boy who has grown up on the state’s politics. I’ve worked on campaigns, attended the 2012 Democratic National Convention in my hometown of Charlotte and am the son of a longtime news reporter. I thought I knew North Carolina politics before working in the Reporters’ Lab. I was wrong.

While trying to wrap my head around the 300-plus N.C. races, I came to better understand the politics of this state. What matters in the foothills of the Piedmont, I found out, is different from what matters on the Outer Banks and in Asheville. I discovered that campaigns publicly release b-roll so that PACs can create ads for them, and I saw just how brutal attack ads can be. I got familiar with flooding and hog farms, strange politicians and bold campaign claims.

There was no shortage of checkable claims. That was good for me. But it’s bad for us. I trust politicians less now. The ease with which some N.C. politicians make up facts troubles me. Throughout this campaign season in North Carolina, many politicians lied, misled and told half truths. If we want democracy to work — if we want people to vote based on what is real so that they can pursue what is best for themselves and our country — we must give them truth. Fact-checking is essential to creating that truth. It has the potential to place an expectation of explanation upon politicians making claims. That’s critical for America if we want to live in a country in which our government represents our true best interests and not our best interests in an alternate reality.

 


Catherine Clabby joins the Duke Reporters’ Lab

The veteran journalist will manage student research projects, including the Tech & Check Cooperative.

By Bill Adair – July 30, 2018

Catherine Clabby, an award-winning reporter and editor, has been named the new research and communications manager in the Duke Reporters’ Lab. In that role, Clabby will help direct student research on political fact-checking and automated journalism, including the Tech & Check Cooperative.

In addition to her work in the Lab, Clabby will teach Newswriting and Reporting (PJMS 367), a core course in the journalism program in the DeWitt Wallace Center for Media & Democracy.

Clabby is a veteran journalist who most recently covered environmental health topics for the North Carolina Health News. Before that, she was the senior editor of the E.O. Wilson Life on Earth biology book series and a senior editor at American Scientist magazine.

From 1994 to 2007, she was a reporter at the Raleigh News & Observer where she covered science, medicine and a variety of state and local topics, including a U.S. Senate race. She left the paper in 2007 to take a year-long Knight Science Journalism fellowship at MIT.

Clabby lives in Durham with her husband, Christoph Guttentag, Duke’s dean of undergraduate admissions. Their daughter is a college student in Massachusetts.

 



[PHOTOS] The Reporters’ Lab takes on Global Fact 4 in Madrid

Six team members from the Lab traveled to Spain for the annual summit of fact-checkers around the world

By Rebecca Iannucci – July 14, 2017

The Reporters’ Lab team recently spent five days in Spain, exploring the future of fact-checking — but we left plenty of time for churros, chocolate and an unusual fish concoction called Gulas.

Six team members from the Lab — co-directors Bill Adair and Mark Stencel, project manager Rebecca Iannucci, student researcher Riley Griffin, Share the Facts project manager Erica Ryan and developer Chris Guess — traveled to Madrid July 4-9 for Global Fact 4, the annual gathering of the world’s fact-checkers.

But even though the trip was primarily for business, there were ample opportunities to explore and enjoy the city. Among the highlights: a trip to El Museo Nacional Centro de Arte Reina Sofía, home to Picasso’s Guernica; a taste of Basque tapas at the restaurant Txapela; and plenty of people-watching at El Mercado de San Miguel. (Oh, and did we mention the churros?)

Below, scroll through assorted scenes from Madrid, then click here for more coverage of Global Fact 4.

(L-R) Mark Stencel, Rebecca Iannucci and Riley Griffin enjoy churros at Chocolatería San Ginés.
(L-R) Bill Adair, Rebecca Iannucci and Erica Ryan get some work done at Campus Madrid.
Rebecca Iannucci presents the Reporters’ Lab’s FactPopUp tool to the Global Fact 4 audience. Photo credit: Mario Garcia.
Global Fact 4 boasted 188 attendees from 53 countries. Photo credit: Mario Garcia.
Rebecca Iannucci poses in front of Campus Madrid’s signage.
Bill Adair leads a standing ovation for Alexios Mantzarlis, director of the International Fact-Checking Network and organizer of Global Fact 4. Photo credit: Mario Garcia.
Rebecca Iannucci tries Gulas, a shredded fish dish, at El Mercado de San Miguel.
(L-R) Rising Duke senior Alex Newhouse, Riley Griffin, Erica Ryan and Rebecca Iannucci, after lunch in La Plaza Mayor.
(L-R) Riley Griffin, Bill Adair, Erica Ryan and Rebecca Iannucci, after lunch in La Plaza Mayor.

Students selected for research work at Duke Reporters’ Lab

Eight undergraduates will assist with news experiments and help explore the future of journalism.

By Mark Stencel – September 14, 2015

Student researchers play leading roles at the Duke Reporters’ Lab, experimenting with new forms of storytelling and exploring the state of newsroom innovation.

With the start of a new academic year, a team of eight students is donning white lab coats to help us map the future of journalism. Their involvement is one of the things that makes the Lab such a lively place (especially for this Duke newcomer).

These students will investigate ways to create new “structured” story forms that allow journalists to present information in engaging, digital-friendly ways. They also will track and help foster the work of political fact-checkers that are holding politicians around the world accountable for their statements and their promises.

We’ve just completed hiring our 2015-2016 team:

Natalie Ritchie: Over the summer, Natalie was a reporter for Structured Stories NYC — the Reporters’ Lab effort to test a new storytelling tool in the wilds of New York politics. She is co-editor in chief of the Duke Political Review. A public policy senior with a focus on international affairs, Natalie previously interned with the Senate Foreign Relations Committee, worked as a student communications assistant for the Duke Global Health Institute, and taught English to Iraqi, Palestinian, and Syrian refugees in Jordan. In addition, she interned for Republican Sen. Bob Corker of Tennessee, her home state.

Ryan Hoerger: The sports editor of The Chronicle, Duke’s student newspaper, is a senior from California double-majoring in public policy and economics. Last summer Ryan covered financial markets as an intern for Bloomberg. Before that, he interned for Duke magazine and conducted policy research during a summer stint at FasterCures. He is currently finishing up an undergraduate honors thesis that examines federal incentives for pharmaceutical research and development.

Shannon Beckham: Shannon, a public policy senior from Arizona, has seen how political fact-checking works from both sides of the process, having interned in the White House speechwriting office and at PolitiFact, the Pulitzer-winning service run by the Tampa Bay Times. She worked for the Chequeado fact-checking site in Buenos Aires, where she assisted with a 2014 meeting of Latin American fact-checkers. At the Reporters’ Lab, she helped start our database of fact-checking sites and organize the first Global Fact-Checking Summit last year in London.

Gautam Hathi: A computer science junior who grew up near the Microsoft campus in Redmond, Wash., Gautam is already working at the intersection of news and technology. Having interned for Google and 3Sharp, he is now the digital content director for The Chronicle at Duke. He previously was The Chronicle’s health and science editor and is a contributing editor for the Duke Political Review.

Shaker Samman: Shaker is a public policy junior from Michigan. At the Reporters’ Lab, he worked on fact-checking and structured journalism prototypes and co-authored a PolitiFact story on the North Carolina Senate race with Lab co-director Bill Adair. He has interned as a reporter for the Tampa Bay Times in Florida and The Times Herald in Port Huron, Mich., where he also worked on his high school radio station.

Claire Ballentine: Claire is head of the university news department at The Chronicle. She began working for the Lab last year, helping update our database of political fact-checkers. The sophomore from Tennessee also has blogged for Her Campus and worked as an editing intern for the Great Smoky Mountains National Park Association. She was the editor-in-chief of her high school yearbook.

Jillian Apel: Jill brings an eye for visual storytelling to the Lab. A sophomore from California with a passion for writing as well, she was the managing editor of the student newspaper at the Brentwood School in Los Angeles.

Julia Donheiser: Julia’s data savvy comes via a social science research project she started as a student at the Bronx High School of Science. With guidance from a pair of educational psychologists, she crunched statewide numbers from school districts across New York to investigate the effects of various social factors on diagnosis rates for autism and learning disabilities. Now a freshman at Duke, she worked on the student newspaper at her high school. She also wrote a food blog that will make you hungry.
