
Tag: PolitiFact

Fact-checking Trump’s speech with the Share the Facts widget

Republican nominee Donald Trump’s 75-minute acceptance speech on the last night of the GOP convention sent fact-checkers into overdrive.

PolitiFact, The Washington Post and FactCheck.org all produced roundups of their research into dozens of Trump’s claims. Here’s a look at four of those claims and the resulting fact-checks, which you can share using the Share the Facts widget.

The widget was created by the Duke Reporters’ Lab and Jigsaw, a technology incubator within Alphabet, the parent company of Google. We encourage you to use the widget to share fact-checks on Facebook and Twitter, or even embed them in articles and blog posts.

1. “Household incomes are down more than $4,000 since the year 2000.”

Here’s how the three fact-checking organizations currently using the Share the Facts widget weighed in on this Trump claim. Click “Read More” on each widget to see the facts behind their conclusions.

Share the Facts Widget Embed

2. “America is one of the highest-taxed nations in the world.”

According to the fact-checkers, this claim from Trump had serious problems.

Share the Facts Widget Embed

3. A “550 percent increase in Syrian … refugees … [Democrat Hillary Clinton] proposes this despite the fact that there’s no way to screen these refugees in order to find out who they are or where they come from.”

While Clinton has proposed allowing as many as 65,000 Syrian refugees into the U.S., fact-checkers find Trump’s claim that “there’s no way to screen” is not true.

Share the Facts Widget Embed

4. “Decades of progress made in bringing down crime are now being reversed by this administration’s rollback of criminal enforcement. Homicides last year increased by 17 percent in America’s 50 largest cities. That’s the largest increase in 25 years.”

While the fact-checkers note that Trump has a credible source for his numbers (The Post, in fact), they find he’s guilty of cherry-picking data to give the impression of a scary trend.

Share the Facts Widget Embed

Want to embed fact-checks like this in your articles and blog posts? Contact us for the easy instructions.


Six Trump claims you’re likely to hear in Cleveland

Politicians love talking points. The scripted lines provide consistency for campaign messages and quotes that are often irresistible to journalists. Talking points are used repeatedly, even by a candidate like Donald Trump, who is known to stray from his script.

With the Republican National Convention about to start, we thought it would be helpful to show some of the stock lines we expect to hear and how the nation’s fact-checkers have judged their accuracy. It’s also an opportunity for us to showcase the Share the Facts widget, our new tool that summarizes fact-checks.

For the past several months, PolitiFact, The Washington Post and FactCheck.org have been using the widget, which was created by the Duke Reporters’ Lab and Jigsaw, a technology incubator within Alphabet, the parent company of Google.

The three fact-checking outlets have already created more than 1,000 widgets, mostly from the 2016 presidential campaign.

We encourage you to share the facts by posting the widgets on Facebook and Twitter, or even embedding them in articles or blog posts.

On Benghazi

According to The New York Times, the first night of the convention is set to focus on the attack in Benghazi, Libya, that killed four Americans during Democrat Hillary Clinton’s tenure as secretary of state. Here’s a look at one Trump claim that FactCheck.org found didn’t hold up to scrutiny.

Share the Facts Widget Embed

On Immigration

Another issue expected to be in the spotlight on Monday night is immigration – an especially hot topic for Trump, who has proposed “a total and complete shutdown of Muslims entering the United States until our country’s representatives can figure out what is going on.” He says the president has the authority to do it, and The Washington Post’s Fact Checker found he’s largely correct.

Share the Facts Widget Embed

The Post gave him one Pinocchio because the president does have “broad powers to deny admission of people or groups into the United States. But the power has not been tested in the way that Trump proposes.”

Trump earned four Pinocchios from the Post for a claim tying crime to immigration.

Share the Facts Widget Embed

On the Economy

The second night of the convention is scheduled to have an economic theme, so we expect to hear claims about taxes and trade. FactCheck.org has noted that Trump is fond of repeating that American taxpayers pay more than residents of other countries – which it found isn’t true (though the U.S. business tax rate does rank among the highest in the world).

Share the Facts Widget Embed

Trump also speaks frequently about the U.S. trade deficit with China, and he’s accused Clinton of making it worse. But PolitiFact found he’s assigning the blame in the wrong place, since the secretary of state has a small role in trade policy.

Share the Facts Widget Embed

On His Bid for the Nomination

Keeping with tradition, Trump is expected to speak on the last night of the convention – a speech that is sure to produce many claims for fact-checkers to examine. As he accepts the party’s nomination, he may repeat an assertion about his vote totals in the primary elections that PolitiFact found is mostly true.

Share the Facts Widget Embed

Stay tuned throughout the Republican convention for more opportunities to share the facts.

Want to embed fact-checks like this in your articles and blog posts? Contact us for the easy instructions.


At GlobalFact 3, a call for transparency and impartial fact-checking

My opening remarks at GlobalFact 3, the third annual meeting of the world’s fact-checkers, organized by Poynter’s International Fact-Checking Network and the Reporters’ Lab, held June 9-10, 2016, in Buenos Aires, Argentina.

It’s amazing how our group has grown. Our latest tally in the Duke Reporters’ Lab is 105 active sites around the world, which is up more than 60 percent from last year.

We’ve also seen marvelous growth in international collaborations. Alexios has organized some impressive check-a-thons for economic summits and other events, uniting more than a dozen fact-checkers for a single event. And a few months ago, Africa Check joined PolitiFact for an unprecedented partnership to check claims about global health and development.

Our fact-checks are increasingly having an impact. Politicians cite them in speeches and campaign commercials. One organization recently emailed its senior staff reminding them about the new Africa Check-PolitiFact project, cautioning them to be accurate in their statements. In Ireland, attention generated by a Journal.ie fact-check halted a viral social media campaign to “name and shame” Irish parliamentarians for their purportedly low attendance at a debate on mental health services.

More than 100 people attended the conference, which was held in Buenos Aires, Argentina.

Here in Argentina, Gabriela Michetti, vice-presidential candidate on the Macri ticket, was asked about a “Falso” she got from Chequeado. She replied, “I saw that on Chequeado. Which is why we corrected ourselves and never repeated it.”

Our audiences are growing. In the United States, the three big fact-checkers are all reporting record-breaking traffic. A debate article by FactCheck.org got more than 1.8 million page views on the site and partners such as MSN.com.

In the United States, we have a presidential candidate named Donald Trump — perhaps you have heard of him — who has shown why fact-checking is so important. Some pundits have said his disregard for facts shows we live in a “post-fact” era when facts no longer matter. But I think it shows a more positive story: we know about Donald Trump’s falsehoods because of the tremendous work of a growing army of fact-checkers.

We’ve reached a point where fact-checking is no longer a novelty. It’s no longer something that we have to explain to the people we’re checking. It’s now a mature form of journalism — and an expected part of how news organizations cover political campaigns and government.

But now that fact-checking has matured, it’s time to make sure we push our journalism to the next level. To maintain our status as trusted sources, we need to make sure that our work is rock solid. Our fact-checks must be thoroughly researched using the most independent sources available. Our writing needs to be clear and concise.

We need to show that we do not play favorites. We need to be impartial and apply the same standards to everyone we check. And we need to check everyone.  As Rem Rieder wrote in USA Today in a column this week that mentioned our meeting, for fact-checking to work, “it has to be an equal opportunity endeavor, strictly nonpartisan.”

In the past year, the students and colleagues who maintain our fact-checking database have come across a couple of sites that primarily check one party in their political system. That’s not fact-checking; that’s advocacy. To be a reputable fact-checker, you must check all the players in your political system.

Fact-checkers also need to be transparent in our work. We need to explain how we choose statements to check and how our ratings work. We need to reveal our sources and be clear how we reached our conclusions.

We also need to be transparent about the funding and structure of our organizations. We need to explain who gives us money and reassure our readers and consumers that we are not political activists.

We also need to continue to expand our audiences. I continue to be surprised by the relatively limited use of fact-checking on television. We should seek more partnerships with TV networks and show them that fact-checking makes great TV. You will love hearing from our keynote speaker, Natalia Hernández Rojo, who does some of the best TV fact-checks in the world for La Sexta’s El Objetivo in Spain. We can all learn a lot from Natalia.

Finally, I want to conclude with a suggestion. In catching up with many of you in the past couple of days I have realized that I have not done enough to follow your work. So I’m going to set a new goal to read one fact-check every day. I’ll randomly choose a site from our Reporters’ Lab database and read the most recent one.

I encourage you to do the same thing — a fact-check a day. It’s a new way that we can continue to build our community. By reading each other’s work, we can learn about each other and improve our work.

It’s a wonderful time to be in our movement. Fact-checking keeps growing and it has become a powerful force that informs democracies around the world. We need to maintain that momentum and make sure that our work is the best it can be.


New Share the Facts widget helps facts – rather than falsehoods – go viral

The Duke Reporters’ Lab is introducing Share the Facts, a widget that provides a new way for readers to share fact-check articles and spread them virally across the Internet.

The compact Share the Facts box summarizes the claim being checked and the fact-checker’s conclusion in a mobile-friendly format. The widgets have a consistent look but can be customized with the fact-checkers’ logos and ratings such as Pinocchios or the Truth-O-Meter. The standardization allows readers to recognize fact-checking wherever they come across it on the web, and lets them share the boxes on social media or embed them in articles and blog posts.

The widget summarizes fact-checks and allows readers to click to the original article.

Fact-checkers can create Share the Facts boxes using a simple template developed by the Reporters’ Lab. The form generates the HTML of the box, which can be pasted into content management systems or embedded in the same way as tweets. Share the Facts boxes are also fully machine-readable, enabling new ways of assembling automated collections of fact-check findings from across the Internet. For example, someone could set up a page that compiles Share the Facts boxes from a single event or a particular candidate.
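To illustrate what "machine-readable" makes possible, here is a hypothetical Python sketch that pulls the claim, rating and link out of an embedded box. The sample markup and the `itemprop` field names are assumptions for illustration only, not the actual Share the Facts schema.

```python
from html.parser import HTMLParser

# Hypothetical embed markup; the real Share the Facts schema may differ.
# We assume the box annotates its fields with itemprop-style attributes,
# the general pattern machine-readable embeds use.
SAMPLE_EMBED = """
<div itemscope itemtype="http://schema.org/ClaimReview">
  <span itemprop="claimReviewed">America is one of the highest-taxed nations.</span>
  <span itemprop="reviewRating">False</span>
  <a itemprop="url" href="http://example.org/fact-check/123">Read More</a>
</div>
"""

class FactCheckExtractor(HTMLParser):
    """Collects itemprop-annotated fields from an embedded fact-check box."""
    def __init__(self):
        super().__init__()
        self.fields = {}
        self._current = None  # itemprop whose text we are waiting for

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        prop = attrs.get("itemprop")
        if prop == "url":
            self.fields["url"] = attrs.get("href")
        elif prop:
            self._current = prop

    def handle_data(self, data):
        if self._current:
            self.fields[self._current] = data.strip()
            self._current = None

parser = FactCheckExtractor()
parser.feed(SAMPLE_EMBED)
```

A crawler built this way could collect every box published during a debate and group the results by speaker, without any arrangement with the individual fact-checking sites.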

Share the Facts will be helpful to columnists and bloggers because they’ll be able to compile and display several boxes for a debate or a candidate the same way they embed tweets.

Share the Facts was developed by The Reporters’ Lab and Jigsaw, a technology incubator within Alphabet, the parent company of Google.

The widgets are customized with the logo of the fact-checking site.

The widget has been tested in the past few weeks by The Washington Post, PolitiFact and FactCheck.org. The Reporters’ Lab has been incorporating feedback from those sites and will be making the widget available to other fact-checking sites this spring and summer.

“We are excited to participate in the Share the Facts project,” said Eugene Kiely, director of FactCheck.org. “It gives voters the ability to more easily share fact-checking stories and find fact-checking stories.”

Glenn Kessler, the editor and chief writer of The Washington Post’s Fact Checker, said it “will be a terrific tool for readers to share the results of our fact-checking. In this exciting, fact-challenged campaign year, I expect it will expand the reach and impact of our work.”

For articles from FactCheck.org and other sites that don’t use rating systems, the widget can include a short text explaining the conclusion.

Said Aaron Sharockman, the executive director of PolitiFact: “Share the Facts is part of the antidote to the massive spread of misinformation. We all know how quickly falsehoods can spread on the Internet. Now readers have a simple tool to fight back with facts.”

For more information, go to www.sharethefacts.org.


At Tech & Check, some new ideas to automate fact-checking

Last week, journalists and technologists gathered at Duke to dream up new ways that automation could help fact-checking.

The first Tech & Check conference, sponsored by the Duke Reporters’ Lab and Poynter’s International Fact-Checking Network, brought together about 50 journalists, students and computer scientists. The goal was to showcase existing projects and inspire new ones.

At Tech & Check, groups of students, journalists and technologists dreamed up new ideas to automate fact-checking.

The participants included representatives of Google, IBM, NBC News, PolitiFact, Full Fact, FactCheck.org and WRAL-TV. From the academic side, we had faculty and Ph.D. students from Duke, the University of North Carolina, the University of Texas at Arlington, Indiana University and the University of Michigan.

The first day featured presentations about existing projects that automate some aspect of fact-checking; the second day, attendees formed groups to conceive new projects.

The presentations showcased a wide variety of tools and research projects. Will Moy of the British site Full Fact did a demo of his claim monitoring tool, which tracks the frequency of talking points, showing how often politicians said a phrase over time. Naeemul Hassan of the University of Texas at Arlington showed ClaimBuster, a project I’ve worked on, that can ingest huge amounts of text and identify factual claims that journalists might want to check.
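To make the claim-spotting idea concrete, here is a toy check-worthiness scorer in Python. It is only a sketch of the kind of surface signals such tools look for (numbers, comparatives); it is not ClaimBuster's actual method, which relies on a trained classifier rather than hand-picked rules.

```python
import re

# Toy heuristics: statistics and comparisons suggest a checkable claim.
NUMERIC = re.compile(r"\b\d[\d,.]*\s*(percent|%)?", re.IGNORECASE)
COMPARATIVE = ("highest", "lowest", "more than", "less than",
               "increased", "decreased")

def checkworthiness(sentence: str) -> float:
    """Score 0..1: sentences with statistics and comparisons rank higher."""
    score = 0.0
    if NUMERIC.search(sentence):
        score += 0.5
    if any(word in sentence.lower() for word in COMPARATIVE):
        score += 0.3
    if "will" in sentence.lower().split():  # predictions are less checkable
        score -= 0.2
    return max(0.0, min(1.0, score))

sentences = [
    "Thank you all for being here tonight.",
    "Household incomes are down more than $4,000 since the year 2000.",
    "We will make America great again.",
]
# Rank a speech transcript so fact-checkers see the most checkable lines first.
ranked = sorted(sentences, key=checkworthiness, reverse=True)
```

Run over a full transcript, even crude scoring like this can surface the handful of statistical claims buried in thousands of words of applause lines.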

IBM’s Ben Fletcher showed one of the company’s new projects known as Watson Angles, a tool that extracts information from Web articles and distills it into a summary that includes key players and a timeline of events. Giovanni Luca Ciampaglia, a researcher at Indiana University, showed a project that uses Wikipedia to fact-check claims.

On the second day, we focused on the future. The attendees broke into groups to come up with new ideas for research. The groups had 75 minutes to create three ideas for tools or further research. The projects showed the many ways that automation can help fact-checking.

One promising idea was dubbed “Parrot Score,” a website that could build on the approach that Full Fact is exploring for claim monitoring. It would track the frequency of claims and then calculate a score for politicians who use canned phrases more often. Tyler Dukes, a data journalist from WRAL-TV in Raleigh, N.C., said Parrot Score could be a browser extension that showed the origin of a claim and then tracked it through the political ecosystem.

Despite the focus on the digital future of journalism, we used Sharpies and a lot of Post-It notes.

Two teams proposed variations of a “Check This First” button that would allow people to verify the accuracy of a URL before they post it on Facebook or in a chat. One team dubbed it “ChatBot.” Clicking it would bring up information that would help users determine if the article was reliable.

Another team was assigned to focus on ways to improve public trust in fact-checkers. The team came up with several interesting ideas, including more transparency about the collective ratings for individual writers and editors as well as a game app that would simulate the process that journalists use to fact-check a claim. The app could improve trust by giving people an opportunity to form their own conclusions as well as demonstrating the difficult work that fact-checkers do.

Another team focused on tools for fact-checkers and came up with some interesting ideas. One would automatically detect when journalists were examining a claim they had checked before. Another would be something of a “sentence finisher” that, when a journalist began typing something such as “The unemployment rate last month…,” would finish the sentence with the correct number.
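The "sentence finisher" could be as simple as a prefix lookup against a table of verified statistics. A minimal Python sketch of that idea, with placeholder phrases and figures rather than real data:

```python
# Minimal sketch of the "sentence finisher" idea: match a partially typed
# sentence against a small table of verified statistics. The phrases and
# figures below are placeholders for illustration, not real data.
VERIFIED_STATS = {
    "the unemployment rate last month": "was 4.9 percent",
    "the trade deficit with china": "was about $366 billion in 2015",
}

def finish_sentence(partial):
    """Return a completion if the typed text matches a known claim prefix."""
    key = partial.lower().strip().rstrip(" .…")
    if not key:
        return None
    for prefix, completion in VERIFIED_STATS.items():
        # Match whether the journalist has typed part of the phrase
        # or the whole phrase plus trailing punctuation.
        if key.startswith(prefix) or prefix.startswith(key):
            return completion
    return None
```

A real tool would need a much larger, regularly updated table and fuzzier matching, but the core interaction is just autocomplete backed by vetted numbers.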

The conference left me quite optimistic about the potential for more collaboration between computer scientists and fact-checkers. Things that never seemed possible, such as checking claims against the massive Wikipedia database, are increasingly doable. And many technologists are interested in doing research and creating products to help fact-checking.


Public radio listeners want more fact-checking in election coverage

What do politically minded news junkies want from their election coverage? If they’re anything like NPR’s audience, they want fact-checking.

Last November, when the public radio network asked a sample of its audience about their interest in different kinds of political stories, 96 percent said they wanted stories that verified what the candidates said. Seventy-seven percent said they were very interested in fact-checks and 19 percent said they were somewhat interested.

Kermit covers breaking news about Humpty Dumpty in a report for “Sesame Street News.” But a survey suggests public media audiences would rather have fact-checking. (Screen shot via sesamestreet.org)

But the survey has yet to translate into much on-air fact-checking, especially at the state and local level, where public media stations are hardly playing a leading role in the growing trend of checking politicians’ statements.

The Reporters’ Lab international database of fact-checkers currently counts more than 40 active projects in the United States. Of those, 14 are affiliated with radio or TV news companies. But only two are public broadcasters — PolitiFact California, which is run by Capital Public Radio in Sacramento, and NPR, which launched a new fact-checking feature called “Break It Down” last fall. A third, Minnesota Public Radio’s PoliGraph, has been inactive since June. Beyond public radio and public television, other non-profit media fact-checkers at the local level include Michigan Truth Squad from the Center for Michigan’s Bridge magazine and the digital news site Voice of San Diego.

The low public media numbers are surprising since NPR’s audience research found that few other political news stories resonated as much with its listeners as fact-checks do. Only actual election results did better in the survey, with 97 percent saying they cared about those stories, while 95 percent said they were interested in reports comparing candidates’ positions.

By contrast, less than half of those who answered had much interest in the latest polls or fundraising reports — two staples of most political reporting diets.

The PowerPoint slide below breaks down the survey answers in more detail. The 362 people who answered were selected from a much larger pool of loyal NPR listeners — people from the network’s radio and digital audience who volunteer to provide feedback. My former colleagues at NPR, where I previously was managing editor for digital news, kindly shared the audience feedback with the Reporters’ Lab, which tracks the growth and impact of fact-checking.

Answers from a November 2015 survey asking an NPR audience panel about election coverage.

The fact-checking numbers explain why NPR expanded its occasional fact-checking efforts for the 2016 election cycle. The numbers also reinforce the answers NPR heard four years ago, when it asked its audience a similar question and got a strikingly similar answer.

Yet even with such consistent interest, public broadcasters have taken a back seat to other media outlets in trying to verify political claims — a topic I discussed in an interview on a recent episode of The Pub, a weekly podcast about the public media business.

In truth, fact-checking is a tough beat for typical public media stations, especially those with limited reporting and editing staffs. The reporting process is time-consuming and intensive. And the results are likely to anger the most partisan elements of the audience. That’s no easy thing when you depend on listener and viewer donations and, in some communities, taxpayer support.

But there are upsides for local stations, too, including the ability to concentrate limited journalism resources on stories the audience says it eagerly wants. Fact-checks can also distinguish public broadcasters’ election coverage in competitive media markets — unless the competition distinguishes itself first.

For now, commercial TV news outlets seem to be beating public broadcasters to those benefits. Nine of the active state and local fact-checking operations in the United States are affiliated with commercial TV stations. That includes four new PolitiFact state affiliates (Arizona, Colorado, Nevada and Ohio) that launched or relaunched in recent months as part of a partnership between the national fact-checker and the Scripps TV Station Group.

Commercial TV faces some of the same practical challenges that keep many public media outlets from taking on the truth beat. If anything, given the dependence on political advertising dollars at most commercial TV stations, you might even think those outlets would have far more to lose than public broadcasters. But the public broadcasters seem to be the ones who are losing out.

(As is only appropriate for an article about fact-checking, this post was updated shortly after it was published to correct two numbers: In NPR’s survey 77 percent said they were very interested in fact-checks and 19 percent said they were somewhat interested. Corrections always welcome here!)


Global fact-checking up 50% in past year

The high volume of political truth-twisting is driving demand for political fact-checkers around the world, with the number of fact-checking sites up 50 percent since last year.

The Duke Reporters’ Lab annual census of international fact-checking currently counts 96 active projects in 37 countries. That’s up from 64 active fact-checkers in the 2015 count. (Map and List)

A bumper crop of new fact-checkers across the Western Hemisphere helped increase the ranks of journalists and government watchdogs who verify the accuracy of public statements and track political promises. The new sites include 14 in the United States and two in Canada, as well as seven additional fact-checkers in Latin America. There also were new projects in 10 other countries, from North Africa to Central Europe to East Asia.

With this dramatic growth, politicians in at least nine countries will have their statements scrutinized before their voters go to the polls for national elections this year. (In 2015, fact-checkers were on the beat for national elections in 11 countries.)

Active fact-checkers by continent in our latest tally:
Africa: 5
Asia: 7
Australia: 2
Europe: 27
North America: 47
South America: 8

More than a third of the currently active fact-checkers (33 of 96) launched in 2015 or even in the first weeks of 2016.

The Reporters’ Lab also keeps tabs on inactive fact-checking ventures, which currently number 47. Some of them assure us they are in suspended animation between election cycles — a regular pattern that keeps the fact-checking tally in continuous flux. At least a few inactive fact-checkers in the United States have been “seasonal” projects in past elections. The Reporters’ Lab regularly updates the database, so the tallies reported here are all as of Feb. 15, 2016.

Growing Competition

U.S. fact-checkers dominate the Reporters’ Lab list, with 41 active projects. Of these, three-quarters (30 of 41) are focused on the statements of candidates and government officials working at the state and local level. And 15 of those are among the local media organizations that have joined an expanding network of state affiliates of PolitiFact, the Pulitzer Prize-winning venture started nine years ago by the Tampa Bay Times in St. Petersburg, Florida.

(Editor’s Note: PolitiFact founder Bill Adair is a Duke professor who oversees the Reporters’ Lab work. The Lab is part of the DeWitt Wallace Center for Media & Democracy at Duke’s Sanford School of Public Policy.)

In the past year, PolitiFact’s newspaper and local broadcast partners have launched new regional sites in six states (Arizona, California, Colorado, Iowa, Missouri and Nevada) and reactivated a dormant one in a seventh state (Ohio).

In some cases, those new fact-checkers are entering competitive markets. So far this election year, at least seven U.S. states have more than one regional fact-checker and in California there are three.

With the presidential campaign underway, competition also is increasing at the national level, where longstanding fact-checkers such as FactCheck.org, PolitiFact and the Washington Post Fact Checker now regularly square off with at least eight teams of journalists who are systematically scrutinizing the candidates’ words. And with more and more newsrooms joining in, especially on debate nights, we will be adding to that list before the pixels dry on this blog post.

Competition is on the rise around the world, too. In 10 other countries, voters have more than one active fact-checker to consult.

The tally by country:
U.S.: 41
France: 5
U.K.: 4
Brazil: 3
Canada: 3
South Korea: 3
Spain: 3
Argentina: 2
Australia: 2
Tunisia: 2*
Ukraine: 2

* One organization in Tunisia maintains two sites that track political promises (a third site operated by the same group is inactive).

The growing numbers have even spawned a new global association, the International Fact-Checking Network hosted by the Poynter Institute, a media training center in St. Petersburg, Florida.

Promises, Promises

Some of the growth has come in the form of promise-tracking. Since January 2015, fact-checkers launched six sites in five countries devoted to tracking the status of pledges candidates and party leaders made in political campaigns. In Tunisia, there are two new sites dedicated to promise-tracking — one devoted to the country’s president and the other to its prime minister.

There are another 20 active fact-checkers elsewhere that track promises, either as their primary mission or as part of a broader portfolio of political verification. Added together, more than a quarter of the active fact-checkers (26 of 96, including nine in the United States) do some form of promise-tracking.

The Media Is the Mainstream — Especially in the U.S.

Nearly two-thirds of the active fact-checkers (61 of 96, or 64 percent) are directly affiliated with a news organization. However, this breakdown reflects the dominant business structure in the United States, where 90 percent of fact-checkers are part of a news organization. That includes nine of 11 national projects and 28 of 30 state/local fact-checkers.

Media Affiliations of 41 Active U.S. Fact-Checkers
Newspaper: 18
TV: 10
TV + Newspaper: 1
Radio: 3
Digital: 3
Student Newspaper: 1
Not Affiliated: 4

The story is different outside the United States, where less than half of the active fact-checking projects (24 of 55, or 44 percent) are affiliated with news organizations.

The other fact-checkers are typically associated with non-governmental, non-profit and activist groups focused on civic engagement, government transparency and accountability. A handful are partisan, especially in conflict zones and in countries where the lines between independent media, activists and opposition parties are often blurry and where those groups are aligned against state-controlled media or other governmental and partisan entities.

Many of the fact-checkers that are not affiliated with news organizations have journalists on their staff or partner with professional news outlets to distribute their content.

All About Ratings

More than three out of four active U.S. fact-checkers (33 of 41, or 81 percent) use rating systems, such as scales that range from true to false or devices like the Washington Post’s “Pinocchios.” That pattern holds globally, where 76 of 96, or 79 percent, use ratings.

This report is based on research compiled in part by Reporters’ Lab student researchers Jillian Apel, Julia Donheiser and Shaker Samman. Alexios Mantzarlis of the Poynter Institute’s International Fact-Checking Network (and a former managing editor of the Italian fact-checking site Pagella Politica) also contributed to this report, as did Reporters’ Lab director Bill Adair, Knight Professor for the Practice of Journalism and Public Policy at Duke University (and founder of PolitiFact).

Please send updates and additions to Reporters’ Lab co-director Mark Stencel (mark.stencel@duke.edu).


Fact-checkers spin up for presidential debates

Fact-checking season is underway, and some new players are getting into the act.

FiveThirtyEight, NPR, Vox and Politico unveiled new fact-checking features for the presidential debates that began last month. Others revived their truth-seeking teams, joining usual suspects such as FactCheck.org, the Washington Post and PolitiFact in their perennial efforts to verify what politicians are saying.

The fact-checkers often focus on the same claims, but coverage from last week’s Republican debates in California showed the varying approaches they use to explain their findings. In its coverage, CNN rated statements on a scale similar to PolitiFact’s Truth-O-Meter, while the New York Times and NPR chose to work without a grading system, similar to the FactCheck.org model.

CNN said its Fact-Checking Team “picked the juiciest statements, analyzed them, consulted issue experts and then rated them.”

As in last month’s first debates, hosted by Fox News, the Post set aside its four-Pinocchio scale, offering a single scrolling summary of multiple fact-checks before following up with additional posts in its usual style. Politico’s Wrongometer, CNN and NPR used similar models. Others posted individual items about specific claims or packaged a number of individually linkable fact-checks together as a combined reading experience. There also were efforts at real-time fact-checking while the debates were underway.

Here’s a roundup from last week’s two-round Republican debate, which included a primetime showdown with 11 candidates and an earlier session with four others:

CNN: The debate host’s “Fact-Checking Team” checked 16 claims and awarded them rulings from “True” to “It’s Complicated” to “False.” The “It’s Complicated” rating was awarded to Kentucky Sen. Rand Paul, who said Saudi Arabia was not accepting any Syrian refugees, and Texas Sen. Ted Cruz, for statements he made regarding the Iran nuclear agreement.

NPR: The radio network fact-checked four claims as part of its new “Break it Down” segment — all involving statements by or in response to Donald Trump. The claims ranged from the real estate developer’s lobbying for casinos in Florida to the safety of vaccination. NPR didn’t rate the claims on a scale and instead explained the validity of each statement.

New York Times: The Times examined 11 claims, on topics from Planned Parenthood to immigration policy. Like NPR, the Times did not use a rating system. It did, however, post its fact-checks during the debate as part of its live coverage. Many of the checks focused on Trump and Ben Carson, the retired pediatric neurosurgeon whose outsider status had helped him climb in the polls after the August debate on Fox News.

Politico: The Agenda, Politico’s policy channel, applied its Wrongometer to 12 claims, focusing on topics such as Trump’s bankruptcy and President Obama’s nuclear agreement with Iran. The group also scrutinized former Hewlett-Packard CEO Carly Fiorina’s remarks about Syria and a much-repeated Columbine myth. Despite its Wrongometer header, Politico’s fact-checkers do not use a rating system.

Vox: Rather than the relatively short, just-the-facts summations most other fact-checkers posted, Vox penned full-length commentaries on a handful of claims. Two featured statements by Fiorina (one about Planned Parenthood, linked here, and another on her time at HP), and one checked the candidates’ views on vaccinations. No rating was used.

AP: The news service fact-checked five claims, including statements from Fiorina on Planned Parenthood and the effects of Trump’s plan for an economic “uncoupling” from China. The AP did not use a system to rate these claims.

FiveThirtyEight: The site did its fact-checking in its debate live blog. FiveThirtyEight’s staff did not use any sort of rating system in its real-time reviews of the candidates’ statements, such as Trump’s claim about Fiorina’s track record as CEO of HP and President Obama’s likability overseas.

FactCheck.org: The fact-checkers based at the University of Pennsylvania’s Annenberg Public Policy Center reviewed 14 claims from the debates. FactCheck.org did not rate the claims, which ranged from former Arkansas Gov. Mike Huckabee’s statements about Hillary Clinton’s email scandal to Trump’s comments on Wisconsin’s budget under Gov. Scott Walker.

PolitiFact: Run by the Tampa Bay Times, Washington-based PolitiFact has fact-checked 15 debate claims so far, awarding rulings from “Pants on Fire” to “True.” The “Pants on Fire” rating went to Carson, who said that many pediatricians recognize the potential harm from too many vaccines. PolitiFact also awarded a “True” rating to Fiorina’s statement regarding the potency of marijuana. While the debate was underway, the PolitiFact staff tapped its archive of previous rulings to live-blog the event.

The Washington Post Fact Checker: The Post’s two-person fact-checking team reviewed 18 claims in a roundup that included Trump’s denial that he’d ever gone bankrupt and New Jersey Gov. Chris Christie’s story about being named U.S. attorney by President George W. Bush on Sept. 10, 2001. The fact-checkers also posted versions of those items in the Post’s debate-night live blog. Following its usual practice for debates, the Post did not use its Pinocchio system to rate these claims. Since the debate, however, the Post has added more Pinocchio-based fact-checks, including items on Fiorina’s criticisms of veterans’ health care (two Pinocchios) and Rubio’s comments on North Korea’s nuclear capabilities (one Pinocchio). Notably, both of those items were suggested by Post readers.


Students selected for research work at Duke Reporters’ Lab

Student researchers play leading roles at the Duke Reporters’ Lab, experimenting with new forms of storytelling and exploring the state of newsroom innovation.

With the start of a new academic year, a team of eight students is donning white lab coats to help us map the future of journalism. Their involvement is one of the things that makes the Lab such a lively place (especially for this Duke newcomer).

These students will investigate ways to create new “structured” story forms that allow journalists to present information in engaging, digital-friendly ways. They also will track and help foster the work of political fact-checkers that are holding politicians around the world accountable for their statements and their promises.

We’ve just completed hiring our 2015-2016 team:

Natalie Ritchie: Over the summer, Natalie was a reporter for Structured Stories NYC — the Reporters’ Lab effort to test a new storytelling tool in the wilds of New York politics. She is co-editor in chief of the Duke Political Review. A public policy senior with a focus on international affairs, Natalie previously interned with the Senate Foreign Relations Committee, worked as a student communications assistant for the Duke Global Health Institute, and taught English to Iraqi, Palestinian, and Syrian refugees in Jordan. In addition, she interned for Republican Sen. Bob Corker of Tennessee, her home state.

Ryan Hoerger: The sports editor of The Chronicle, Duke’s student newspaper, is a senior from California double-majoring in public policy and economics. Last summer Ryan covered financial markets as an intern for Bloomberg. Before that, he interned for Duke magazine and conducted policy research during a summer stint at FasterCures. He is currently finishing up an undergraduate honors thesis that examines federal incentives for pharmaceutical research and development.

Shannon Beckham: Shannon, a public policy senior from Arizona, has seen how political fact-checking works from both sides of the process, having interned in the White House speechwriting office and at PolitiFact, the Pulitzer-winning service run by the Tampa Bay Times. She worked for the Chequeado fact-checking site in Buenos Aires, where she assisted with a 2014 meeting of Latin American fact-checkers. At the Reporters’ Lab, she helped start our database of fact-checking sites and organize the first Global Fact-Checking Summit last year in London.

Gautam Hathi: A junior in computer science who grew up near the Microsoft campus in Redmond, Wash., Gautam is already working at the intersection of news and technology. Having interned for Google and 3Sharp, he is now the digital content director for The Chronicle at Duke. He previously was The Chronicle’s health and science editor and is a contributing editor for the Duke Political Review.

Shaker Samman: Shaker is a public policy junior from Michigan. At the Reporters’ Lab, he worked on fact-checking and structured journalism prototypes and co-authored a PolitiFact story on the North Carolina Senate race with Lab co-director Bill Adair. He has interned as a reporter for the Tampa Bay Times in Florida and The Times Herald in Port Huron, Mich., where he also worked on his high school radio station.

Claire Ballentine: Claire is head of the university news department at The Chronicle. She began working for the Lab last year, helping update our database of political fact-checkers. The sophomore from Tennessee also has blogged for Her Campus and worked as an editing intern for the Great Smoky Mountains National Park Association. She was the editor-in-chief of her high school yearbook.

Jillian Apel: Jill brings an eye for visual storytelling to the Lab. A sophomore from California with a passion for writing as well, she was the managing editor of the student newspaper at the Brentwood School in Los Angeles.

Julia Donheiser: Julia’s data savvy comes via a social science research project she started as a student at the Bronx High School of Science. With guidance from a pair of educational psychologists, she crunched statewide numbers from school districts across New York to investigate the effects of various social factors on diagnosis rates for autism and learning disabilities. Now a freshman at Duke, she worked on the student newspaper at her high school. She also wrote a food blog that will make you hungry.


Study explores new questions about quality of global fact-checking

How long should fact-checks be? How should they attribute their sources — with links or a detailed list? Should they provide a thorough account of a fact-checker’s work or distill it into a short summary?

Those are just a few of the areas explored in a fascinating new study by Lucas Graves, a journalism professor at the University of Wisconsin. He presented a summary of his research last month at the 2015 Global Fact-Checking Summit in London.

Lucas Graves

The pilot project represents the first in-depth qualitative analysis of global fact-checking. It was funded by the Omidyar Network as part of its grant to the Poynter Institute to create a new fact-checking organization. The study, done in conjunction with the Reporters’ Lab, lays the groundwork for a more extensive analysis of additional sites in the future.

The findings reveal that fact-checking is still a new form of journalism with few established customs or practices. Some fact-checkers write long articles with lots of quotes to back up their work. Others distill their findings into short articles without any quotes. Graves did not take a position on which approach is best, but his research gives fact-checkers some valuable data to begin discussions about how to improve their journalism.

Graves and three research assistants examined 10 fact-checking articles from each of six different sites: Africa Check, Full Fact in the United Kingdom, FactChecker.in in India, PolitiFact in the United States, El Sabueso in Mexico and UYCheck in Uruguay. The sites were chosen to reflect a wide range of global fact-checking, as this table shows:

Graves and his researchers found a surprising range in the length of the fact-checking articles. UYCheck from Uruguay had the longest articles, with an average word count of 1,148, followed by Africa Check at 1,009 and PolitiFact at 983.

The shortest were from Full Fact, which averaged just 354 words. The brevity reflects a very different approach by the British team: rather than laying out factual claims and backing them up with extensive quotes the way most other sites do, Full Fact distills its findings into short summaries.


Graves also found a wide range of data visualization in the articles sampled for each site. For example, Africa Check had three data visualizations in its 10 articles, while there were 11 in the Indian site FactChecker.in.

Graves found some sites used lots of data visualizations; others used relatively few.

The Latin American sites UYCheck and El Sabueso used the most infographics, while the other sites relied more on charts and tables.

Graves also found a wide range in the use of web links and quotes. Africa Check averaged the highest total of web links and quotes per story (18), followed by 12 for PolitiFact, while UYCheck and El Sabueso had the fewest (8 and 5, respectively). Full Fact had no quotes in the 10 articles Graves examined but used an average of 9 links per article.

Graves and his researchers also examined how fact-checkers use links and quotes — whether they were used to provide political context about the claim being checked, to explain the subject being analyzed or to provide evidence about whether the claim was accurate. They found some sites, such as Africa Check and PolitiFact, used links more to provide context for the claim, while UYCheck and El Sabueso used them more for evidence in supporting a conclusion.

The analysis of quotes yielded some interesting results. PolitiFact used the most in the 10 articles — 38 quotes — with its largest share from evidentiary uses. Full Fact used the fewest (zero), followed by UYCheck (23) and El Sabueso (26).

The study also examined what Graves called “synthetic” sources — the different authoritative sources used to explain an issue and decide the accuracy of a claim. This part of the analysis distilled a final list of institutional sources for each fact-check, regardless of whether sources were directly quoted or linked to. Africa Check led the list with almost nine different authoritative sources considered on average, more than twice as many as FactChecker.in and UYCheck. Full Fact, UYCheck and El Sabueso relied mainly on government agencies and data, while PolitiFact and Africa Check drew heavily on NGOs and academic experts in addition to official data.

The study raises some important questions for fact-checkers to discuss. Are we writing our fact-checks too long? Too short?

Are we using enough data visualizations to help readers? Should we take the time to create more infographics instead of simple charts and tables?

What do we need to do to give our fact-checks authority? Are links sufficient? Or should we also include quotes from experts?

Over the next few months, we’ll have plenty to discuss.
