The number of fact-checkers around the world: 156… and growing

Collaboration, aggregation and networks add to the Reporters' Lab's ongoing survey of fact-checking projects in more than 50 countries.

By Mark Stencel – August 7, 2018

The number of active fact-checking projects around the world now stands at 156, with steady growth driven by expanding networks and new media partnerships that focus on holding public figures and organizations accountable for what they say.

And elections this year in the United States and around the globe mean that number will likely increase even more by the time the Duke Reporters’ Lab publishes its annual census early next year. Our map of the fact-checkers now shows them in 55 countries.

There were 149 active fact-checking ventures in the annual summary we published in February, up from 44 when we started this count in 2014. And after this summer’s Global Fact summit in Rome — where the attendee list topped 200 and the waitlist was more than three times as long — we still have plenty of other possible additions to vet and review in the coming weeks. So check back for updates.

Among the most recent additions is Faktiskt, a Swedish media partnership that aggregates reporting from five news organizations — two newspapers, two public broadcasters and a digital news service. We’ve seen other aggregation partnerships like this elsewhere, such as Faktenfinder in Germany and SNU FactCheck in South Korea. (This is a different model from the similarly named Faktisk partnership in Norway, where six news organizations operate a jointly funded fact-checking team whose work is made freely available as a public service to other media in the country.)

As we prepare for our annual fact-checking census, we plan to look more closely at the output of each contributor to these aggregation networks to see which of them we should also count as standalone fact-checkers. Our goal is to represent the full range of independent and journalistic fact-checking, including clusters of projects in particular countries and local regions, as well as ventures that find ways to operate across borders.

Along those lines, we also added checkmarks to our map for Africa Check’s offices in Kenya and Nigeria. We had done the same previously for the South Africa-based project’s office in Senegal, which covers francophone countries in West Africa. The new additions have been around a while too: The Kenya office has been in business since late 2016 and the Nigeria office opened two months later.

Meanwhile, our friends at Africa Check regularly help us identify other standalone fact-checking projects, including two more new additions to our database: Dubawa in Nigeria and ZimFact in Zimbabwe. The fast growth of fact-checking across Africa is one reason the International Fact-Checking Network’s sixth Global Fact summit will be in Cape Town next summer.

One legacy of these yearly summits is the IFCN’s code of principles, which has established an independent evaluation process to certify that each of its signatories adheres to those ethical and journalistic standards. Our database includes all 58 signatories, including the U.S.-based (but Belgium-born) hoax-busting site Lead Stories; Maldita’s “Maldito Bulo” (or “Damned Hoax”) in Spain; and the “Cek Fakta” section of the Indonesian digital news portal Liputan6. All three are among our latest additions.

There’s more to come from us. We plan to issue monthly updates as we try to keep our heads and arms around this fast-growing journalism movement. I’ll be relying heavily on Reporters’ Lab student researcher Daniela Flamini, who has just returned from a summer fact-checking internship at Chequeado in Argentina. Daniela takes over from recently graduated researcher Riley Griffin, who helped maintain our database for the past year.

Take a look at the criteria we use to select the fact-checkers we include in this database and let us know if you have any additions to suggest.


Reporters’ Lab joins N&O, UNC Reese News Lab on major fact-checking project

With grant from the N.C. Local News Lab Fund, partnership will expand non-partisan fact-checking throughout the state

By Catherine Clabby – August 1, 2018

The Duke Reporters’ Lab is joining McClatchy Carolinas and the UNC Reese News Lab in an ambitious project to expand non-partisan fact-checking throughout North Carolina.

With a $50,000 grant from the North Carolina Local News Lab Fund, the North Carolina Fact-Checking Project will build on the existing work at The News & Observer, add the work of student journalists and take advantage of new automated tools from the Duke Tech & Check Cooperative.

The project will evaluate statements by state and federal candidates in the 2018 election as well as lawmakers in the General Assembly session that begins in January 2019. The fact-checks will be produced by N&O journalists as part of PolitiFact North Carolina and will be made available for free to any news organization in the state for use online and in print.

The project will get support from the new TruthBuzz program of the International Center for Journalists (ICFJ), which is hiring an engagement fellow based in Raleigh to promote the North Carolina fact-checking.

The North Carolina Fact-Checking Project will put special emphasis on claims by politicians in rural parts of the state. Students in the Reporters’ Lab will scour news coverage and campaign ads for factual claims made by state, local and congressional candidates. The Lab will build new versions of its Tech & Check Alerts that use automated bots to find statements by politicians in social media that could be of interest to the North Carolina fact-checkers.

The Duke students and bots will provide daily suggestions of possible claims to The News & Observer, which will select which statements to research.

The UNC Reese News Lab will co-host a student seminar on fact-checking and help select a student journalist to work on the project. Representatives from Duke, TruthBuzz and the News & Observer will hold outreach sessions around the state to promote fact-checking and encourage news organizations to publish the project’s work.

About the partners:

The North Carolina Local News Lab Fund is a collaborative fund at the North Carolina Community Foundation established by a group of local and national funders who believe in the power of local journalism, local stories, and local people to strengthen our democracy.

The Duke Reporters’ Lab at the Sanford School of Public Policy is a center of research on fact-checking and automated journalism. The Lab tracks the growth of fact-checking around the world, conducts studies on important topics and develops tools to help journalists.

McClatchy Carolinas is the McClatchy division that publishes three newspapers in North Carolina, The News & Observer, The Charlotte Observer and The Herald-Sun. The N&O has a strong team of political reporters and has been the state’s PolitiFact partner for the last two years.

The Reese News Lab is an experimental media and research project based at the School of Media and Journalism at the University of North Carolina-Chapel Hill.

 


Catherine Clabby joins the Duke Reporters’ Lab

The veteran journalist will manage student research projects, including the Tech & Check Cooperative.

By Bill Adair – July 30, 2018

Catherine Clabby, an award-winning reporter and editor, has been named the new research and communications manager in the Duke Reporters’ Lab. In that role, Clabby will help direct student research on political fact-checking and automated journalism, including the Tech & Check Cooperative.

In addition to her work in the Lab, Clabby will teach Newswriting and Reporting (PJMS 367), a core course in the journalism program in the DeWitt Wallace Center for Media & Democracy.

Clabby is a veteran journalist who most recently covered environmental health topics for the North Carolina Health News. Before that, she was the senior editor of the E.O. Wilson Life on Earth biology book series and a senior editor at American Scientist magazine.

From 1994 to 2007, she was a reporter at the Raleigh News & Observer where she covered science, medicine and a variety of state and local topics, including a U.S. Senate race. She left the paper in 2007 to take a year-long Knight Science Journalism fellowship at MIT.

Clabby lives in Durham with her husband, Christoph Guttentag, Duke’s dean of undergraduate admissions. Their daughter is a college student in Massachusetts.

 


At Global Fact V: A celebration of community

More than 200 people attended the fifth meeting of the world's fact-checkers in Rome, which was organized by the International Fact-Checking Network.

By Bill Adair – June 25, 2018

My opening remarks at Global Fact V, the fifth annual meeting of the world’s fact-checkers, organized by the International Fact-Checking Network, held June 20-22 in Rome.

A couple of weeks ago, a photo from our first Global Fact showed up in my Facebook feed. Many of you will remember it: we were all crammed into a classroom at the London School of Economics. When we went outside for a group photo, there were about 50 of us.

To show how our conference has grown, I posted that photo on Twitter along with one from our 2016 conference that had almost twice as many people. I also posted a third photo that showed thousands of people gathered in front of the Vatican. I said that was our projected crowd for this conference.

I rate that photo Mostly True.

What all of our conferences have in common is that they are really about community. It all began in that tiny classroom at the London School of Economics when we realized that whether we were from Italy or the U.K. or Egypt, we were all in this together. We discovered that even though we hadn’t talked much before or in many cases even met, we were facing the same challenges — fundraising and finding an audience and overcoming partisanship.

It was also a really powerful experience because we got a sense of how some fact-checkers around the world were struggling under difficult circumstances — under governments that provide little transparency, or, much worse, governments that oppress journalists and are hostile toward fact-checkers.

Throughout that first London conference there was an incredible sense of community. We’d never met before, but in just a couple of days we formed strong bonds. We vowed to keep in touch and keep talking and help each other.

It was an incredibly powerful experience for me. I was at a point in my career where I was trying to sort out what I would do in my new position in academia. I came back inspired and decided to start an association of fact-checkers – and hold these meetings every year.

The next year we started the IFCN and Poynter generously agreed to be its home. And then we hired Alexios as the leader.

Since then, there have been two common themes. One you hear so often that it’s become my mantra: Fact-checking keeps growing. Our latest census of fact-checking in the Reporters’ Lab shows 149 active fact-checking projects and I’m glad to see that number keep going up and up.

The other theme, as I noted earlier, is community. I thought I’d focus this morning on a few examples.

Let’s start with Mexico, where more than 60 publishers, universities and civil society organizations have started Verificado 2018, a remarkable collaboration. It was originally focused largely on false news, but they’ve put more emphasis on fact-checking because of public demand. Daniel Funke wrote a great piece last week about how they checked a presidential debate.

In Norway, an extraordinary team of rivals has come together to create Faktisk, which is Norwegian for “actually” and “factually.” It launched nearly a year ago with four of the country’s biggest news organizations — VG, Dagbladet, NRK and TV 2 — and it’s grown since then. My colleague Mark Stencel likened it to The New York Times, The Washington Post and PBS launching a fact-checking project together.

 

At Duke, both of our big projects are possible because of the fact-checkers’ commitment to help each other. The first, Share the Facts and the creation of the ClaimReview schema, grew out of an idea from Glenn Kessler, the Washington Post Fact Checker, who suggested that Google put “fact-check” tags on search results.

That idea became our Duke-Google-Schema.org collaboration that created what many of you now use so search engines can find your work. And one unintended consequence: it makes automated fact-checking more possible. It all started because of one fact-checker’s sense of community.

Also, FactStream, the new app of our Tech & Check Cooperative, has been a remarkable collaboration between the big U.S. fact-checkers — the Post, FactCheck.org and PolitiFact. All three took part in the beta test of the first version, our live coverage of the State of the Union address back in January. Getting them together on the same app was pretty remarkable. But our new version of the app — which we’re releasing this week — is even cooler. It’s like collaboration squared, or collaboration to the second power!

It took Glenn’s idea, which created the Share the Facts widget, and combined it with an idea from Eugene Kiely, the head of FactCheck.org, who said we should create a new feature on FactStream that shows the latest U.S. widgets every day.

So that’s what we did. And you know what: it’s a great new feature that reveals new things about our political discourse. Every day, it shows the latest fact-checks in a constant stream and users can click through, driving new traffic to the fact-checking sites. I’ll talk more about it during the automated demo session on Friday. But it wouldn’t be possible if it weren’t for the commitment to collaboration and community by Glenn and Eugene.

We’ve got a busy few days ahead, so let’s get on with it. There sure are a lot of you!

As we know from the photographs: fact-checking keeps growing.


Meet the bot builders: How our student team is automating fact-checkers’ work

A team of Duke students is building tools that automate the most tedious task for fact-checkers: finding claims to check

By Julianna Rennie – April 30, 2018

In a sunny corner office at Duke University, four students are building bots to do tasks that are too tedious for humans.

The project is part of the Duke Tech & Check Cooperative, a $1.2 million project to automate fact-checking. The students spend up to 10 hours each week in the Reporters’ Lab, a room in the Sanford School of Public Policy decorated with movie posters from All the President’s Men, Spotlight and The Post.

The bots are computer programs that perform the tasks often done by college interns. The programs scour long transcripts and articles online and identify sentences that journalists might want to fact-check.

The students are an eclectic bunch: a data enthusiast who guzzles nutritional drinks; a Cameron Crazy who is spending his summer solving computer science problems; a Silicon Valley resident who writes code to help animals; and a snowboarder who helps run student businesses.

Asa Royal, the data lover

Asa Royal keeps track of his life in data. He tallies everything from the music he plays to the number of times he laughs at each television episode he watches.

Royal, a junior from St. Louis, Missouri, joined the Lab in September 2016, the height of the last election cycle, and spent four months helping with research on trends in campaign ads.

After deciding to major in computer science, Royal was tasked with figuring out how to automate the use of ClaimBuster, an algorithm developed at the University of Texas at Arlington that identifies sentences to check. His goal was to write programs that combed through dense material on the internet and submitted it to ClaimBuster without anyone having to read it.

“No one should ever read the Congressional Record,” he says. “There are about 200 pages produced per day. Nobody should be watching all 15 hours of CNN. These are problems we can solve.”

Royal built a bot that extracts content from CNN transcripts and runs it through ClaimBuster. Then, it generates a daily email of 15 checkable claims that is automatically sent to journalists. Now, he also gathers content from the Congressional Record.
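The shape of that pipeline can be sketched in a few lines of Python. This is our simplified illustration, not the Lab's actual code: the sentence splitter, the scoring stub (which stands in for a call to the ClaimBuster service) and the sample transcript are all invented.

```python
# Sketch of a transcript-to-alert pipeline: split the text into
# sentences, score each one, and keep the top 15 for a daily digest.
def split_sentences(transcript):
    # Naive splitter; a production pipeline would use an NLP library.
    return [s.strip() for s in transcript.split(".") if s.strip()]

def score_sentence(sentence):
    # Stand-in for a ClaimBuster API call: favor sentences with numbers.
    return 0.9 if any(ch.isdigit() for ch in sentence) else 0.1

def daily_digest(transcript, limit=15):
    scored = sorted(((score_sentence(s), s) for s in split_sentences(transcript)),
                    reverse=True)
    return [sentence for _, sentence in scored[:limit]]

transcript = ("Crime fell 12 percent last year. We love this country. "
              "The budget tops 4 trillion dollars.")
print(daily_digest(transcript))
```

The digest itself would then be formatted into the daily email the fact-checkers receive.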

Royal runs on Huel, a nutritionally complete powder that contains all of the proteins, carbohydrates, fats and 27 vitamins and minerals recommended for a healthy diet. When he first heard that some coders consume meal supplements so they don’t have to leave their computers, Royal decided to try it.

“A lot of people say it tastes like cardboard oatmeal, but I politely disagree,” he says. “It’s more like liquid porridge.”

Taking journalism courses and working at the Reporters’ Lab has changed Royal’s trajectory at Duke. Last summer, he interned at the Tampa Bay Times. After graduating, he hopes to go into computational journalism. “I realized this is what I want to code for,” he says.

Lucas Fagan, the puzzler

Lucas Fagan is a political junkie. He joined the Lab so he could have a role in fact-checking and debunking fake news, which he says are critical in politics today.

Fagan, a first-year from Morristown, New Jersey, is building a bot to identify checkable claims from Facebook. The program will gather content from posts written by politicians in close races.

Fagan is considering majoring in computer science and mathematics. He writes for Duke Political Review and competes with the debate team. When he’s in the Lab, he enjoys playing devil’s advocate with other staff members about anything from Duke basketball to the Russia probe.

Though he enjoys interacting with other Lab students, Fagan finds he is most productive in his dorm, where he has multiple monitors set up for coding. “I honestly enjoy the work that we’re doing, so when I need a break from homework, I’ll do work for the Lab,” he says.

Fagan feeds off the problem-solving involved in coding. “I enjoy trying to face the CS challenges more than anything else,” he says. That’s why he’s staying in Durham this summer for Data and Technology for Fact-Checking, a 10-week research program through which he will tackle natural language processing and machine learning problems.

Helena Merk, coding for causes

Helena Merk competes in hack-a-thons across the country, so she has ample experience designing creative tech projects. Though coding has many applications, she chooses to apply her skills to social causes.

Merk, a first-year from Palo Alto, California, is writing a program that would enable the Lab to send its daily list of checkable claims through Slack, a messaging tool utilized by newsrooms.

This year, Merk helped organize Duke Blueprint, a conference aiming “to inspire disruptive innovation for future-focused global change.” She also works remotely as the lead mobile app developer for AdoptMeApp, an app that connects shelter dogs with potential owners.

Merk applied to the Pratt School of Engineering, planning to study biomedical engineering. But in her first semester she realized there were other disciplines at Duke that could combine technology and health.

Now, Merk is taking computer science and global health classes. She’s spending most of the summer in Madagascar working with Duke engineers to provide water systems for local communities. “Computer science is powerful in how versatile it is,” she says. “I want to make things that help people.”

Naman Agarwal, the entrepreneur

Naman Agarwal started coding in high school while he was interning for a local politician’s campaign. After spending weeks entering donor information into a spreadsheet, he built an app to automate the process.

Agarwal, a first-year from Palatine, Illinois, is now building a bot for Tech & Check to identify checkable tweets from politicians in competitive races.

Agarwal works at Campus Enterprises, a student-run LLC that provides food delivery, custom apparel and other services to Duke students. He is also a drummer and producer for Duke’s student-run record label, Small Town Records. On the weekends, he travels to snowboarding competitions with the Duke Ski Team.

Agarwal studies computer science and economics. He says he hopes to find a job that emulates his experience at the Lab. “I don’t want to contribute to work that’s just throwing money around,” he says. “I want to work at a company that has a soul.” (Photos by Evan Nicole Bell)


Tech & Check Alerts aim to ease the workload of fact-checkers

Student-created tool can peruse political transcripts and find claims most likely to contain falsehoods

By Sydney McKinney – April 6, 2018

Students in the Duke Reporters’ Lab have built a bot that is like an intern who watches TV around the clock.

Asa Royal, a junior at Duke University, and Lucas Fagan, a freshman, have created Tech & Check Alerts, a new tool in a series of innovations the Reporters’ Lab is creating to help simplify the fact-checking process.

Using Tech & Check Alerts, the Lab can identify check-worthy claims in television news transcripts and send them to fact-checkers in daily email alerts.

“We’re going to save fact-checkers a lot of time and help them find things that they would otherwise miss,” said Mark Stencel, co-director of the Reporters’ Lab.

Though the fact-checking industry is growing worldwide, the organizations doing that work are typically small, even one-person enterprises, and the workload can be burdensome. Fact-checkers often have to sift through pages of text to find claims to check. This time-consuming process can create a substantial time gap between when statements are made and when fact-checks are available to viewers or readers.

The Tech & Check Alerts automate that process. Royal and Fagan, who are both computer science majors, created a program that scans transcripts of TV news channels, such as CNN, for claims that fact-checkers may want to investigate. It then compiles the check-worthy claims and sends them in a daily email to fact-checkers at The Washington Post, PolitiFact, the Associated Press, FactCheck.org and The New York Times, among others. Thus far, there have been seven fact-checks performed based on these alerts.

“Journalists don’t have to watch 15 hours of CNN or read the entire congressional report,” Royal said. “We’ll do it for them.”

Royal and Fagan created Tech & Check Alerts using ClaimBuster, an algorithm created by computer scientist Chengkai Li from the University of Texas at Arlington. ClaimBuster scans blocks of text and identifies “check-worthy” claims, based on indicators such as past-tense verbs, numbers, dates or statistics. It ranks statements from 0 to 1.0 based on how likely they are to be checkable; any statements that score a 0.7 or higher are typically considered check-worthy.
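The scoring rule described above amounts to a simple filter, which can be sketched as follows. The claim texts and score values here are invented for illustration; real scores come from the ClaimBuster service itself.

```python
# Keep only sentences whose ClaimBuster-style score meets the
# 0.7 "check-worthy" threshold described above.
CHECKWORTHY_THRESHOLD = 0.7

def filter_checkworthy(scored_sentences, threshold=CHECKWORTHY_THRESHOLD):
    """scored_sentences: list of (sentence, score) pairs, score in [0, 1]."""
    return [(s, score) for s, score in scored_sentences if score >= threshold]

# Invented example scores for illustration:
sample = [
    ("The unemployment rate fell to 3.9 percent last year.", 0.92),
    ("We will always fight for what is right.", 0.21),
    ("The bill cut taxes for 80 percent of households.", 0.85),
]
print(filter_checkworthy(sample))
```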

According to Royal, Li’s technology had yet to be used much outside of academia, so leaders of the Tech & Check Cooperative decided to utilize it for daily alerts.

“There’s already software that can find factual claims, and there are already fact-checkers who can check them,” Royal said. “We’re just solving the last-mile problem.”

The creation of Tech & Check Alerts is an important step for the Duke Tech & Check Cooperative, a two-year research project funded by the John S. and James L. Knight Foundation, the Facebook Journalism Project and the Craig Newmark Foundation.

The broader purpose of this initiative is to bring together journalists, academics and computer scientists from across the country to innovate and automate the fact-checking industry. Over the course of two years, the Reporters’ Lab will develop tools that ease the job of fact-checkers and make fact-checking more accessible to consumers. Another tool the Lab is currently working on is FactStream, an app that provides instant fact-checking during live events.

Alongside other student researchers, Fagan and Royal are working to improve Tech & Check Alerts to include additional sources such as daily floor speeches and debates from the Congressional Record, and social media feeds from endangered incumbents running in this year’s closest House and Senate races. Fact-checkers will have input on how these additional alerts will be deployed.

Fagan is also building a web interface that would give fact-checking partners a way to dig deeper into these feeds and perhaps even customize certain alerts. Freshman Helena Merk, another student researcher in the Lab, is building a tool that would deliver the daily alerts directly to a channel on Slack, a communication platform used in many newsrooms.
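Slack's incoming-webhook integration accepts a JSON payload posted to a per-channel URL, which is the likely delivery mechanism for such a tool. Here is a hedged sketch of that approach; the payload format, claim texts and placeholder webhook URL are our illustration, not the Lab's actual code.

```python
import json
from urllib import request

def build_alert_payload(claims):
    # Format the daily claim list as a Slack incoming-webhook message.
    bullets = "\n".join(f"- {claim}" for claim in claims)
    return {"text": "Today's check-worthy claims:\n" + bullets}

def post_to_slack(webhook_url, payload):
    # Slack incoming webhooks accept a JSON body via HTTP POST.
    data = json.dumps(payload).encode("utf-8")
    req = request.Request(webhook_url, data=data,
                          headers={"Content-Type": "application/json"})
    return request.urlopen(req)

payload = build_alert_payload(["Senator X said crime rose 40 percent.",
                               "Candidate Y claimed the bill cuts taxes."])
print(payload["text"])
```

Calling `post_to_slack` with a real webhook URL would deliver that message to the configured channel.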

Once these improvements are completed, and Tech & Check Alerts are deployed more widely, they should help fact-checkers across the country.

“This project is a stepping stone in our process of using real-time claims and existing fact-checks to automate fact-checking in real time,” Stencel said.


Journalists, computer scientists gather for Tech & Check Conference at Duke

Members of the fact-checking community convened March 29-30 on Duke University's campus to tackle pressing issues

By Rebecca Iannucci – March 30, 2018

About 40 fact-checkers, journalists, computer scientists and academics gathered at Duke University March 29-30 for the Tech & Check Conference, a meeting hosted by the Reporters’ Lab.

As part of its Tech & Check Cooperative, the Reporters’ Lab is serving as a hub for automated fact-checking to connect journalists and technologists around the world. The conference gave them an opportunity to demonstrate current projects and discuss the big challenges of automation.

Some highlights of the conference:

* Eleven demos of past and current projects. Technologists and computer scientists showed off projects they’ve been developing to either automate fact-checking or improve the flow of accurate information on the internet.

Topics included new tools such as Chequeabot, an automated service that detects factual claims for the Argentinian fact-checker Chequeado; the Bad Idea Factory’s update of the Truth Goggles tool; and the perils of misinformation, including a real-life example from Penn State professor S. Shyam Sundar, whose research project about fake news was inaccurately described in widespread news coverage.

* Two Q&A panels. Alexios Mantzarlis, director of the International Fact-Checking Network, led a discussion with three fact-checkers about the potential tools and processes that could make fact-checking more efficient in the future.

Reporters’ Lab co-director Bill Adair moderated a conversation about challenges in automated fact-checking, including the pitfalls of voice-to-text technology and natural language processing.

Attendees also participated in breakout sessions to discuss ways to develop international standards and consistent terminology.

Photos by Colin Huth.


Fact-checking browser extensions hold promise but need further development

NewsCracker and FactoidL are contributing to the fight against misinformation, but there's plenty of room for improvement

By Bill McCarthy – February 23, 2018

Two new fact-checking browser extensions are trying something really challenging: automating the fact-checking process. By generating algorithmic scores for news online, these extensions are predicting whether particular web pages are likely to be true or false. We wondered if these products could really provide such a critical service, so we ran an analysis. Our finding? They are ambitious, but they are not quite ready for prime time.

During the course of several weeks, we ran 219 stories from 73 different media organizations through these extensions — NewsCracker and FactoidL — and tracked the algorithmic scores assigned to each story. The stories ranged from hard news and long-form features to sports and entertainment.

NewsCracker

NewsCracker, founded and developed in 2017 by three 18-year-old college students, is available for download on the Chrome Web Store. According to its website, NewsCracker uses machine learning technology and statistical analysis “to contribute to the movement against ‘fake news’ by helping everyday Internet users think more critically about the articles they read.”

NewsCracker does not promise the truth, but it does “come pretty close.” Web pages receive ratings on a one-to-10 scale for headline strength, neutrality and accuracy, which are then averaged into one overall score. NewsCracker trusts the article when the overall score is above 8.0, and it does not trust the article when the score is below 6.0. Articles scoring between 6.0 and 8.0 trigger a cautionary warning.
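That averaging and those thresholds can be expressed in a few lines. This is our sketch of the logic as described, with invented subscore values; NewsCracker's actual implementation is not public in this form.

```python
def overall_score(headline, neutrality, accuracy):
    # NewsCracker averages three 1-to-10 subscores into one overall rating.
    return (headline + neutrality + accuracy) / 3

def verdict(score):
    if score > 8.0:
        return "trusted"
    if score < 6.0:
        return "not trusted"
    return "caution"  # scores between 6.0 and 8.0 trigger a warning

print(verdict(overall_score(9, 8, 10)))
```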

According to NewsCracker’s website, ratings are generated according to several criteria, including preliminary scores assigned to specific websites, the number of news outlets reporting on the same story, the number and sourcing of quotations, the number of biased words or phrases and the sentence length and structure. To assess the validity of a story’s factual claims, NewsCracker identifies “the five most important factual claims” and checks for their repetition in related news coverage.

Of the 219 stories we tested, 145 received ratings above 8.0, 65 received ratings between 6.0 and 8.0 and seven received ratings below 6.0 — meaning 66 percent of stories were dubbed trustworthy while only 3 percent were labeled “fake news.” NewsCracker “could not detect any news to score” from the final two stories we tested, both of which came from The Chronicle at Duke University.

The Washington Post had the highest average overall score, at 9.4, with Reuters finishing not far behind. InfoWars, Twitchy and American Thinker recorded the lowest average overall scores.

Significantly, local and campus news organizations — including The Durham Herald-Sun, The Boston Globe and The Chronicle at Duke University — had average overall scores below known fake news producer YourNewsWire.com as well as several other hyperpartisan outlets, such as Breitbart News. This may be because local news coverage is not often repeated elsewhere.

Additionally, the methodology, through which five facts are cross-checked against other coverage, may have the effect of penalizing outlets for original reporting. One BuzzFeed News story — which cites several sources by name, directly references related coverage and was eventually picked up by The Washington Post — received a 5.6 accuracy rating on the grounds that “many claims could not be verified.”

FactoidL

FactoidL — a project from Rochester Institute of Technology student Alexander Kidd also available for download on the Chrome Web Store — does not promise much from its algorithm, which it calls “Anaxagoras.” In fact, the extension’s online description warns that it is “currently very hit-or-miss.”

According to its description, FactoidL “is meant to be a quick, automated fact-checking tool that compares sentences you read to another source.”

FactoidL’s formula is simple. It identifies the number of fact-checkable statements — which it calls “factoids” — in any given story, and then Anaxagoras cleans each “factoid” by removing all “unimportant words” and queries Wikipedia for matches to the remaining words or phrases. For any web page, users can see the number and list of “factoids” as well as an accuracy percentage for the page.
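As described, the pipeline strips filler words and looks for the remainder in Wikipedia. Here is a sketch of that shape, with a fake lookup standing in for the real Wikipedia query; the stopword list, sample sentences and "known terms" are all invented for illustration.

```python
STOPWORDS = {"the", "a", "an", "is", "was", "of", "to", "in", "and"}

def clean_factoid(sentence):
    # Drop "unimportant words," keeping the terms used to query Wikipedia.
    return [w for w in sentence.lower().rstrip(".").split() if w not in STOPWORDS]

def accuracy_percentage(factoids, wikipedia_match):
    # wikipedia_match is a callable standing in for the real Wikipedia query.
    if not factoids:
        return 0.0
    hits = sum(1 for f in factoids if wikipedia_match(clean_factoid(f)))
    return 100.0 * hits / len(factoids)

# A fake lookup for illustration: "matches" only if a known term appears.
known_terms = {"durham", "duke"}
def fake_match(words):
    return any(w in known_terms for w in words)

factoids = ["Duke is a university in Durham.", "The sky was green yesterday."]
print(accuracy_percentage(factoids, fake_match))
```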

This process is currently defective — most likely because only statements that align with Wikipedia descriptions are identified as true or accurate. The 219 stories we tested averaged approximately 60 “factoids” each and an accuracy percentage of approximately 0.9 percent. Of those 219 stories, 154 were rated 0 percent accurate, while 12 were rated 5 percent accurate or higher and only one was rated as high as 10 percent accurate.

The story with the highest number of “factoids” — from YourNewsWire.com — registered 2,645 “factoids,” but many could be discounted as claims that were not factual. FactoidL has a tendency, for example, to mark the dateline, byline and headline of a story as “factoids.” It often counts opinion statements, as well.

While NewsCracker is not yet ready for prime time, FactoidL has a long way to go. Very few news articles from reputable journalistic outlets are actually less than 10 percent accurate. The fact that FactoidL rated every story the Lab tested below 10 percent accurate implies that the extension is not just “hit-or-miss” with its algorithm; it is missing every time.

The code powering FactoidL is available on GitHub, and interested parties can provide feedback or even volunteer to contribute.

The future is bright

Any new technology is going to hit some bumps along the way, with bugs and breakdowns to be expected. These young developers are trying something really ambitious in a way that is both innovative and exciting. We admire the spirit of their extensions and hope to see them developed further.

Back to top

Fact-checking census

Fact-checking triples over four years

The annual fact-checking census from the Reporters' Lab finds 31 percent growth in the past year alone, and signs that many verification projects are becoming more stable.

By Mark Stencel & Riley Griffin – February 22, 2018 | Print this article

The number of fact-checkers around the world has more than tripled over the past four years, increasing from 44 to 149 since the Duke Reporters’ Lab first began counting these projects in 2014 — a 239 percent increase. And many of those fact-checkers in 53 countries are also showing considerable staying power.

This is the fifth time the Reporters’ Lab has tallied up the organizations where reporters and researchers verify statements by public figures and organizations and keep tabs on other sources of misinformation, particularly social media. In each annual census, we have seen steady increases on almost every continent — and the past year was no different.

The 2018 global count is up by nearly a third (31 percent) over the 114 projects we included in last year’s census. While some of that year-over-year change comes because we discovered established fact-checking ventures that we hadn’t yet counted in our past surveys, we also added 21 fact-checking projects that launched since the start of 2017, including one — Tempo’s “Fakta atau Hoax” in Indonesia — that opened for business a month ago.

And that list of startups does not count one short-run fact-checking project — a TV series produced by public broadcaster NRK for Norway’s national election last year. That series is now among the 63 inactive fact-checkers we count on our regularly updated map, list and database. Faktisk, a Norwegian fact-checking partnership that several media companies launched in 2017, remains active.

Elections are often catalysts for political watchdog projects. In addition to the two Norwegian projects, national or regional voting helped spur new fact-checking efforts in Indonesia, South Korea, France, Germany and Chile.

Fact-Checkers By Continent
Africa: 4
Asia: 22
Australia: 3
Europe: 52
North America: 53
South America: 15

Many of the fact-checkers we follow have shown remarkable longevity.

Based on the 143 projects whose launch dates we know for certain, 41 (29 percent) have been in business for more than five years. And a diverse group of six have already celebrated 10 years of nearly continuous operation — from 23-year-old Snopes.com, the grandparent of hoax-busting, to locally focused “Reality Checks” from WISC-TV (News 3) in Madison, Wisconsin, which started fact-checking political statements in 2004. Some long-term projects have occasionally shuttered between election cycles before resuming their work. And some overcame significant funding gaps to come back from the dead.

On average, fact-checking organizations have been around for four years.

One change we have noted over the past few years is a shift in the kinds of organizations involved in fact-checking and the way they do business. The U.S. fact-checker PolitiFact, for instance, began as an independent project of the for-profit Tampa Bay Times in 2007. With its recently announced move to the Poynter Institute, a media training center in St. Petersburg, Florida, that also owns the Times, PolitiFact now has nonprofit status and is no longer directly affiliated with a larger news company.

That’s an unusual move for a project in the U.S., where most fact-checkers (41 of 47, or 87 percent) are directly affiliated with newspapers, television networks and other established news outlets. Outside the U.S., by contrast, only a little more than half of fact-checkers have such affiliations (54 of 102, or 53 percent).

The non-media fact-checkers include projects that are affiliated with universities, think tanks and non-partisan watchdogs focused on government accountability. Others are independent, standalone fact-checkers, including a mix of nonprofit and commercial operations as well as a few that are primarily run by volunteers.

Fact-checkers, like other media outlets, are also seeking new ways to stay afloat — from individual donations and membership programs to syndication plans and contract research services. Facebook has enlisted fact-checkers in five countries to help with the social platform’s sometimes bumpy effort to identify and label false information that pollutes its News Feed. (Facebook also is a Reporters’ Lab funder, we should note.) And our Lab’s Google-supported Share the Facts project helped that company elevate fact-checking on its news page and other platforms. That’s a development that creates larger audiences, which are especially helpful to the big-media fact-checkers that depend heavily on digital ad revenue.

Growing Competition

The worldwide growth in fact-checking means more countries have multiple reporting teams keeping an ear out for claims that need their scrutiny.

Last year there were 11 countries with more than one active fact-checker. This year, we counted more than one fact-checker in 22 countries, and more than two in 11 countries.

Countries With More Than Two Fact-Checkers
United States: 47
Brazil: 8
France: 7
United Kingdom: 6
South Korea: 5
India: 4
Germany: 4
Ukraine: 4
Canada: 4
Italy: 3
Spain: 3

There’s also growing variety among the fact-checkers. Our database now includes several science fact-checkers, such as Climate Feedback at the University of California Merced’s Center for Climate Communication and Détecteur de Rumeurs from Agence Science-Presse in Montreal. And there’s New York-based Gossip Cop, an entertainment news fact-checking site led since 2009 by a “reformed gossip columnist.” (Gossip Cop is another example of a belated discovery that only appeared on our fact-checking radar in the past year.)

As the fact-checking community around the world has grown, so has the International Fact-Checking Network. Launched in 2015, it too is based at Poynter, the new nonprofit home of PolitiFact. The network has established a shared Code of Principles as well as a process for independent evaluators to verify its signatories’ compliance. So far, about a third of the fact-checkers counted in this census, 47 of 149, have been verified.

The IFCN also holds an annual conference for fact-checkers that is co-sponsored by the Reporters’ Lab. There is already a wait list of hundreds of people for this June’s gathering in Rome.

U.S. Fact-Checking

The United States still has far more fact-checkers than any other country, but growth in the U.S. was slower in 2017 than in the past. For the first time, we counted fewer fact-checkers in the United States (47) than there were in Europe (52).

While the U.S. count ticked up slightly from 43 a year ago, some of that increase came from the addition of newly added long-timers to our database — such as the Los Angeles Times, Newsweek magazine and The Times-Union newspaper in Jacksonville, Florida. Another of those established additions was the first podcast in our database: “Science Vs.” But that was an import. “Science Vs.” began as a project at the Australian public broadcaster ABC in 2015 before it found its U.S. home a year later at Gimlet Media, a commercial podcasting company based in New York.

Among the new U.S. additions are two traditionally conservative media outlets: The Daily Caller (and its fact-checking offshoot Check Your Fact) and The Weekly Standard. To comply with the IFCN’s Code of Principles, both organizations have set up internal processes to insulate their fact-checkers from the reporting and commentary both publications are best known for.

Another new addition was The Nevada Independent, a nonprofit news service that focuses on state politics. Of the 47 U.S. fact-checkers, 28 are regionally oriented, including the 11 state affiliates that partner with PolitiFact.

We originally expected the U.S. number would drop in a year between major elections, as we wrote in December, so the small uptick was a surprise. With this year’s upcoming midterm elections, we expect to see even more fact-checking in the U.S. in 2018.

The Reporters’ Lab is a project of the DeWitt Wallace Center for Media & Democracy at Duke University’s Sanford School of Public Policy. It is led by journalism professor Bill Adair, who was also PolitiFact’s founding editor. The Lab’s staff and student researchers identify and evaluate fact-checkers that specifically focus on the accuracy of statements by public figures and institutions in ways that are fair, nonpartisan and transparent. See this explainer about how we decide which fact-checkers to include in the database. In addition to studying the reach and impact of fact-checking, the Lab is home to the Tech & Check Cooperative, a multi-institutional project to develop automated reporting tools and applications that help fact-checkers spread their work to larger audiences more quickly.

Back to top

Pop-up fact-checking app Truth Goggles aims to challenge readers’ biases

Created by Dan Schultz, Truth Goggles is a browser plugin that creates a personalized "credibility layer" for users

By Julianna Rennie – February 21, 2018 | Print this article

Dan Schultz, a technologist building a new fact-checking app for the Reporters’ Lab, says the app should be like a drinking buddy.

“You can have a friend who you fundamentally disagree with on a lot of things, but are able to have a conversation,” Schultz says. “You’re not thinking of the other person as a spiteful jerk who’s trying to manipulate you.”

Truth Goggles
(L-R) Dan Schultz, with Bad Idea Factory’s Ted Han, Carolyn Rupar-Han and Lou Huang, who are working with Schultz to create Truth Goggles. Photo courtesy of Dan Schultz.

Schultz, 31, is using that approach to develop a new version of Truth Goggles, an app he first built eight years ago at the MIT Media Lab, for the Duke Tech & Check Cooperative. His goal is to get to know users and find the most effective way to show them fact-checks. While other Tech & Check apps take a traditional approach by providing Truth-O-Meter ratings or Pinocchios to all users, Schultz plans to experiment with customized formats. He hopes that personalizing the interface will attract new audiences who are put off by fact-checkers’ rating systems.

Truth Goggles is a browser plugin that automatically scans a page for content that users might want fact-checked. Schultz hopes that this unique “credibility layer” will be like a gentle nudge to get people to consider fact-checks.

“The goal is to help people think more carefully and ideally walk away with a more accurate worldview from their informational experiences,” he says.

As a graduate student at the Media Lab, Schultz examined how people interact with media. His 150-page thesis paper concluded that when people are consuming information, they are protecting their identities.

Schultz learned that a range of biases make people less likely to change their minds when exposed to new information. Most people simply are unaware of how to consume online content responsibly, he says.

He then set out to use technology to short-circuit biased behavior and help people critically engage with media. The first prototype of Truth Goggles used fact-checks from PolitiFact as a data source to screen questionable claims.

Schultz recently partnered with the Reporters’ Lab to resume working on Truth Goggles. This time, Truth Goggles will be integrated with Share the Facts, so it can access all fact-checking articles formatted using the ClaimReview schema.
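
ClaimReview is an open schema.org markup format that fact-checkers embed in their published articles so tools such as Share the Facts can find and reuse their verdicts. As a rough illustration of what such an annotation looks like (the URL, organization name and claim below are invented for this example), a fact-check page might carry JSON-LD along these lines:

```json
{
  "@context": "https://schema.org",
  "@type": "ClaimReview",
  "url": "https://example.org/fact-checks/sample-check",
  "datePublished": "2018-02-21",
  "author": {
    "@type": "Organization",
    "name": "Example Fact-Check"
  },
  "claimReviewed": "A sample claim made by a public figure.",
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": "1",
    "bestRating": "5",
    "worstRating": "1",
    "alternateName": "False"
  }
}
```

Because the claim text and the rating are machine-readable fields rather than free-form prose, a tool like Truth Goggles can match claims it finds on a page against any article published in this format.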

Schultz also is exploring creative ways to present the information to users. He says the interface must be effective in impeding biases and enjoyable for people to use. As a graduate student, one of Schultz’s initial ideas was to highlight verified claims in green and falsehoods in red. But he quickly realized this solution was not nuanced enough.

“I don’t want people to believe something’s true because it’s green,” he says.

The new version of Truth Goggles will use information about users’ biases to craft messages that won’t trigger their defenses. But Schultz doesn’t know exactly what this will look like yet.

“Can we use interfaces to have a reader challenge their beliefs in ways that just a blunt presentation of information wouldn’t?” Schultz says. “If the medium is the message, how can we shape the way that message is received?”

Born in Cheltenham, Pennsylvania, Schultz studied information systems, computer science and math at Carnegie Mellon University. As a sophomore, he won the Knight News Challenge, which provides grants for “breakthrough ideas in news and information.”

The News Challenge put him “on the path toward eventually applying to the Media Lab and really digging in,” he says.

After graduating from MIT, Schultz worked as a Knight-Mozilla Fellow at the Boston Globe and then joined the Internet Archive, where his title is senior creative technologist. He continues to develop side projects such as Truth Goggles through the Bad Idea Factory, a company with a tongue-in-cheek name that he started with friends. He says the company’s goal is “to make people ‘thinking face’ emoji” by encouraging its technologists to try out creative ideas. With Truth Goggles, he hopes to get people who may not already consume fact-checking content to challenge their own biases.

“The world will fall apart if we don’t improve the way information is consumed through technology,” Schultz says. “It’s sort of like the future of the universe as we know it depends on solving some of these problems.”

Back to top