Press "Enter" to skip to content


Fact-checking browser extensions hold promise but need further development

Two new fact-checking browser extensions are trying something really challenging: automating the fact-checking process. By generating algorithmic scores for news online, these extensions are predicting whether particular web pages are likely to be true or false. We wondered if these products could really provide such a critical service, so we ran an analysis. Our finding? They are ambitious, but they are not quite ready for prime time.

Over the course of several weeks, we ran 219 stories from 73 different media organizations through these extensions — NewsCracker and FactoidL — and tracked the algorithmic scores assigned to each story. The stories ranged from hard news and long-form features to sports and entertainment.

NewsCracker

NewsCracker, founded and developed in 2017 by three 18-year-old college students, is available for download on the Chrome Web Store. According to its website, NewsCracker uses machine learning technology and statistical analysis “to contribute to the movement against ‘fake news’ by helping everyday Internet users think more critically about the articles they read.”

NewsCracker does not promise the truth, but it does “come pretty close.” Web pages receive ratings on a one-to-10 scale for headline strength, neutrality and accuracy, which are then averaged into one overall score. NewsCracker trusts the article when the overall score is above 8.0, and it does not trust the article when the score is below 6.0. Articles scoring between 6.0 and 8.0 trigger a cautionary warning.
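NewsCracker has not published its code, but the banding it describes reduces to a few lines. Here is a minimal Python sketch of that logic; the function names and the equal weighting of the three sub-scores are our assumptions, not the extension's actual implementation.

```python
# A minimal sketch of the banding NewsCracker describes: three
# one-to-10 sub-scores averaged into an overall score, with 8.0 and
# 6.0 as the cutoffs. Names and equal weighting are our assumptions.

def overall_score(headline: float, neutrality: float, accuracy: float) -> float:
    """Average the three one-to-10 sub-scores into one overall score."""
    return round((headline + neutrality + accuracy) / 3, 1)

def verdict(score: float) -> str:
    """Map an overall score onto NewsCracker's three trust bands."""
    if score > 8.0:
        return "trusted"
    if score < 6.0:
        return "not trusted"
    return "caution"  # scores between 6.0 and 8.0 trigger a warning

print(verdict(overall_score(9.1, 8.8, 8.6)))  # -> trusted
print(verdict(overall_score(7.0, 6.2, 5.6)))  # -> caution
```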

According to NewsCracker’s website, ratings are generated according to several criteria, including preliminary scores assigned to specific websites, the number of news outlets reporting on the same story, the number and sourcing of quotations, the number of biased words or phrases and the sentence length and structure. To assess the validity of a story’s factual claims, NewsCracker identifies “the five most important factual claims” and checks for their repetition in related news coverage.
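NewsCracker does not explain how it decides that a claim has been “repeated” in related coverage, so any implementation detail is guesswork. As a toy illustration of the idea, a claim might be treated as corroborated when most of its words appear in at least one related article; the tokenization and the 60 percent overlap threshold below are invented for this sketch.

```python
# A toy illustration of cross-checking a claim against related
# coverage; NewsCracker's real heuristics are not public.

def corroborated(claim: str, related_articles: list[str],
                 threshold: float = 0.6) -> bool:
    """True if most of the claim's words appear in one related article."""
    words = {w.strip('.,"\'').lower() for w in claim.split()}
    for article in related_articles:
        text = article.lower()
        overlap = sum(1 for w in words if w in text)
        if overlap / len(words) >= threshold:
            return True
    return False

coverage = ["Lawmakers passed the spending bill on Tuesday night."]
print(corroborated("The spending bill passed on Tuesday", coverage))  # True
```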

Of the 219 stories we tested, 145 received ratings above 8.0, 65 received ratings between 6.0 and 8.0 and seven received ratings below 6.0 — meaning 66 percent of stories were dubbed trustworthy while only 3 percent were labeled “fake news.” NewsCracker “could not detect any news to score” from the final two stories we tested, both of which came from The Chronicle at Duke University.

The Washington Post had the highest average overall score, at 9.4, with Reuters finishing not far behind. InfoWars, Twitchy and American Thinker recorded the lowest average overall scores.

Significantly, local and campus news organizations — including The Durham Herald-Sun, The Boston Globe and The Chronicle at Duke University — had average overall scores lower than those of known fake news producer YourNewsWire.com, as well as several other hyperpartisan outlets, such as Breitbart News. This may be because local news coverage is not often repeated elsewhere.

Additionally, the methodology, through which five facts are cross-checked against other coverage, may have the effect of penalizing outlets for original reporting. One BuzzFeed News story — which cites several sources by name, directly references related coverage and was eventually picked up by The Washington Post — received a 5.6 accuracy rating on the grounds that “many claims could not be verified.”

FactoidL

FactoidL — a project from Rochester Institute of Technology student Alexander Kidd also available for download on the Chrome Web Store — does not promise much from its algorithm, which it calls “Anaxagoras.” In fact, the extension’s online description warns that it is “currently very hit-or-miss.”

According to its description, FactoidL “is meant to be a quick, automated fact-checking tool that compares sentences you read to another source.”

FactoidL’s formula is simple: it identifies the number of fact-checkable statements — which it calls “factoids” — in any given story. Anaxagoras then cleans each “factoid” by removing all “unimportant words” and queries Wikipedia for matches to the remaining words or phrases. For any web page, users can see the number and list of “factoids” as well as an accuracy percentage for the page.
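The extension itself is JavaScript, but the pipeline described above is simple enough to sketch in Python. The stopword list and the hit-count match rule below are stand-ins for Anaxagoras’s unpublished heuristics, not FactoidL’s code.

```python
# A rough sketch of the pipeline FactoidL describes: strip "unimportant
# words" from each factoid, then search Wikipedia for what remains.
# The stopword list and match rule are our stand-ins.
import requests

STOPWORDS = {"the", "a", "an", "of", "to", "in", "and", "is", "was", "it"}

def clean(factoid: str) -> str:
    """Keep only the 'important' words of a factoid."""
    return " ".join(w for w in factoid.split() if w.lower() not in STOPWORDS)

def wikipedia_hits(query: str) -> int:
    """Count Wikipedia search results for a cleaned factoid."""
    resp = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={"action": "query", "list": "search",
                "srsearch": query, "format": "json"},
        timeout=10,
    )
    return resp.json()["query"]["searchinfo"]["totalhits"]

def accuracy(factoids: list[str]) -> float:
    """Percentage of factoids with at least one Wikipedia match."""
    verified = sum(1 for f in factoids if wikipedia_hits(clean(f)) > 0)
    return 100 * verified / len(factoids) if factoids else 0.0
```

Even this crude version hints at the approach’s fragility: a perfectly true statement scores zero unless Wikipedia happens to cover it in similar words.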

This process is currently defective — most likely because only statements that align with Wikipedia descriptions are identified as true or accurate. The 219 stories we tested yielded an average of approximately 60 “factoids” per story and an average accuracy of approximately 0.9 percent. Of those 219 stories, 154 were rated 0 percent accurate, 12 were rated 5 percent accurate or higher and only one was rated as high as 10 percent accurate.

The story with the highest number of “factoids” — from YourNewsWire.com — registered 2,645 “factoids,” but many could be discounted as claims that were not factual. FactoidL has a tendency, for example, to mark the dateline, byline and headline of a story as “factoids.” It often counts opinion statements, as well.

Where NewsCracker is not yet ready for prime time, FactoidL has a much longer way to go. Very few news articles from reputable journalistic outlets are actually less than 10 percent accurate. The fact that FactoidL rated every story we tested at 10 percent accurate or lower implies that the extension is not just “hit-or-miss” with its algorithm; it is missing nearly every time.

The code powering FactoidL is available on GitHub, and interested parties can provide feedback or even volunteer to contribute.

The future is bright

Any new technology is going to hit some bumps along the way, with bugs and breakdowns to be expected. These young developers are trying something really ambitious in a way that is both innovative and exciting. We admire the spirit of their extensions and hope to see them developed further.

Fact-checking triples over four years

The number of fact-checkers around the world has more than tripled over the past four years, increasing from 44 to 149 since the Duke Reporters’ Lab first began counting these projects in 2014 — a 239 percent increase. Those fact-checkers now operate in 53 countries, and many are showing considerable staying power.

This is the fifth time the Reporters’ Lab has tallied up the organizations where reporters and researchers verify statements by public figures and organizations and keep tabs on other sources of misinformation, particularly social media. In each annual census, we have seen steady increases on almost every continent — and the past year was no different.

The 2018 global count is up by nearly a third (31 percent) over the 114 projects we included in last year’s census. While some of that year-over-year change comes because we discovered established fact-checking ventures that we hadn’t yet counted in our past surveys, we also added 21 fact-checking projects that launched since the start of 2017, including one — Tempo’s “Fakta atau Hoax” in Indonesia — that opened for business a month ago.

And that list of startups does not count one short-run fact-checking project — a TV series produced by public broadcaster NRK for Norway’s national election last year. That series is now among the 63 inactive fact-checkers we count on our regularly updated map, list and database. Faktisk, a Norwegian fact-checking partnership that several media companies launched in 2017, remains active.

Elections are often catalysts for political watchdog projects. In addition to the two Norwegian projects, national or regional voting helped spur new fact-checking efforts in Indonesia, South Korea, France, Germany and Chile.

Fact-Checkers By Continent
Africa: 4
Asia: 22
Australia: 3
Europe: 52
North America: 53
South America: 15

Many of the fact-checkers we follow have shown remarkable longevity.

Based on the 143 projects whose launch dates we know for certain, 41 (29 percent) have been in business for more than five years. And a diverse group of six have already celebrated 10 years of nearly continuous operation — from 23-year-old Snopes.com, the grandparent of hoax-busting, to locally focused “Reality Checks” from WISC-TV (News 3) in Madison, Wisconsin, which started fact-checking political statements in 2004. Some long-term projects have occasionally shuttered between election cycles before resuming their work. And some overcame significant funding gaps to come back from the dead.

On average, fact-checking organizations have been around four years.

One change we have noted over the past few years is a shift in the kinds of organizations involved in fact-checking and the way they do business. The U.S. fact-checker PolitiFact, for instance, began as an independent project of the for-profit Tampa Bay Times in 2007. With its recently announced move to the Poynter Institute, a media training center in St. Petersburg, Florida, that is also the Times’ owner, PolitiFact now has nonprofit status and is no longer directly affiliated with a larger news company.

That’s an unusual move for a project in the U.S., where most fact-checkers (41 of 47, or 87 percent) are directly affiliated with newspapers, television networks and other established news outlets. Outside the U.S., the share is much smaller: a little more than half of fact-checkers are directly affiliated with such outlets (54 of 102, or 53 percent).

The non-media fact-checkers include projects that are affiliated with universities, think tanks and non-partisan watchdogs focused on government accountability. Others are independent, standalone fact-checkers, including a mix of nonprofit and commercial operations as well as a few that are primarily run by volunteers.

Fact-checkers, like other media outlets, are also seeking new ways to stay afloat — from individual donations and membership programs to syndication plans and contract research services. Facebook has enlisted fact-checkers in five countries to help with the social platform’s sometimes bumpy effort to identify and label false information that pollutes its News Feed. (Facebook also is a Reporters’ Lab funder, we should note.) And our Lab’s Google-supported Share the Facts project helped that company elevate fact-checking on its news page and other platforms. That development creates larger audiences, which are especially helpful to the big-media fact-checkers that depend heavily on digital ad revenue.

Growing Competition

The worldwide growth in fact-checking means more countries have multiple reporting teams keeping an ear out for claims that need their scrutiny.

Last year there were 11 countries with more than one active fact-checker. This year, we counted more than one fact-checker in 22 countries, and more than two in 11 countries.

Countries With More Than Two Fact-Checkers
United States: 47
Brazil: 8
France: 7
United Kingdom: 6
South Korea: 5
India: 4
Germany: 4
Ukraine: 4
Canada: 4
Italy: 3
Spain: 3

There’s also growing variety among the fact-checkers. Our database now includes several science fact-checkers, such as Climate Feedback at the University of California Merced’s Center for Climate Communication and Détecteur de Rumeurs from Agence Science-Presse in Montreal. Or there’s New York-based Gossip Cop, an entertainment news fact-checking site led since 2009 by a “reformed gossip columnist.” (Gossip Cop is also another example of a belated discovery that only appeared on our fact-checking radar in the past year.)

As the fact-checking community around the world has grown, so has the International Fact-Checking Network. Launched in 2015, it too is based at Poynter, the new nonprofit home of PolitiFact. The network has established a shared Code of Principles as well as a process for independent evaluators to verify its signatories’ compliance. So far, about a third of the fact-checkers counted in this census, 47 of 149, have been verified.

The IFCN also holds an annual conference for fact-checkers that is co-sponsored by the Reporters’ Lab. There is already a wait list of hundreds of people for this June’s gathering in Rome.

U.S. Fact-Checking

The United States still has far more fact-checkers than any other country, but growth in the U.S. was slower in 2017 than in the past. For the first time, we counted fewer fact-checkers in the United States (47) than there were in Europe (52).

While the U.S. count ticked up slightly from 43 a year ago, some of that increase came from long-running projects that we added to our database for the first time — such as the Los Angeles Times, Newsweek magazine and The Times-Union newspaper in Jacksonville, Florida. Another of those established additions was the first podcast in our database: “Science Vs.” But that was an import. “Science Vs.” began as a project at the Australian public broadcaster ABC in 2015 before it found its U.S. home a year later at Gimlet Media, a commercial podcasting company based in New York.

Among the new U.S. additions are two traditionally conservative media outlets: The Daily Caller (and its fact-checking offshoot Check Your Fact) and The Weekly Standard. To comply with the IFCN’s Code of Principles, both organizations have set up internal processes to insulate their fact-checkers from the reporting and commentary both publications are best known for.

Another new addition was The Nevada Independent, a nonprofit news service that focuses on state politics. Of the 47 U.S. fact-checkers, 28 are regionally oriented, including the 11 state affiliates that partner with PolitiFact.

We originally expected the U.S. number would drop in a year between major elections, as we wrote in December, so the small uptick was a surprise. With this year’s upcoming midterm elections, we expect to see even more fact-checking in the U.S. in 2018.

The Reporters’ Lab is a project of the DeWitt Wallace Center for Media & Democracy at Duke University’s Sanford School of Public Policy. It is led by journalism professor Bill Adair, who was also PolitiFact’s founding editor. The Lab’s staff and student researchers identify and evaluate fact-checkers that specifically focus on the accuracy of statements by public figures and institutions in ways that are fair, nonpartisan and transparent. See this explainer about how we decide which fact-checkers to include in the database. In addition to studying the reach and impact of fact-checking, the Lab is home to the Tech & Check Cooperative, a multi-institutional project to develop automated reporting tools and applications that help fact-checkers spread their work to larger audiences more quickly.

Pop-up fact-checking app Truth Goggles aims to challenge readers’ biases

Dan Schultz, a technologist building a new fact-checking app for the Reporters’ Lab, says the app should be like a drinking buddy.

“You can have a friend who you fundamentally disagree with on a lot of things, but are able to have a conversation,” Schultz says. “You’re not thinking of the other person as a spiteful jerk who’s trying to manipulate you.”

(L-R) Dan Schultz, with Bad Idea Factory’s Ted Han, Carolyn Rupar-Han and Lou Huang, who are working with Schultz to create Truth Goggles. Photo courtesy of Dan Schultz.

Schultz, 31, is using that approach to develop a new version of Truth Goggles, an app he first built eight years ago at the MIT Media Lab, for the Duke Tech & Check Cooperative. His goal is to get to know users and find the most effective way to show them fact-checks. While other Tech & Check apps take a traditional approach by providing Truth-O-Meter ratings or Pinocchios to all users, Schultz plans to experiment with customized formats. He hopes that personalizing the interface will attract new audiences who are put off by fact-checkers’ rating systems.

Truth Goggles is a browser plugin that automatically scans a page for content that users might want fact-checked. Schultz hopes that this unique “credibility layer” will be like a gentle nudge to get people to consider fact-checks.

“The goal is to help people think more carefully and ideally walk away with a more accurate worldview from their informational experiences,” he says.

As a graduate student at the Media Lab, Schultz examined how people interact with media. His 150-page thesis concluded that when people consume information, they are protecting their identities.

Schultz learned that a range of biases make people less likely to change their minds when exposed to new information. Most people simply are unaware of how to consume online content responsibly, he says.

He then set out to use technology to short-circuit biased behavior and help people critically engage with media. The first prototype of Truth Goggles used fact-checks from PolitiFact as a data source to screen questionable claims.

Schultz recently partnered with the Reporters’ Lab to resume working on Truth Goggles. This time, Truth Goggles will be integrated with Share the Facts, so it can access all fact-checking articles formatted using the ClaimReview schema.
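ClaimReview is the open schema.org markup that fact-checkers embed in their article pages, typically as JSON-LD, and it is what Share the Facts builds on. Truth Goggles’ own code isn’t public, but extracting those records from a page is straightforward; the regex and the sample page in this Python sketch are ours, for illustration only.

```python
# A minimal sketch of extracting ClaimReview records from a page's
# JSON-LD; illustrative only, not Truth Goggles' actual code.
import json
import re

LDJSON = re.compile(r'<script type="application/ld\+json">(.*?)</script>',
                    re.DOTALL)

def claim_reviews(html: str) -> list[dict]:
    """Return every ClaimReview object embedded in the page."""
    found = []
    for block in LDJSON.findall(html):
        data = json.loads(block)
        items = data if isinstance(data, list) else [data]
        found.extend(i for i in items if i.get("@type") == "ClaimReview")
    return found

page = """<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "ClaimReview",
 "claimReviewed": "We enacted the biggest tax cuts in American history",
 "reviewRating": {"@type": "Rating", "alternateName": "False"},
 "author": {"@type": "Organization", "name": "PolitiFact"}}
</script>"""

for r in claim_reviews(page):
    print(r["claimReviewed"], "->", r["reviewRating"]["alternateName"])
```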

Schultz also is exploring creative ways to present the information to users. He says the interface must be effective in impeding biases and enjoyable for people to use. As a graduate student, one of Schultz’s initial ideas was to highlight verified claims in green and falsehoods in red. But he quickly realized this solution was not nuanced enough.

“I don’t want people to believe something’s true because it’s green,” he says.

The new version of Truth Goggles will use information about users’ biases to craft messages that won’t trigger their defenses. But Schultz doesn’t know exactly what this will look like yet.

“Can we use interfaces to have a reader challenge their beliefs in ways that just a blunt presentation of information wouldn’t?” Schultz says. “If the medium is the message, how can we shape the way that message is received?”

Born in Cheltenham, Pennsylvania, Schultz studied information systems, computer science and math at Carnegie Mellon University. As a sophomore, he won the Knight News Challenge, which provides grants for “breakthrough ideas in news and information.”

The News Challenge put him “on the path toward eventually applying to the Media Lab and really digging in,” he says.

After graduating from MIT, Schultz worked as a Knight-Mozilla Fellow at the Boston Globe and then joined the Internet Archive, where his title is senior creative technologist. He continues to develop side projects such as Truth Goggles through the Bad Idea Factory, a company with a tongue-in-cheek name that he started with friends. He says the company’s goal is “to make people ‘thinking face’ emoji” by encouraging its technologists to try out creative ideas. With Truth Goggles, he hopes to get people who may not already consume fact-checking content to challenge their own biases.

“The world will fall apart if we don’t improve the way information is consumed through technology,” Schultz says. “It’s sort of like the future of the universe as we know it depends on solving some of these problems.”

What we learned during our experiment with live fact-checking

Except for the moment when we almost published an article about comedian Kevin Hart’s plans for his wedding anniversary, the first test of FactStream, our live fact-checking app, went remarkably smoothly.

FactStream is the first in a series of apps we’re building as part of our Tech & Check Cooperative. We conducted a beta test during Tuesday’s State of the Union address that provided instant analysis from FactCheck.org, PolitiFact and Glenn Kessler, the Washington Post Fact Checker.

Overall, the app functioned quite well. Our users got 32 fact-checks during the speech and the Democratic response. Some were links to previously published checks while others were “quick takes” that briefly explained the relative accuracy of Trump’s claims.

When President Trump said “we enacted the biggest tax cuts and reforms in American history,” users got nearly instant assessments from FactCheck and PolitiFact.

“It is not the biggest tax cut,” said the quick take from FactCheck.org. “It is the 8th largest cut since 1918 as a percentage of gross domestic product and the 4th largest in inflation-adjusted dollars.”

PolitiFact’s post showed a “False” Truth-O-Meter and linked to an October fact-check of a nearly identical claim by Trump. Users of the app could click through to read the October check.

Many of the checks appeared on FactStream seconds after Trump made a statement. That was possible because fact-checkers had an advance copy of the speech and could compose their checks ahead of time.

We had two technical glitches – and unfortunately both affected Glenn. One was a mismatch in the URLs for published Washington Post fact-checks stored in our database, which made it difficult for him to post links to his previous work. We understand the problem and will fix it.
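A typical remedy for this class of mismatch is to canonicalize URLs on both sides before comparing them, so that scheme, capitalization, trailing-slash and tracking-parameter differences cannot break the lookup. The Python sketch below is hypothetical, not our production fix.

```python
# A hypothetical sketch of URL canonicalization, a common remedy for
# lookup failures like this one; it is not our production fix.
from urllib.parse import parse_qsl, urlencode, urlparse

TRACKING = {"utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content"}

def canonical(url: str) -> str:
    """Reduce a URL to a stable form for database matching."""
    p = urlparse(url.strip())
    query = urlencode([(k, v) for k, v in parse_qsl(p.query)
                       if k not in TRACKING])
    return f"{p.netloc.lower()}{p.path.rstrip('/')}" + (f"?{query}" if query else "")

assert canonical("https://www.washingtonpost.com/fact-check/?utm_source=tw") \
    == canonical("http://www.washingtonpost.com/fact-check")
```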

The other glitch was bizarre. Last year we had a hiccup in our Share the Facts database that affected only a handful of our fact-checks. But during Tuesday’s speech we happened to hit one when Glenn got an inadvertent match with an article from the Hollywood rumor site Gossip Cop, another Share the Facts partner. So when he entered the correct URL for his own article about Trump’s tax cut, a fact-check showed up on his screen that said “Kevin Hart and Eniko Parrish’s anniversary plans were made up to exploit the rumors he cheated.”

Oops!

Fortunately Glenn noticed the problem and didn’t publish. (Needless to say, we’re fixing that bug, too.)

This version of FactStream is the first of several we’ll be building for mobile devices and televisions. This one relies on the fact-checkers to listen for claims and then write short updates or post links to previous work. We plan to develop future versions that will be automated with voice detection and high-speed matching to previous checks.

We had about 3,100 people open FactStream over the course of the evening. At the high point we had 1,035 concurrently connected users.

Our team had finished our bug testing and submitted a final version to Apple less than 48 hours before the speech, so we were nervous about the possibility of big crashes. But we watched our dashboard, which monitored the app like a patient in the ICU, and saw that it performed well.

Our goal for our State of the Union test was simple. We wanted to let fact-checkers compose their own checks and see how users liked the app. We invited users to fill out a short form or email us with their feedback.

The response was quite positive. “I loved it — it was timely in getting ‘facts’ out, easy to use, and informative!” Also: “I loved FactStream! I was impressed by how many fact-checks appeared and that all of them were relevant.”

We also got some helpful complaints and suggestions:

Was the app powered by people or an algorithm? We didn’t tell our users who was choosing the claims and writing the “quick takes,” so some people mistakenly thought it was fully automated. We’ll probably add an “About” page in the next version.

More detail for Quick Takes. Users liked when fact-checkers displayed a rating or conclusion on our main “stream” page, which happened when they had a link to a previous article. But when the fact-checkers chose instead to write a quick take, we showed nothing on the stream page except the quote being checked. Several people said they’d like some indication about whether the statement was true, false or somewhere in between. So we’ll explore putting a short headline or some other signal about what the quick take says.

Better notifications. Several users said they would like the option of getting notifications of new fact-checks when they weren’t using the app or had navigated to a different app or website. We’re going to explore how we might do that, recognizing that some people may not want 32 notifications for a single speech.

An indication the app is still live. There were lulls in the speech when there were no factual claims, so the fact-checkers didn’t have anything new to put on the app. But that left some users wondering if the app was still working. We’ll explore ways we can indicate that the app is functioning properly.
