

New Share the Facts widget helps facts – rather than falsehoods – go viral

The Duke Reporters’ Lab is introducing Share the Facts, a widget that provides a new way for readers to share fact-check articles and spread them virally across the Internet.

The compact Share the Facts box summarizes the claim being checked and the fact-checker’s conclusion in a mobile-friendly format. The widgets have a consistent look but can be customized with the fact-checkers’ logos and ratings such as Pinocchios or the Truth-O-Meter. The standardization allows readers to recognize fact-checking whenever they come across it on the web, and to spread it by posting Share the Facts boxes on social media or embedding them in articles and blog posts.

The widget summarizes fact-checks and allows readers to click to the original article.

Fact-checkers can create Share the Facts boxes using a simple template developed by the Reporters’ Lab. The form generates the HTML of the box that can be pasted into content management systems or embedded in the same way as Tweets. Share the Facts boxes are also fully machine-readable, enabling new ways of assembling automated collections of fact-check findings from across the Internet. For example, someone could set up a page that compiles Share the Fact boxes from a single event or a particular candidate.
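To illustrate what that machine-readability makes possible, here is a minimal Python sketch of the aggregation idea: it scans a page for embedded fact-check data and prints each claim with its rating. It assumes the boxes expose their contents as schema.org-style ClaimReview records in JSON-LD script tags (an assumption; the article does not specify the markup format), and the URL and function names are hypothetical placeholders.

```python
import json
from html.parser import HTMLParser
from urllib.request import urlopen


class JsonLdExtractor(HTMLParser):
    """Collects the contents of <script type="application/ld+json"> tags."""

    def __init__(self):
        super().__init__()
        self.blocks = []
        self._in_jsonld = False

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld:
            self.blocks.append(data)


def fact_checks_from(url):
    """Yield (source, claim, rating) for each machine-readable box on a page."""
    parser = JsonLdExtractor()
    parser.feed(urlopen(url).read().decode("utf-8"))
    for block in parser.blocks:
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            continue  # skip malformed blocks
        for item in data if isinstance(data, list) else [data]:
            if item.get("@type") == "ClaimReview":  # assumed record type
                yield (
                    item.get("author", {}).get("name"),
                    item.get("claimReviewed"),
                    item.get("reviewRating", {}).get("alternateName"),
                )


# Hypothetical usage: compile every box found on a debate roundup page.
for source, claim, rating in fact_checks_from("https://example.com/debate-roundup"):
    print(f'{source}: "{claim}" -> {rating}')
```

Because the boxes are structured data rather than free-form HTML, an aggregator built along these lines never has to guess where the claim ends and the ruling begins.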

Share the Facts will be helpful to columnists and bloggers because they’ll be able to compile and display several boxes for a debate or a candidate the same way they embed tweets.

Share the Facts was developed by The Reporters’ Lab and Jigsaw, a technology incubator within Alphabet, the parent company of Google.

The widgets are customized with the logo of the fact-checking site.

The widget has been tested in the past few weeks by The Washington Post, PolitiFact and FactCheck.org. The Reporters’ Lab has been incorporating feedback from those sites and will be making the widget available to other fact-checking sites this spring and summer.

“We are excited to participate in the Share the Facts project,” said Eugene Kiely, director of FactCheck.org. “It gives voters the ability to more easily share fact-checking stories and find fact-checking stories.”

Glenn Kessler, the editor and chief writer of The Washington Post’s Fact Checker, said it “will be a terrific tool for readers to share the results of our fact-checking. In this exciting, fact-challenged campaign year, I expect it will expand the reach and impact of our work.”

For articles from FactCheck.org and other sites that don’t use rating systems, the widget can include a short text explaining the conclusion.

Said Aaron Sharockman, the executive director of PolitiFact: “Share the Facts is part of the antidote to the massive spread of misinformation. We all know how quickly falsehoods can spread on the Internet. Now readers have a simple tool to fight back with facts.”

For more information, go to www.sharethefacts.org.


SciCheck puts political claims under a microscope

Developmental psychologists say that most nine-month-olds are just learning how to roll over and utter familiar sounds.* But nine months in, the SciCheck fact-checking channel is standing upright and — to the delight of its proud parents — loudly challenging the politicians it catches toying with science.

SciCheck is part of FactCheck.org, a 12-year-old journalism project run by the Annenberg Public Policy Center at the University of Pennsylvania. The new science-oriented feature launched in January to expose “false and misleading scientific claims.” With a new Congress and a competitive presidential campaign, there’s been no shortage of material — from the impact of medical marijuana and genomic research to the environmental consequences of volcanoes and barbeques.

The initial funding for SciCheck came in the form of a $102,000 grant from the Stanton Foundation, a philanthropic organization created in the name of longtime CBS president Frank Stanton and his wife. Now, as part of a new $150,000 grant, the foundation will keep SciCheck in business through the upcoming election year.

Annenberg Director Kathleen Hall Jamieson has said she got the idea for SciCheck during the early stages of the 2012 presidential race, after hearing Republican candidate Michele Bachmann make false claims about the HPV vaccine. Four years later, when Republicans Rand Paul and Carly Fiorina suggested that vaccines pose unproven dangers, SciCheck was there to call them out.

The science section’s sole fact-checker is Dave Levitan. He was a freelance writer for nearly ten years after earning a master’s in journalism from New York University, where he also received a certificate in science, health and environmental reporting. SciCheck is Levitan’s first job as a staff member. “The biggest transition was just going into an office every day,” he said. “I’d been used to working from home.”

While other big-time national fact-checkers — such as PolitiFact and the Washington Post Fact Checker — occasionally check scientific claims, FactCheck.org has “a dedicated area of a site and a dedicated person” to focus on those topics, Levitan explained. “I don’t cover much besides science.”

SciCheck closely follows the usual FactCheck.org format for challenging a politician’s statement. Unlike fact-checkers that examine both true and false claims, SciCheck only reviews statements it suspects are false or misleading.

Levitan translates scientific studies and evidence into more accessible terms to confute the politicians — a process that can send their spokespeople and spin doctors toggling from Politico to Scientific American.

Scientific topics can draw in a different kind of reader than a typical political fact-check. “It may be a different, specific group that’s interested in science, but SciCheck really is just part of the site, focused on scientific topics,” Levitan said. “I write a little bit differently than someone writing about jobs or immigration, but that’s just the nature of the topic, not necessarily a focus on the audience.”

Despite SciCheck’s nonpartisan status, a majority of its fact-checks have been about Republican claims. Of the 35 stories published so far, only five focused on Democrats’ false or misleading statements, while 25 concentrated on Republicans’. (The remaining posts included a fact-check that covered statements from both parties and video recaps of previous stories).

Levitan’s boss and FactCheck.org’s director, Eugene Kiely, said he is not concerned about the disparity. “Generally speaking, we don’t keep score,” Kiely said. “Our job is to give voters the facts and counter partisan misinformation. We apply the exact same standards of accuracy to claims made by each side, and let the chips fall where they may.”

Levitan attributed the disparity to the Republican presidential candidates’ domination of media coverage. “I think a big part of it is that there are more Republican candidates, so they do a lot of campaign events,” he noted. That means “there are just more opportunities” to get things wrong.

“We are nonpartisan and will cover absolutely anything either party says,” Levitan said. “So if they get science wrong, I’ll cover it.”

Twelve SciCheck stories to date focused specifically on climate change — nine on Republicans’ claims and three on Democrats’. On that issue, Levitan said, “It does seem like there’s more skepticism among politicians than there is even among their constituents.”

The truth is, climate change — like many scientific questions SciCheck covers — is a partisan issue. A 2015 study conducted by the Pew Research Center, for instance, found that 71 percent of Americans who lean Democratic believe global warming is due to human activity, compared to 27 percent of those who lean Republican.

For fact-checkers, separating the science from the politics and putting claims in context is important. “Sometimes what we write about isn’t necessarily that you got a number wrong but is that you’re spinning a given fact to fit your narrative,” Levitan said.

To broaden its audience, FactCheck.org is seeking new distribution outlets for SciCheck’s works, with stories already picked up by Discover Magazine, EcoWatch.org and the Consortium of Social Science Associations. “I expect that we will expand that further during the 2016 presidential campaign, since we typically get more traffic and attention in presidential years,” said FactCheck.org’s Kiely.

Answers in science aren’t always clear cut. But with SciCheck coming of age this election cycle, voters may have a better guide to help them sort science fact from science fiction.

______________________

* We checked.


Fact-checkers spin up for presidential debates

Fact-checking season is underway, and some new players are getting into the act.

FiveThirtyEight, NPR, Vox and Politico unveiled new fact-checking features for the presidential debates that began last month. Others revived their truth-seeking teams, joining usual suspects such as FactCheck.org, the Washington Post and PolitiFact in their perennial efforts to verify what politicians are saying.

The fact-checkers often focus on the same claims, but coverage of last week’s Republican debates in California showed the variety of approaches they use to explain their findings. In its coverage, CNN rated statements on a scale similar to PolitiFact’s Truth-O-Meter, while the New York Times and NPR chose to work without a grading system, following the FactCheck.org model.

CNN said its Fact-Checking Team “picked the juiciest statements, analyzed them, consulted issue experts and then rated them.”

As in last month’s first debates, hosted by Fox News, the Post set aside its four-Pinocchio scale, offering a single scrolling summary of multiple fact-checks before following up with additional posts in its usual style. Politico’s Wrongometer, CNN and NPR used similar models. Others posted individual items about specific claims or packaged a number of individually linkable fact-checks together as a combined reading experience. There were also efforts to do some real-time fact-checking while the debates were underway.

Here’s a roundup from last week’s two-round Republican debate, which included a primetime showdown with 11 candidates and an earlier session with four others:

CNN: The debate host’s “Fact-Checking Team” checked 16 claims and awarded them rulings from “True” to “It’s Complicated” to “False.” The “It’s Complicated” rating was awarded to Kentucky Sen. Rand Paul, who said Saudi Arabia was not accepting any Syrian refugees, and Texas Sen. Ted Cruz, for statements he made regarding the Iran nuclear agreement.

NPR: The radio network fact-checked four claims as part of its new “Break it Down” segment — all involving statements by or in response to Donald Trump. The claims ranged from the real estate developer’s lobbying for casinos in Florida to the safety of vaccination. NPR didn’t rate the claims on a scale and instead explained the validity of comments.

New York Times: The Times examined 11 claims, including topics from Planned Parenthood to immigration policy. Like NPR, the Times did not use a rating system. It did, however, post its fact-checks during the debate as part of its live coverage. Many of its checks focused on Trump and Ben Carson, the retired pediatric neurosurgeon whose outsider status had helped him climb in the polls after the August debate on Fox News.

Politico: The Agenda, Politico’s policy channel, applied its Wrongometer to 12 claims, focusing on topics such as Trump’s bankruptcy and President Obama’s nuclear agreement with Iran. The group also scrutinized former Hewlett-Packard CEO Carly Fiorina’s remarks about Syria and a much-repeated Columbine myth. Despite its Wrongometer header, Politico’s fact-checkers do not use a rating system.

Vox: Rather than the relatively short, just-the-facts summations most other fact-checkers posted, Vox penned full-length commentaries on a handful of claims. Two featured statements by Fiorina (one about Planned Parenthood, linked here, and another on her time at HP), and one checked the candidates’ views on vaccinations. No rating was used.

AP: The news service fact-checked five claims, including statements from Fiorina on Planned Parenthood and the effects of Trump’s plan for an economic “uncoupling” from China. The AP did not use a system to rate these claims.

FiveThirtyEight: The site did its fact-checking in its debate live blog. FiveThirtyEight’s staff did not use any sort of rating system in its real-time reviews of the candidates’ statements, such as Trump’s claim about Fiorina’s track record as CEO of HP and President Obama’s likability overseas.

FactCheck.org: The fact-checkers based at the University of Pennsylvania’s Annenberg Public Policy Center reviewed 14 claims from the debates. FactCheck.org did not rate the claims, which ranged from former Arkansas Gov. Mike Huckabee’s statements about Hillary Clinton’s email scandal to Trump’s comments on Wisconsin’s budget under Gov. Scott Walker.

PolitiFact: Run by the Tampa Bay Times, Washington-based PolitiFact has fact-checked 15 debate claims so far, awarding rulings from “Pants on Fire” to “True.” The “Pants on Fire” rating went to Carson, who said that many pediatricians recognize the potential harm from too many vaccines. The site also awarded a “True” rating to Fiorina’s statement regarding the potency of marijuana. While the debate was underway, the PolitiFact staff tapped its archive of previous rulings to live-blog the event.

The Washington Post Fact Checker: The Post’s two-person fact-checking team reviewed 18 claims in a roundup that included Trump’s denial that he’d ever gone bankrupt and New Jersey Gov. Chris Christie’s story about being named U.S. attorney by President George W. Bush on Sept. 10, 2001. The fact-checkers also posted versions of those items in the Post’s debate-night live blog. Following its usual practice for debates, the Post did not use its Pinocchio system to rate these claims. But since the debate, the Post has added more Pinocchio-based fact-checks, including items on Fiorina’s criticisms of veterans’ health care (two Pinocchios) and Rubio’s comments on North Korea’s nuclear capabilities (one Pinocchio). Notably, both of those items were suggested by Post readers.


Study explores new questions about quality of global fact-checking

How long should fact-checks be? How should they attribute their sources — with links or a detailed list? Should they provide a thorough account of a fact-checker’s work or distill it into a short summary?

Those are just a few of the areas explored in a fascinating new study by Lucas Graves, a journalism professor at the University of Wisconsin. He presented a summary of his research last month at the 2015 Global Fact-Checking Summit in London.

Lucas Graves

The pilot project represents the first in-depth qualitative analysis of global fact-checking. It was funded by the Omidyar Network as part of its grant to the Poynter Institute to create a new fact-checking organization. The study, done in conjunction with the Reporters’ Lab, lays the groundwork for a more extensive analysis of additional sites in the future.

The findings reveal that fact-checking is still a new form of journalism with few established customs or practices. Some fact-checkers write long articles with lots of quotes to back up their work. Others distill their findings into short articles without any quotes. Graves did not take a position on which approach is best, but his research gives fact-checkers some valuable data to begin discussions about how to improve their journalism.

Graves and three research assistants examined 10 fact-checking articles from each of six different sites: Africa Check, Full Fact in the United Kingdom, FactChecker.in in India, PolitiFact in the United States, El Sabueso in Mexico and UYCheck in Uruguay. The sites were chosen to reflect a wide range of global fact-checking.


Graves and his researchers found a surprising range in the length of the fact-checking articles. UYCheck from Uruguay had the longest articles, with an average word count of 1,148, followed by Africa Check at 1,009 and PolitiFact at 983.

The shortest were from Full Fact, which averaged just 354 words. They reflected a very different approach by the British team. Rather than lay out the factual claims and back them up with extensive quotes the way most other sites do, the Full Fact approach is to distill them down to short summaries.

Graves also found a wide range of data visualization in the articles sampled for each site. For example, Africa Check had three data visualizations in its 10 articles, while there were 11 in the Indian site FactChecker.in.

Graves found some sites used lots of data visualizations; others used relatively few.

The Latin American sites UYCheck and El Sabueso used the most infographics, while the other sites relied more on charts and tables.

Graves also found a wide range in the use of web links and quotes. Africa Check averaged the highest total of web links and quotes per story (18), followed by 12 for PolitiFact, while UYCheck and El Sabueso had the fewest (8 and 5, respectively). Full Fact had no quotes in the 10 articles Graves examined but used an average of 9 links per article.
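As a rough illustration of the tabulation behind averages like these, here is a small Python sketch that computes per-site link and quote averages. The per-article counts below are invented placeholders for illustration, not data from Graves’s study.

```python
from statistics import mean

# Invented placeholder counts of (links, quotes) per sampled article.
# These are NOT the study's data; they only illustrate the tabulation.
samples = {
    "Africa Check": [(10, 8), (12, 6), (9, 9)],
    "PolitiFact":   [(7, 5), (6, 6), (8, 4)],
    "Full Fact":    [(9, 0), (8, 0), (10, 0)],
}

for site, articles in samples.items():
    avg_links = mean(links for links, _ in articles)
    avg_quotes = mean(quotes for _, quotes in articles)
    print(f"{site}: {avg_links:.1f} links, {avg_quotes:.1f} quotes, "
          f"{avg_links + avg_quotes:.1f} combined per article")
```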

Graves and his researchers also examined how fact-checkers use links and quotes — whether they were used to provide political context about the claim being checked, to explain the subject being analyzed or to provide evidence about whether the claim was accurate. They found some sites, such as Africa Check and PolitiFact, used links more to provide context for the claim, while UYCheck and El Sabueso used them more for evidence in supporting a conclusion.

The analysis of quotes yielded some interesting results. PolitiFact used the most in the 10 articles — 38 quotes — with its largest share from evidentiary uses. Full Fact used the fewest (zero), followed by UYCheck (23) and El Sabueso (26).

The study also examined what Graves called “synthetic” sources — the different authoritative sources used to explain an issue and decide the accuracy of a claim. This part of the analysis distilled a final list of institutional sources for each fact-check, regardless of whether sources were directly quoted or linked to. Africa Check led the list with almost nine different authoritative sources considered on average, more than twice as many as FactChecker.in and UYCheck. Full Fact, UYCheck, and El Sabueso relied mainly on government agencies and data, while PolitiFact and Africa Check drew heavily on NGOs and academic experts in addition to official data.

The study raises some important questions for fact-checkers to discuss. Are we writing our fact-checks too long? Too short?

Are we using enough data visualizations to help readers? Should we take the time to create more infographics instead of simple charts and tables?

What do we need to do to give our fact-checks authority? Are links sufficient? Or should we also include quotes from experts?

Over the next few months, we’ll have plenty to discuss.


Fact-Checking Census finds continued growth around the world

Fact-checking keeps growing around the world, with new sites in countries such as Turkey, Uruguay and South Korea.

The 2015 Fact-Checking Census from the Duke Reporters’ Lab found 89 sites that have been active in the past few years and 64 that are active today. That’s up from 59 total (44 active) when we did our last count in May 2014. (We include inactive sites in our total count because sites come and go with election cycles. Some news organizations and journalism NGOs only fact-check during election years.)

Many of the additional sites have started in the last seven months, including UYCheck in Uruguay and Dogruluk Payi in Turkey. Others are sites that we didn’t find when we did our first count.

You can see the complete list on the fact-checking page of the Reporters’ Lab website, where you can browse by continent and country.

As with our last tally, the largest concentrations of fact-checking are in Europe and North America. We found 38 sites in Europe (including 27 active), 30 in North America (22 active) and seven in South America (five active). There are two new sites in South Korea.

The Truth or False Poll in South Korea enlists readers to help with fact-checking.

The percentage of sites that use ratings continues to grow, up from about 70 percent in last year’s count to 80 percent today. Many rating systems use a true-to-false scale, while others have devised more creative names. For example, ratings for the European site FactCheckEU include “Rather Daft” and “Insane Whopper.” Canada’s Baloney Meter rates statements from “No Baloney” to “Full of Baloney.”

We found that 56 of the 89 sites are affiliated with news organizations such as newspapers and television networks. The other 33 are dedicated fact-checking sites, such as FactCheck.org in the United States and Full Fact in Great Britain.

Almost one-third of the sites (29 of the 89) track the campaign promises of elected officials. Some, such as the Rouhani Meter for Iran’s President Hassan Rouhani, only track campaign promises. Others, such as PolitiFact in the United States, do promise-tracking in addition to fact-checking.

For more information about the Reporters’ Lab database, contact Bill Adair at bill.adair@duke.edu.
