
Tag: Rating systems

Misinformation spreads, but fact-checking has leveled off

While much of the world’s news media has struggled to find solid footing in the digital age, the number of fact-checking outlets rocketed steadily upward for years — from a mere 11 sites in 2008 to 424 in 2022.

But the long strides of the past decade and a half have slowed to a more trudging pace, despite increasing concerns around the world about the impact of manipulated media, political lies and other forms of dangerous hoaxes and rumors.

In our 10th annual fact-checking census, the Duke Reporters’ Lab counts 417 fact-checkers that are active so far in 2023, verifying and debunking misinformation in more than 100 countries and 69 languages.

While the count of fact-checkers routinely fluctuates, the current number is roughly the same as it was in 2022 and 2021.

In more blunt terms: Fact-checking’s growth seems to have leveled off.

2023 fact-checkers by year

Since 2018, the number of fact-checking sites has grown by 47%. While that is an increase of 135 outlets, the pace is far slower than in the preceding five years, when the count grew more than two and a half times, and slower still compared with the six-fold increase over the five years before that.

There also are important regional patterns. With lingering public health issues, climate disasters, and Russia’s ongoing war with Ukraine, factual information is still hard to come by in important corners of the world.

Before 2020, there was a significant growth spurt among fact-checking projects in Africa, Asia, Europe and South America. At the same time, North American fact-checking began to slow. Since then, growth in the fact-checking movement has plateaued in most of the world.

Fact-checkers by continent

The Long Haul

One good sign for fact-checking is the sustainability and longevity of many key players in the field. Almost half of the fact-checking organizations in the Reporters’ Lab count have been active for five years or more. And roughly 50 of them have been active for 10 years or more.

The average lifespan of an active fact-checking site is less than six years. The average lifespan of the 139 fact-checkers that are now inactive was not quite three years.

But the baby boom has ended. Since 2019, when a bumper crop of 83 fact-checkers went online, the number of new sites each year has steadily declined. The Reporters’ Lab count for 2022 is at 20, plus three additions in 2023 as of this June. Taken together, those 23 newcomers represent a drop of roughly 72% from three years earlier.

The number of fact-checkers that closed down in that same period also declined, but not as dramatically. That means the net count of new and departing sites has gone from 66 in 2019 to 11 in 2022, plus one addition so far in 2023.

Net New Fact-checkers
Note: The Reporters’ Lab updates our counts regularly as we identify and add new sites to our fact-checking database. However, the differences over time have been small. The average yearly change between our 2022 and 2023 census counts is 2.

Fact-checking Starts, Stops and Net

The Downshift

As was the case for much of the world, the pandemic period certainly contributed to the slower growth. But another reason is the widespread adoption of fact-checking by journalists and other researchers from nonpartisan think tanks and good-government groups in recent years. That has broadened the number of people doing fact-checking but created less need for news organizations dedicated to the unique form of journalism.

With teams working in 108 countries, just over half of the nations represented in the United Nations have at least one organization that already produces fact-checks for digital media, newspapers, TV reports or radio. So in some countries, the market for fact-checks may be nearing saturation. As of June, there are 71 countries that have more than one fact-checker.

Number of Fact-checkers Per Country

Another reason for the slower pace is that launching new fact-checking projects is challenging — especially in countries with repressive governments, limited press freedom and safety concerns for journalists … in other words, places where fact-checking is most needed.

The 2023 World Press Freedom Index rates press conditions as “very serious” in 31 countries. And almost half of those countries (15 of 31) do not have any fact-checking sites. They are Bahrain, Djibouti, Eritrea, Honduras, Kuwait, Laos, Nicaragua, North Korea, Oman, Russia, Tajikistan, Turkmenistan, Vietnam and Yemen. The Index also includes Palestine, the status of which is contested.

Remarkably, there are 62 fact-checking services in the 16 other countries on the “very serious” list. And in eight of those countries, there is more than one site. India, which ranks 161 out of 180 in the World Press Freedom Index, is home to half of those 62 sites. The other countries with more than one fact-checking organization are Bangladesh, China, Venezuela, Turkey, Pakistan, Egypt and Myanmar.

In some cases, fact-checkers from those countries must do their work from other parts of the world or hide their identities to protect themselves or their families. That typically means those journalists and researchers must work anonymously or report on a country as expats from somewhere else. Sometimes both.

At least three fact-checking teams from the Middle East take those precautions: Fact-Nameh (“The Book of Facts”), which reports on Iran from Canada; Tech 4 Peace, an Iraq-focused site that also has members who work in Canada; and Syria’s Verify-Sy, whose staff includes people who operate from Turkey and Europe.

Two other examples are elTOQUE DeFacto, a project of a Cuban news website that is legally registered in Poland; and the fact-checkers at the Belarusian Investigative Center, which is based in the Czech Republic.

In other cases, existing fact-checking organizations have also established separate operations in difficult places. The Indian sites BOOM and Fact Crescendo have set up fact-checking services in Bangladesh, while the French news agency Agence France-Presse (AFP) has fact-checkers that report on misinformation from Hong Kong, India and Myanmar, among other places.

There still are places where fact-checking is growing, and much of that has to do with organizations that have multiple outlets and bureaus — such as AFP, as noted above. The French international news service has about 50 active sites aimed at audiences in various countries and various languages.

India-based Fact Crescendo launched two new channels in 2022 — one for Thailand and another focused broadly on climate issues. Counting two other outlets launched the previous year, Fact Crescendo now has a total of eight sites.

The 2022 midterm elections in the United States added six new local fact-checking outlets to our global tally, all at the state level. Three of the new fact-checkers were the Arizona Center for Investigative Reporting, The Nevada Independent and Wisconsin Watch, all of which used a platform called Gigafact that helped generate quick-hit “Fact Briefs” for their audiences. But the Arizona Center is no longer participating. (For more about the 2022 U.S. elections see “From Fact Deserts to Fact Streams” — a March 2023 report from the Reporters’ Lab.)

About the Reporters’ Lab and Its Census

The Duke Reporters’ Lab began tracking the international fact-checking community in 2014, when director Bill Adair organized a group of about 50 people who gathered in London for what became the first Global Fact meeting. Subsequent Global Facts led to the creation of the International Fact-Checking Network and its Code of Principles.

The Reporters’ Lab and the IFCN use similar criteria to keep track of fact-checkers, but use somewhat different methods and metrics. Here’s how we decide which fact-checkers to include in the Reporters’ Lab database and census reports. If you have questions, updates or additions, please contact Mark Stencel, Erica Ryan or Joel Luther.

* * *

Related links: Previous fact-checking census reports

April 2014: https://reporterslab.org/duke-study-finds-fact-checking-growing-around-the-world/

January 2015: https://reporterslab.org/fact-checking-census-finds-growth-around-world/

February 2016: https://reporterslab.org/global-fact-checking-up-50-percent/

February 2017: https://reporterslab.org/international-fact-checking-gains-ground/

February 2018: https://reporterslab.org/fact-checking-triples-over-four-years/

June 2019: https://reporterslab.org/number-of-fact-checking-outlets-surges-to-188-in-more-than-60-countries/

June 2020: https://reporterslab.org/annual-census-finds-nearly-300-fact-checking-projects-around-the-world/

June 2021: https://reporterslab.org/fact-checking-census-shows-slower-growth/

June 2022: https://reporterslab.org/fact-checkers-extend-their-global-reach-with-391-outlets-but-growth-has-slowed/


PolitiFact at 15: Lessons about innovation, the Truth-O-Meter and pirates

Fifteen years ago, I worked with a small group of reporters and editors at the St. Petersburg Times (now the Tampa Bay Times) to start something bold: a fact-checking website that called out politicians for being liars. 

That concept was too gutsy for the Times political editor. Sure, he liked the idea, he said at a meeting of editors, “but I want nothing to do with it.”

That was my first lesson that PolitiFact was going to disrupt the status quo, especially for political journalists. Back then, most of them were timid about calling out lies by politicians. They were afraid fact-checking would displease the elected officials they covered. I understood his reluctance because I had been a political reporter for many years. But after watching the lying grow in the early days of the internet, I felt it was time for us to change our approach.

Today, some political reporters have developed more courage, but many still won’t call out the falsehoods they hear. PolitiFact does. So I’m proud it’s going strong.

It’s now owned by the Poynter Institute, and it has evolved with the times. As a proud parent, allow me to brag: PolitiFact has published more than 22,000 fact-checks, won a Pulitzer Prize and sparked a global movement for political fact-checking. Pretty good for a journalism org that’s not even old enough to drive. 

On PolitiFact’s 15th birthday, I thought it would be useful to share the lesson about disruption and a few others from my unusual journey through American political journalism. Among them:

Gimmicks are good

The Truth-O-Meter — loved by many and loathed by some — is at times derided as a mere gimmick. I used to bristle at that word. Now I’m fine with it. 

My friend Brooks Jackson, the co-founder of FactCheck.org, often teased me about the meter. That teasing culminated in a farewell essay that criticized “inflexible rating systems” like our meter because they were too subjective. 

I agree with Brooks to an extent. Summarizing a complex fact-check to a rating such as Half True is subjective. But it’s a tremendous service to readers who may not want to read a 1,000-word fact-check article. What’s more, while it relies on the judgment of the journalists, it’s not as subjective as some people think. Each fact-check is thoroughly researched and documented, and PolitiFact has a detailed methodology for its ratings. 

Yes, the Truth-O-Meter is a gimmick! (I once got recognized in an airport by a lady who had seen me on TV and said, “You’re the Truth-O-Meter guy!”) But its ratings are the product of PolitiFact’s thorough and transparent journalism. It’s a gimmick with substance. 

Empower the pirates

I was the founding editor, the guy with the initial ideas and some terrible sketches (my first design had an ugly rendering of the meter with “Kinda True” scribbled above). But the editors at the Tampa Bay Times believed in the idea enough to assign other staffers who had actual talent, including a spirited data journalist named Matt Waite and a marvelous designer named Martin Frobisher.

Times Executive Editor Neil Brown, now president of Poynter, gave us freedom. He cut me loose from my duties as Washington bureau chief so I could write sample fact-checks. Waite and Frobisher were allowed to build a website outside the infrastructure of the Times website so we had a fresh look and more flexibility to grow.

We were like a band of pirates, empowered to be creative. We were free of the gravitational pull of the Times, and not bound by its rules and conventions. That gave us a powerful spirit that infused everything we did. 

Design is as important as content

We created PolitiFact at a time when political journalism, even on the web, was just words or pictures. But we spent as much energy on the design as on the journalism.

The PolitiFact home page from August 2007 had a simple design. Source: Wayback Machine – Internet Archive

It was 2007 and we were an American newspaper, so our team didn’t have a lot of experience or resources. But we realized that we could use the design to help explain our unique journalism. The main section of our homepage had a simple look — the face of the politician being checked (in the style of a campaign button), the statement the politician made and the Truth-O-Meter showing the rating they earned. We also created report cards so readers could see tallies of how many True, Half True, False and other ratings a politician had earned.

The design not only guided readers to our fact-check articles, it told the story as much as the words.

Twitter is not real life

My occasional bad days as editor always seemed more miserable because of Twitter. If we made an error or just got attacked by a partisan group, it showed up first and worst on Twitter.

I stewed over that. Twitter made it seem like the whole world hated us. The platform doesn’t foster a lot of nuance. You’re loved or hated. I got so caught up in it that when I left the office to go to lunch, I’d look around and have irrational thoughts about whether everyone had been reading the tweets and thought I was an idiot. 

But then when I went out with friends or talked with my family, I realized that real people don’t use Twitter. It’s largely a platform for journalists and the most passionate (read: angry) political operatives. My friends and family never saw the attacks on us, nor would they care if they did.

So when the talk on Twitter turned nasty (which was often), I would remind our staff: Twitter is not real life.

People hate referees

My initial sketch of the website was called “The Campaign Referee” because I thought it was a good metaphor for our work: We were calling the fouls in a rough and tumble sport. But Times editors vetoed that name… and I soon saw why.

People hate referees! On many days, it seemed PolitiFact made everyone mad! 

Bill Adair’s original sketch of “The Campaign Referee”

That phenomenon became clearer in 2013 when I stepped down as editor and came to Duke as a journalism professor. I became a Duke basketball fan and quickly noticed the shoddy work of the referees in the Atlantic Coast Conference. THEY ARE SO UNFAIR! Their calls always favor the University of North Carolina! What’s the deal? Did all the refs attend UNC?

Seek inspiration in unlikely places

When we expanded PolitiFact to the states (PolitiFact Wisconsin, PolitiFact Florida, etc.), our model was similar to fast-food franchises. We licensed our brand to local newspapers and TV and radio stations and let them do their own fact-checks using our Truth-O-Meter. 

That was risky. We were allowing other news organizations to use our name and methods. If they did shoddy work, it would damage our brand. But how could we protect ourselves?

I got inspiration from McDonald’s and Subway. I assigned one of our interns to write a report about how those companies ensured quality as they franchised. The answers: training sessions, manuals that clearly described how to consistently make the Big Macs and sandwiches, and quality control inspectors.

We followed each recommendation. I conducted detailed training sessions for the new fact-checkers in each town and then checked the quality by taking part in the editing and ratings for several weeks. 

I gave every fact-checker “The Truth-O-Meter Owner’s Manual,” a detailed guide to our journalism that reflected our lighthearted spirit (It began: “Congratulations on your purchase of a Truth-O-Meter! If operated and maintained properly, your Truth-O-Meter will give you years of enjoyment! But be careful because incorrect operation can cause an unsafe situation.”)

Adjust to complaints and dump the duds

We made adjustments. We had envisioned Pants on Fire as a joke rating (the first one was on a Joe Biden claim that President Bush was brain-dead), but readers liked the rating so much that we decided to use it on all claims that were ridiculously false. (There were a lot!)

In the meantime, though, we lost enthusiasm for the animated GIF for Pants on Fire. The burning Truth-O-Meter was amusing the first few times you saw it, but then … it was too much. Pants on Fire is now a static image.

As good as our design was, one section on the home page called the Attack File was too confusing. It showed the person making the attack as well as the individual being attacked. But readers didn’t grasp what we were doing. We 86’d the Attack File.

Initially, the rating between Half True and False was called Barely True, but many people didn’t understand it – and the National Republican Congressional Committee once distorted it. When the NRCC earned a Barely True, the group boasted in a news release, “POLITIFACT OHIO SAYS TRUE.” 

Um, no. We changed the rating to Mostly False. We also rated the NRCC’s news release. This time: Pants on Fire!


Fact-checkers extend their global reach with 391 outlets, but growth has slowed

The number of fact-checkers around the world doubled over the past six years, with nearly 400 teams of journalists and researchers taking on political lies, hoaxes and other forms of misinformation in 105 countries.

The Duke Reporters’ Lab annual fact-checking census counted 391 fact-checking projects that were active in 2021. Of those, 378 are operating now.

That’s up from a revised count of 186 active sites in 2016 – the year when the Brexit vote and the U.S. presidential election elevated global concerns about the spread of inaccurate information and rumors, especially in digital media. Misleading posts about ethnic conflicts, wars, the climate and the pandemic only amplified those worries in the years since.

Since last year’s census, we have added 51 sites to our global fact-checking map and database. In that same 12 months, another seven fact-checkers closed down.

While this vital journalism now appears in at least 69 languages on six continents, the pace of growth in the international fact-checking community has slowed over the past several years.

The largest growth was in 2019, when 77 new fact-checking sites and organizations made their debut. Based on our updated counts since then, the number was 58 in 2020 and 22 last year. 

New Fact Checkers by Year


(Note: The adjusted number of 2021 launches may increase over time as the Reporters’ Lab identifies other fact-checkers we have not yet discovered.)

These numbers may be a worrisome trend, or they could mean that the growth of the past several years has saturated the market – or paused in the wake of the global pandemic. But we also expect our numbers for last year to eventually increase as we continue to identify other fact-checkers, as happens every year.

More than a third of the growth since 2019’s bumper crop came from existing fact-checking operations that added new outlets to expand their reach to new places and different audiences. That includes Agence France-Presse, the French international news service, which launched at least 17 new sites in that period. In Africa, Dubawa and PesaCheck opened nine new bureaus, while Asia’s Boom and Fact Crescendo opened five. In addition, Delfi and Pagella Politica in Europe and PolitiFact in North America each launched a new satellite, too.

Fact-checking has expanded quickly over the years in Latin America, but less so of late. Since 2019 we saw three launches in South America (one of which has folded) plus one more focused on Cuba. 

Active Fact-Checkers by Year


The Reporters’ Lab is monitoring another trend: fact-checkers’ use of rating systems. These ratings are designed to succinctly summarize a fact-checker’s conclusions about political statements and other forms of potential misinformation. When we analyzed the use of these features in past reports, we found that about 80-90% of the fact-checkers we looked at relied on these meters and standardized labels to prominently convey their findings.

But that approach appears to be less common among newer fact-checkers. Our initial review of the fact-checkers that launched in 2020 found that less than half seemed to be using rating systems. And among the Class of 2021, only a third seemed to rely on predefined ratings. 

We also have seen established fact-checkers change their approach in handling ratings.

Examples of fact-checking meters from Público’s Prova dos Factos in Portugal, the Fact Investigation Platform’s Factometer in Armenia, OhmyFact from South Korea’s OhmyNews, and Nepal Fact Check from the Center for Media Research-Nepal.

The Norwegian fact-checking site Faktisk, for instance, launched in 2017 with a five-point, color-coded rating system that was similar to ones used by most of the fact-checkers we monitor: “helt sant” (for “absolutely true” in green) to “helt feil” (“completely false” in red). But during a recent redesign, Faktisk phased out its ratings. 

“The decision to move away from the traditional scale was hard and subject to a very long discussion and consideration within the team,” said editor-in-chief Kristoffer Egeberg in an email. “Many felt that a rigid system where conclusions had to ‘fit the glove’ became kind of a straitjacket, causing us to either drop claims that weren’t precise enough or too complex to fit into one fixed conclusion, or to instead of doing the fact-check – simply write a fact-story instead, where a rating was not needed.”

Egeberg also noted that sometimes “the color of the ratings became the main focus rather than the claim and conclusion itself, derailing the important discussion about the facts.”

We plan to examine this trend in the future and expect this discussion may emerge during the conversations at the annual Global Fact summit in Oslo, Norway, next week. 

The Duke Reporters’ Lab began keeping track of the international fact-checking community in 2014, when it organized a group of about 50 people who gathered in London for what became the first Global Fact meeting. This year about 10 times that many people – 500 journalists, technologists, truth advocates and academics – are expected to attend the ninth summit. The conferences are now organized by the International Fact-Checking Network, based at the Poynter Institute in St. Petersburg, Florida. This will be the group’s first large in-person meeting in three years.

Fact-Checkers by Continent


Like their audiences, the fact-checkers are a multilingual community, and many of these sites publish their findings in multiple languages, either on the same site or in some cases on alternate sites. English is the most common, used on at least 166 sites, followed by Spanish (55), French (36), Arabic (14), Portuguese (13), Korean (13), German (12) and Hindi (11).

About 60% of the fact-checkers are affiliated with media organizations (226 out of 378). But there are other affiliations and business models too, including 24 with academic ties and 45 that are part of a larger nonprofit or non-governmental organization. Some of these fact-checkers have overlapping arrangements with multiple organizations. More than a fifth of the community (86 out of 378) operate independently.

About the census: 

Here’s how we decide which fact-checkers to include in the Reporters’ Lab database. The Lab continually collects new information about the fact-checkers it identifies, such as when they launched and how long they last. That’s why the updated numbers for earlier years in this report are higher than the counts the Lab included in earlier reports. If you have questions, updates or additions, please contact Mark Stencel, Erica Ryan or Joel Luther.

Related links: Previous fact-checking census reports

April 2014

January 2015

February 2016

February 2017

February 2018

June 2019

June 2020

June 2021


U.S. fact-checkers gear up for 2020 campaign

With the U.S. election now less than a year away, at least four-dozen American fact-checking projects plan to keep tabs on claims by candidates and their supporters – and a majority of those fact-checkers won’t be focused on the presidential campaign.

The 50 active U.S. fact-checking projects are included in the latest Reporters’ Lab tally of global fact-checking, which now shows 226 sites in 73 countries. More details about the global growth below.

Of the 50 U.S. projects, about a third (16) are nationally focused. That includes independent fact-checkers such as FactCheck.org, PolitiFact and Snopes, as well as major news media efforts, including the Associated Press, The Washington Post, CNN and The New York Times. There also are a handful of fact-checkers that are less politically focused. They concentrate on global misinformation or specific topic areas, from science to gossip.

At least 31 others are state and locally minded fact-checkers spread across 20 states. Of that 31, 11 are PolitiFact’s state-level media partners. A new addition to that group is WRAL-TV in North Carolina — a commercial TV station that took over the PolitiFact franchise in its state from The News & Observer, a McClatchy-owned newspaper based in Raleigh. Beyond North Carolina, PolitiFact has active local affiliates in California, Florida, Illinois, Missouri, New York, Texas, Vermont, Virginia, West Virginia and Wisconsin.

The News & Observer has not abandoned fact-checking. It launched a new statewide initiative of its own — this time without PolitiFact’s trademarked Truth-O-Meter or a similar rating system for the statements it checks. “We’ll provide a highly informed assessment about the relative truth of the claims, rather than a static rating or ranking,” The N&O’s editors said in an article announcing its new project.

Among the 20 U.S. state and local fact-checkers that are not PolitiFact partners, at least 13 use some kind of rating system.

Of all the state and local fact-checkers, 11 are affiliated with TV stations — like WRAL, which had its own fact-checking service before it joined forces with PolitiFact this month. Another 11 are affiliated with newspapers or magazines. Five are local digital media startups and two are public radio stations. There are also a handful of projects based in academic journalism programs. 

One example of a local digital startup is Mississippi Today, a non-profit state news service that launched a fact-checking page for last year’s election. It is among the projects we have added to our database over the past month.

We should note that some of these fact-checkers hibernate between election cycles. Seasonal fact-checkers with long track records over multiple election cycles remain active in our database. Some have done this kind of reporting for years. For instance, WISC-TV in Madison, Wisconsin, has been fact-checking since 2004 — three years before PolitiFact, The Washington Post and AP got into the business.

One of the hardest fact-checking efforts for us to quantify is run by corporate media giant TEGNA Inc., which operates nearly 50 stations across the country. Its “Verify” segments began as a pilot project at WFAA-TV in the Dallas area in 2016. Now each station produces its own versions for its local TV and online audience. The topics are usually suggested by viewers, with local reporters often fact-checking political statements or debunking local hoaxes and rumors.

A reporter at WCNC-TV in Charlotte, North Carolina, also produces national segments that are distributed for use by any of the company’s other stations. We’ve added TEGNA’s “Verify” to our database as a single entry, but we may also add individual stations as we determine which ones do the kind of fact-checking we are trying to count. (Here’s how we decide which fact-checkers to include.)

A Global Movement

As for the global picture, the Reporters’ Lab is now up to 226 active fact-checking projects around the world — up from 210 in October, when our count went over 200 for the first time. That is more than five times the number we first counted in 2014. It’s also more than double a retroactive count for that same year – a number that was based on the actual start dates of all the fact-checking projects we’ve added to the database over the past five years (see footnote to our most recent annual census for details).

The growth of Agence France-Presse’s work as part of Facebook’s third-party-fact checking partnership is a big factor. After adding a slew of AFP bureaus with dedicated fact-checkers to our database last month, we added many more — including Argentina, Brazil, Colombia, Mexico, Poland, Lebanon, Singapore, Spain, Thailand and Uruguay. We now count 22 individual AFP bureaus, all started since 2018.

Other recent additions to the database involved several established fact-checkers, including PesaCheck, which launched in Kenya in 2016. Since then it’s added bureaus in Tanzania in 2017 and Uganda in 2018 — both of which are now in our database. We added Da Begad, a volunteer effort based in Egypt that has focused on social media hoaxes and misinformation since 2013. And there’s a relative newcomer too: Re:Check, a Latvian project that’s affiliated with a non-profit investigative center called Re:Baltica. It launched over the summer. 

Peru’s OjoBiónico is back on our active list. It resumed fact-checking last year after a two-year hiatus. OjoBiónico is a section of OjoPúblico, a digital news service that focuses on investigative reporting.

We already have other fact-checkers we plan to add to our database over the coming weeks. If there’s a fact-checker you know about that we need to update or add to our map, please contact Joel Luther at the Reporters’ Lab.


Number of fact-checking outlets surges to 188 in more than 60 countries

The number of fact-checking outlets around the world has grown to 188 in more than 60 countries amid global concerns about the spread of misinformation, according to the latest tally by the Duke Reporters’ Lab.

Since the last annual fact-checking census in February 2018, we’ve added 39 more outlets that actively assess claims from politicians and social media, a 26% increase. The new total is also more than four times the 44 fact-checkers we counted when we launched our global database and map in 2014.

Globally, the largest growth came in Asia, which went from 22 to 35 outlets in the past year. Nine of the 27 fact-checking outlets that launched since the start of 2018 were in Asia, including six in India. Latin American fact-checking also saw a growth spurt in that same period, with two new outlets in Costa Rica, and others in Mexico, Panama and Venezuela.

The actual worldwide total is likely much higher than our current tally. That’s because more than a half-dozen of the fact-checkers we’ve added to the database since the start of 2018 began as election-related partnerships that involved the collaboration of multiple organizations. And some of those election partners are discussing ways to continue or reactivate that work – either together or on their own.

Over the past 12 months, five separate multimedia partnerships enlisted more than 60 different fact-checking organizations and other news companies to help debunk claims and verify information for voters in Mexico, Brazil, Sweden, Nigeria and the Philippines. And the Poynter Institute’s International Fact-Checking Network assembled a separate team of 19 media outlets from 13 countries to consolidate and share their reporting during the run-up to last month’s elections for the European Parliament. Our database includes each of these partnerships, along with several others – but not each of the individual partners. And because they were intentionally short-run projects, three of these big partnerships appear among the 74 inactive projects we also document in our database.

Politics isn’t the only driver for fact-checkers. Many outlets in our database are concentrating efforts on viral hoaxes and other forms of online misinformation – often in coordination with the big digital platforms on which that misinformation spreads.

We also continue to see new topic-specific fact-checkers such as Metafact in Australia and Health Feedback in France — both of which launched in 2018 to focus on claims about health and medicine for a worldwide audience.

(Here’s how we decide which fact-checkers to include in the Reporters’ Lab database.)

Fact-Checkers by Continent Since Feb. 2018

Africa: 4 to 9
Asia: 22 to 35
Australia: 3 to 5
Europe: 52 to 61
North America: 53 to 60
South America: 15 to 18

TRACKING THE GROWTH

As we’ve noted, elections are not the only draw for aspiring fact-checkers. Many outlets in our database are concentrating their efforts on viral hoaxes and other forms of online misinformation – often in coordination with the big digital platforms on which that misinformation spreads. And those platforms are also providing incentives.

In one such effort, the Reporters’ Lab worked with Google and Schema.org to develop ClaimReview, an open-source tagging system for fact-checks. Google, Microsoft’s Bing, Facebook and YouTube use this system to help identify and showcase fact-checkers’ work in their news feeds and search results – a process that generates traffic and attention for the fact-checkers. It also provides data that is powering experiments in live, real-time fact-checks that can be delivered to users automatically. (Disclosure: Google and Facebook are among the funders of the Reporters’ Lab.)
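
For the technically curious, here is a rough sketch of what ClaimReview markup can look like. The field names come from the schema.org ClaimReview type, but the outlet, claim, URL and rating below are invented for illustration, and real fact-check pages embed this data directly as JSON-LD rather than generating it with a script.

```python
import json

# A minimal, hypothetical example of ClaimReview structured data.
# Field names follow the schema.org ClaimReview type; all values are invented.
claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example-factchecker.org/checks/12345",  # page hosting the fact-check (hypothetical)
    "datePublished": "2019-06-01",
    "author": {"@type": "Organization", "name": "Example Fact-Checker"},
    "claimReviewed": "The example candidate said unemployment doubled last year.",
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Example Candidate"},
        "datePublished": "2019-05-20",
    },
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 2,                 # position on the outlet's own scale
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "Mostly False",  # the human-readable rating label
    },
}

# Publishers embed an object like this in the article page inside a
# <script type="application/ld+json"> tag so search engines can find it.
print(json.dumps(claim_review, indent=2))
```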

Another driver: Facebook. It has recruited independent fact-checking partners around the world to help identify misinformation on its platforms. The social network began that effort in late 2016 with help from the Poynter Institute’s IFCN. (Poynter is a journalism training and research center in St. Petersburg, Florida, that also is home to the U.S. fact-checking site PolitiFact.)

Meanwhile, YouTube gave fact-checking a boost in India when it started putting fact-checks at the top of YouTube search results, which helped contribute to a surge of new outlets in that country. Now India has 11 entries in our database, six of which launched since our February 2018 census. And it’s likely there are others to add in the next few weeks.

KINDS OF FACT-CHECKERS

A bit more than half of fact-checkers are part of a media company (106 of 188, or 56%). That percentage has been dropping over the past few years, mostly because of the changing business landscape for media companies in the United States. In our 2018 census, 87% of the U.S. fact-checkers were connected to a media company (41 out of 47). Now it’s 65% (39 out of 60). In other words, as the number of fact-checkers in the U.S. has grown, fewer of them have ties to those companies.

Among fact-checkers in the rest of the world, the media mix remains about half and half (67 out of 128, or 52% — very close to the 54% we saw in 2018).

The fact-checkers that are not part of a larger media organization include independent, standalone organizations, both for-profit and non-profit (the definitions of these legal and economic entities vary greatly from country to country). Some of these fact-checkers are subsidiary projects of bigger organizations that focus on civil society and political accountability. Others are affiliated with think tanks and academic institutions.

Among the recent additions is the journalism department at the University of the Philippines’ College of Mass Communication, which was the coordinator of Tsek.ph, a political fact-checking partnership mentioned earlier that also involves two other academic partners.

In the United States, we here at the Duke Reporters’ Lab joined forces last year with PolitiFact’s North Carolina partner, The News & Observer in Raleigh, to report and freely distribute fact-checks to other media across the state. Two of PolitiFact’s other recent local news partners are affiliated with academic institutions too: West Virginia University’s Reed College of Media and the University of Missouri’s journalism program. The Missouri School of Journalism also has a similar link to KOMU-TV, a local NBC affiliate in Columbia whose investigations unit did some fact-checking of its own during the 2018 midterm elections.

RATINGS

About 70% of the fact-checkers (131 of 188) have well-defined rating systems for categorizing the claims they investigate — similar to what we’ve seen in past years.

Spondeo Media’s NETO, the cartoon lie detector.

As usual, we found many of the rating systems to be entertaining. One of our new favorites comes from Spondeo Media in Mexico, which launched in December. It supplements a basic, four-point, true-to-false scale with a mascot – NETO, a cartoon lie-detector who smiles and jumps for joy with true claims but gets steamed with false ones. Another, India Today Fact Check, rated claims using a scale of one-to-three animated crows, along with a slogan in Hindi: “When you lie, the crow bites” (also the title of a popular movie: “Jhooth bole kauva kaate”).

We decided to time this year’s fact-checking census to correspond with the sixth annual GlobalFact Summit, which begins next week in Cape Town, South Africa. About 250 attendees from nearly 60 countries are expected at this year’s gathering — which is yet another measure of fact-checking’s continued growth: That’s five times the number from the first GlobalFact in London in 2014.

Joel Luther, Share the Facts Research and Outreach Coordinator at the Duke Reporters’ Lab, and former student researcher Daniela Flamini (now an intern at the Poynter Institute’s International Fact Checking Network) contributed to this report.

FOOTNOTE: ANOTHER WAY TO COUNT FACT-CHECKERS?

A challenge we have each time the Duke Reporters’ Lab conducts our annual fact-checking census is that our final tally depends so much on when we happen to discover these outlets. Our counting also depends on when fact-checkers come and go — especially short-term, election-focused projects that last several months. If a fact-checker was hard at work most of the year covering a campaign, but then closed up shop before we did our census, they’ll still be counted — but in our list of inactive projects.

That inactive list is an interesting trove of good ideas for other fact-checkers to mine. It also provides an entirely different way for us to tally fact-checkers: by counting all the projects that were active at some point during the year — not just the ones that make it to winter.

This approach might better showcase the year in fact-checking. And it also would show that fact-checking was in fact growing faster than we even thought it was.

Here’s a chart that compares the number of fact-checkers that we know were active in certain years — even the ones that ultimately closed down — with the subsequent census number for that year….

There are reasons why the Reporters’ Lab would still need to keep counting fact-checkers the way we have since 2014. For one, we need current lists and counts of serious fact-checking projects for all kinds of reasons, including academic research and the experiments that we and others want to try out with real-world fact-checkers.

And yet it’s still great to see how fast fact-checking is growing — even more than we sometimes thought.

(The small print for anyone who’s fact-checking me: The adjusted numbers shown here combine any fact-checker in our database that was active at some point during that given year. Most of our census reports were meant to count the previous year’s activity. For example our February 2018 census appears in this chart as our count of 2017 fact-checkers, even if some of those 2017 fact-checkers were only counted in last year’s census as inactive by the time the census was published. The number shown for 2018 is the 16-month 2018-19 number we are releasing in this report. You also might note that some other numbers here are slightly off from data we’ve previously shared. The main reason is that this proposed form of counting depends on having the dates that each project began and ended. In a handful of cases, we do not.)
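
For readers who want to see the alternative tally described in the footnote above more concretely, here is a minimal sketch of that counting method, assuming each project is stored with a launch year and an optional end year. The outlet names and dates are invented for illustration, not entries from the Lab’s database.

```python
from typing import List, NamedTuple, Optional

class Outlet(NamedTuple):
    name: str
    start_year: int
    end_year: Optional[int]  # None means still active

def active_in_year(outlets: List[Outlet], year: int) -> int:
    """Count outlets active at some point during `year`: launched in or
    before that year and not shut down before it."""
    return sum(
        1
        for o in outlets
        if o.start_year <= year and (o.end_year is None or o.end_year >= year)
    )

# Hypothetical entries; the real database records actual launch and end dates.
sample = [
    Outlet("Example Checker A", 2014, None),
    Outlet("Example Election Project B", 2018, 2018),  # seasonal; closed after the campaign
    Outlet("Example Checker C", 2016, 2017),
]

for y in range(2014, 2020):
    print(y, active_in_year(sample, y))
```

Counting this way, a seasonal election project that shut down before the census date still shows up in the tally for the year it was working, which is the difference the footnote describes.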


In Buenos Aires, a discussion about the impact of fact-checking

Donald Trump’s rise to the Republican nomination for president of the United States, seemingly immune to fact-checkers that debunk his false statements, has prompted a simple question about American politics: Do facts matter?

Four researchers attempted to answer this question at the Global Fact-Checking Summit in Buenos Aires during a panel discussion moderated by Alexios Mantzarlis, the director of the International Fact-Checking Network. The presenters showed evidence that fact-checking has an impact on both politicians and some voters, but they agreed that many people use fact-checks to support preexisting ideologies.

Jason Reifler, a professor of politics at the University of Exeter who specializes in fact-checking, sent letters to an experimental group of state legislators in states with PolitiFact franchises warning about fact-checking and cautioning them to make accurate claims. Reifler and Dartmouth professor Brendan Nyhan found that politicians who received letters were less likely to make false claims than politicians who did not receive a letter.

This study indicates that fact-checking matters to politicians, but it is still unclear how much it matters to voters. Reifler noted that motivated reasoning and selective exposure often cloud voters’ opinions of fact-checking.

“People will go to media and media sources that are more congenial to what they want to hear,” Reifler said. “When people encounter information, if they have a directional goal, they want to try and be consistent with it. They want to maintain their ideological priors and they want to maintain their political preferences.”

In a separate study focused on voters, Reifler showed that people pay attention to fact-checks but are more likely to read the ones that refute the politicians they oppose. He presented participants with the option to read either two fact-checks by Pagella Politica (one against a politician from the left or the right) and an unrelated article, or two articles unrelated to politics and fact-checking.

The panel included Jason Reifler of the University of Exeter, moderator Alexios Mantzarlis, Eugenia Mitchelstein of Universidad de San Andrés in Argentina and Leticia Bode of Georgetown University.

Forty-three percent of respondents chose to read both fact-checks and 83 percent read at least one, but of the 40 percent that only read one along with an unrelated article, the majority chose the fact-check that criticized a politician they opposed.

Leticia Bode, a Georgetown University professor specializing in misinformation and social media, and Eugenia Mitchelstein, a researcher at the Universidad de San Andrés in Argentina, agreed that confirmation bias plays a major role in how consumers approach falsehoods, but both presenters noted that fact-checking sometimes changes their minds.

Bode’s research tested responses to inaccurate information on Facebook and whether links to related stories and comments are effective in correcting users who believe misinformation. Among people that are less prone to believe conspiracies, seeing a headline from at least one reputable fact-checking source usually made them change their minds and believe the truth. But comments by other users contesting false claims without evidence did not have an effect.

“If you correct without sources, people don’t care at all,” Bode said. “If you are talking to your friends on Facebook who are posting, make sure you include a source.”

That corrective source for Argentinians is often Chequeado, the highly regarded fact-checking site, as Mitchelstein demonstrated with a survey of people who casually followed politics. Many respondents said people cherry-pick the data they want to believe from Chequeado, but there was still a consensus that the site plays an important role in Argentine politics.

“In Argentina, Chequeado is synonymous with fact-checking,” Mitchelstein said. “They became like the arbiter of truth, and I think it’s a great thing.”

Although fact-checkers receive more attention during campaigns, many still struggle to drive traffic to their sites. Chris Blow from Meedan, which builds digital tools for journalism, provided recommendations for how fact-checkers can make articles more visually appealing and persuasive.

Blow lauded Animal Politico for its engaging graphics, which rate statements with dog illustrations inspired by the site’s name, “El Sabueso,” or “The Hound.” He also praised Africa Check and Les Observateurs, a French site, for showing clear ratings on their Twitter posts to make sure readers knew the conclusions. Blow also critiqued posts from other publications that he said may bore or confuse readers due to too much text or a misleading placement of the rating.

 


Global fact-checking up 50% in past year

The high volume of political truth-twisting is driving demand for political fact-checkers around the world, with the number of fact-checking sites up 50 percent since last year.

The Duke Reporters’ Lab annual census of international fact-checking currently counts 96 active projects in 37 countries. That’s up from 64 active fact-checkers in the 2015 count. (Map and List)

Active Fact-checkers 2016

A bumper crop of new fact-checkers across the Western Hemisphere helped increase the ranks of journalists and government watchdogs who verify the accuracy of public statements and track political promises. The new sites include 14 in the United States, two in Canada as well as seven additional fact-checkers in Latin America. There also were new projects in 10 other countries, from North Africa to Central Europe to East Asia.

With this dramatic growth, politicians in at least nine countries will have their statements scrutinized before their voters go to the polls for national elections this year. (In 2015, fact-checkers were on the beat for national elections in 11 countries.)

Active fact-checkers by continent in our latest tally:
Africa: 5
Asia: 7
Australia: 2
Europe: 27
North America: 47
South America: 8

More than a third of the currently active fact-checkers (33 of 96) launched in 2015 or even in the first weeks of 2016.

The Reporters’ Lab also keeps tabs on inactive fact-checking ventures, which currently number 47. Some of them assure us they are in suspended animation between election cycles — a regular pattern that keeps the fact-checking tally in continuous flux. At least a few inactive fact-checkers in the United States have been “seasonal” projects in past elections. The Reporters’ Lab regularly updates the database, so the tallies reported here are all as of Feb. 15, 2016.

Growing Competition

U.S. fact-checkers dominate the Reporters’ Lab list, with 41 active projects. Of these, three-quarters (30 of 41) are focused on the statements of candidates and government officials working at the state and local level. And 15 of those are among the local media organizations that have joined an expanding network of state affiliates of PolitiFact, the Pulitzer Prize-winning venture started nine years ago by the Tampa Bay Times in St. Petersburg, Florida.

(Editor’s Note: PolitiFact founder Bill Adair is a Duke professor who oversees the Reporters’ Lab work. The Lab is part of the DeWitt Wallace Center for Media & Democracy at Duke’s Sanford School of Public Policy.)

In the past year, PolitiFact’s newspaper and local broadcast partners have launched new regional sites in six states (Arizona, California, Colorado, Iowa, Missouri and Nevada) and reactivated a dormant one in a seventh state (Ohio).

In some cases, those new fact-checkers are entering competitive markets. So far this election year, at least seven U.S. states have more than one regional fact-checker and in California there are three.

With the presidential campaign underway, competition also is increasing at the national level, where longstanding fact-checkers such as FactCheck.org, PolitiFact and the Washington Post Fact Checker now regularly square off with at least eight teams of journalists who are systematically scrutinizing the candidates’ words. And with more and more newsrooms joining in, especially on debate nights, we will be adding to that list before the pixels dry on this blog post.

Competition is on the rise around the world, too. In 10 other countries, voters have more than one active fact-checker to consult.

The tally by country:
U.S.: 41
France: 5
U.K.: 4
Brazil: 3
Canada: 3
South Korea: 3
Spain: 3
Argentina: 2
Australia: 2
Tunisia: 2*
Ukraine: 2

* One organization in Tunisia maintains two sites that track political promises (a third site operated by the same group is inactive).

The growing numbers have even spawned a new global association, the International Fact-Checking Network hosted by the Poynter Institute, a media training center in St. Petersburg, Florida.

Promises, Promises

Some of the growth has come in the form of promise-tracking. Since January 2015, fact-checkers launched six sites in five countries devoted to tracking the status of pledges candidates and party leaders made in political campaigns. In Tunisia, there are two new sites dedicated to promise-tracking — one devoted to the country’s president and the other to its prime minister.

There are another 20 active fact-checkers elsewhere that track promises, either as their primary mission or as part of a broader portfolio of political verification. Added together, more than a quarter of the active fact-checkers (26 of 96, including nine in the United States) do some form of promise-tracking.

The Media Is the Mainstream — Especially in the U.S.

Nearly two-thirds of the active fact-checkers (61 of 96, or 64 percent) are directly affiliated with a news organization. However, this breakdown reflects the dominant business structure in the United States, where 90 percent of fact-checkers are part of a news organization. That includes nine of 11 national projects and 28 of 30 state/local fact-checkers.

Media Affiliations of 41 Active U.S. Fact-Checkers
Newspaper: 18
TV: 10
TV + Newspaper: 1
Radio: 3
Digital: 3
Student Newspaper: 1
Not Affiliated: 4

The story is different outside the United States, where less than half of the active fact-checking projects (24 of 55, or 44 percent) are affiliated with news organizations.

The other fact-checkers are typically associated with non-governmental, non-profit and activist groups focused on civic engagement, government transparency and accountability. A handful are partisan, especially in conflict zones and in countries where the lines between independent media, activists and opposition parties are often blurry and where those groups are aligned against state-controlled media or other governmental and partisan entities.

Many of the fact-checkers that are not affiliated with news organizations have journalists on their staff or partner with professional news outlets to distribute their content.

All About Ratings

More than three out of four active U.S. fact-checkers (33 of 41, or 81 percent) use rating systems, including scales that range from true to false or rating devices, such as the Washington Post’s “Pinocchios.” That pattern is consistent globally, where 76 of 96, or 79 percent, use ratings.

This report is based on research compiled in part by Reporters’ Lab student researchers Jillian Apel, Julia Donheiser and Shaker Samman. Alexios Mantzarlis of the Poynter Institute’s International Fact-Checking Network (and a former managing editor of the Italian fact-checking site Pagella Politica) also contributed to this report, as did Reporters’ Lab director Bill Adair, Knight Professor for the Practice of Journalism and Public Policy at Duke University (and founder of PolitiFact).

Please send updates and additions to Reporters’ Lab co-director Mark Stencel (mark.stencel@duke.edu).


Snapshot of fact-checking around the world, July 2015

Fact-checking continues to grow around the world.

As we convene the second annual Global Summit of Fact-Checking in London this week, there are now 64 active sites, up from 44 a year ago.

Here’s a snapshot of the latest numbers from the Duke Reporters’ Lab database. Last year’s numbers are in parentheses.

  • Active fact-checking sites: 64 (44)
  • Total sites that have been active in past few years*: 102 (59)
  • Sites that are affiliated with news organizations: 63 percent
  • Percentage of sites that use rating systems such as meters or labels: 80 (70)
  • Number of active sites that track politicians’ campaign promises: 21 of 64

*Some sites have been active only for elections or have been suspended because of lack of funding. We still include the dormant sites in our database because they often resume operation.


Canadian Press fact-checkers find politicians full of baloney

While many fact-checkers around the world rate the accuracy of statements on a true-to-false scale, the team at the Canadian Press rates them by their value in meat.

The Canadian Press Baloney Meter is the world’s only sausage-based rating system, a lighthearted scale that goes from No Baloney (true) to Full of Baloney (false). The scale is inspired by the old saying that someone telling a lie is “full of baloney.”

“It’s kind of a throwback,” said Canadian Press Ottawa Bureau Chief Heather Scoffield, but the ratings don’t mean the work is frivolous. The Canadian Press fact-checks explore important topics and are backed by thorough research.

A little baloney (from Canadian Press)
“A little baloney” on the Baloney Meter means “the statement is mostly accurate but more information is required.”

The Baloney Meter can be “silly, but the piece itself is the furthest thing from being silly,” Scoffield said.

Scoffield launched the fact-checking service last spring after months of deliberation. The fact-checks started just in time for Ontario’s general election last June.

The provincial race was something of a practice round for the Canadian federal election this fall. Scoffield said she plans to increase the number of fact-checks as the election nears.

“There’s enough baloney out there that we could ramp up,” she said.

Currently, most Baloney Meter fact-checks examine statements by officials in the federal government, but Scoffield said the focus will shift to political parties during the election.

Unlike most fact-checking efforts, the Baloney Meter has no dedicated staff or website. The checks are done by reporters for the wire service and the content is sent to subscribing news organizations for them to use in print and online. While readers can’t directly search for every meter ranking at a centralized location, the broad reach of the wire service gives the fact-checks wide exposure.

The biggest challenge for the Canadian fact-checkers has been the difficulty getting public data.

“This government is not known for being open,” Scoffield said. “It places a limitation on us for what we can actually fact check. We choose our topics accordingly.”

A lot of baloney (from Canadian Press)
“A lot of baloney” on the Baloney Meter means the statement “is mostly inaccurate but contains elements of truth.”

When fact-checkers determine there is inadequate information, they use the Baloney Meter’s “Some baloney” rating. Scoffield said the Baloney Meter has earned a good reputation in the Canadian government. She said that politicians like the attention to the substance of the policy rather than the theatrics surrounding it.

“Even if they [politicians] don’t come out looking great, they appreciate that we’re talking about the substance of it,” Scoffield said.

With that kind of impact comes a great deal of responsibility. Scoffield said she knows that even the slightest slip up could lead to criticism.

“You have to do it well, or you lose your credibility,” Scoffield said. “We absolutely can’t take sides. We have to deal strictly with the facts.”


Poligraph: Building a fact-checking brand in Minnesota

Catharine Richert’s boss once told her that she had the hardest job in the newsroom.

As the sole reporter working on Poligraph, Minnesota Public Radio’s fact-checking feature, Richert investigates claims made by state politicians and rates them Accurate, Misleading, Inconclusive or False. She publishes her fact-checks on the MPR website and discusses her fact-checks on the air Friday afternoons.

Five years after Richert started it, Poligraph has become a well-known part of MPR’s political coverage. Although refereeing Minnesota’s often sharp-elbowed politics is no easy task, Richert has managed to make Poligraph a success.

“MPR has been able to build a very specific brand around what we do that’s very recognizable to our audience,” she said.

Despite the limitations of running a one-woman show, Richert believes that being the single voice gives her credibility and consistency on the radio.

Catharine Richert

“I think with radio that one single voice reporting on something is all that much more important.”

Poligraph began as a joint initiative between MPR and the Humphrey School of Public Affairs at the University of Minnesota in 2010. Richert, a grad student at the Humphrey School at the time, worked for Poligraph part-time while in school. Her previous experience working for PolitiFact in Washington, D.C. helped prepare her for the job. When she graduated in May 2011, MPR offered her a full-time position.

MPR’s affiliation with the Humphrey school ended, but Richert kept the feature going.

To determine which claims to check each week, Richert discusses possibilities with her editor. Their most important criterion is that the claim was in the news that week.

“Other than that, we fact-check things that make us curious,” she said. “Most weeks, we try to check one Republican and one Democrat, and we’re pretty strict about that.”

Although the three other reporters on the MPR politics team keep their eyes open for ideas, Richert and her editor are the primary contributors.

They began with three ratings — Accurate, False and Inconclusive — and added Misleading.

She said that Poligraph also started incorporating its sourcing directly into the story, instead of listing it at the end, and that she has fine-tuned her radio appearances.

“I think we’ve gotten a lot better about being clear and concise on the air and just hitting the top things people need to know,” she said.

Richert said that fact-checking in Minnesota is different than at the national level because she can have more impact.

“Occasionally, people will just stop using a talking point after we do what we do,” she said. “It happens a little more often here than it did when I was working in Washington.”

She has found that politicians in Minnesota are more responsive to fact-checkers than the politicians she dealt with in Washington while working for PolitiFact.

“People here are far more willing to be transparent about where they’re getting their information,” she said. “It’s rare when someone doesn’t respond to an email.”

Richert noted that Minnesotans are especially engaged in politics and want to hold their politicians accountable.

“People are really interested in policies,” she said. “They want to know the details behind some of the things that people say.”

Richert said that most of the reaction to Poligraph has been positive and that people enjoy the feature on the radio.

“I certainly get my share of angry emails,” she said, “but I think that at the end of the day, people appreciate being more well-versed in what the facts are whether they agree with them or not.”

Richert said the success of Poligraph shows that it doesn’t take a giant staff to hold politicians accountable.

“You don’t have to have this elaborate set-up to fact-check,” she said. “You can simply do it through reporting — and that’s what all reporters should be doing.”
