Press "Enter" to skip to content


Vast gaps in fact-checking across the U.S. allow politicians to elude scrutiny

Click here to read the full report.

The candidates running last year for an open seat in Ohio’s 13th Congressional District exchanged a relentless barrage of scathing claims, counterclaims and counter-counterclaims.

Emilia Sykes was a former Democratic leader in the state legislature who came from a prominent political family. Her opponent called Sykes a lying, liberal career politician who raised her own pay, increased taxes on gas and retirement accounts, and took money from Medicare funds to “pay for free healthcare for illegals.” Other attack ads warned voters that the Democrat backed legislation that would release dangerous criminals from jail.1

Sykes’ opponent, Republican Madison Gesiotto Gilbert, was an attorney, a former Miss Ohio, and a prominent supporter of former President Donald Trump. Sykes and her backers called Gilbert a liar who would “push for tax cuts for millionaires” and slash Social Security and Medicare. Gilbert backed a total abortion ban with no exceptions, they warned (“not even if the rape victim is a 10 year old girl”), and she had the support of political groups that aim to “outlaw birth control.”2

Voters in one of the country’s most contested U.S. House races heard those allegations over and over — in TV ads, social media posts and from the candidates themselves.

But were any of those statements and allegations true? Who knows?

Ohio was one of 25 states where no statewide or local media outlet consistently fact-checked political statements. So voters in the 13th District were on their own to sort out the truth and the lies. 

But their experience was not unique. Throughout the country, few politicians had to worry about being held accountable for exaggerations or lies in ads or other claims during the campaign. 

An extensive review by the Duke Reporters’ Lab of candidates and races that were fact-checked found that only a small percentage of politicians and public officials were held accountable for the accuracy of what they said.

The results were striking.

Governors were the most likely elected officials to face review by fact-checkers at the state and local level. But still fewer than half of the governors had even a single statement checked (19 out of 50).

For those serving in Congress, the chances of being checked were even lower. Only 33 of 435 U.S. representatives (8%) were checked. In the U.S. Senate, a mere 16 of 100 lawmakers were checked by their home state news media.

The smaller the office, the smaller the chance of being checked. Of the lawmakers holding the country’s 7,386 state legislative seats, just 47 were checked (0.6%). And among the more than 1,400 U.S. mayors of cities of 30,000 people or more, just seven were checked (0.5%).

These results build on an earlier Reporters’ Lab report3 immediately after the election, which showed vast geographical gaps in fact-checking at the state and local level. Voters in these “fact deserts” have few, if any, ways to keep up with misleading political claims on TV and social media. Nor can they easily hold public officials and institutions accountable for any inaccuracies and disinformation they spread.

A color-coded map showing which states have active local fact-checking projects.

Longstanding national fact-checking projects fill in some of the gaps. FactCheck.org, PolitiFact, The Washington Post, and the Associated Press sometimes focus on high-profile races at the state and local level. They and other national media outlets also monitor the statements of prominent state-level politicians who have their eyes fixed on higher offices — such as the White House.

But our review of the 2022 election finds that the legacy fact-checking groups have not scaled to the vast size and scope of the American political system. Voters need more fact-checks, on more politicians, more quickly. And fact-checkers need to develop more robust and creative ways to distribute and showcase those findings.

We found big gaps in coverage, but also opportunities for some relatively easy collaborations. Politicians and campaigns repeatedly use the same lines and talking points. Fact-checkers sometimes cite each other’s work when the same claims pop up in other places and other mouths. But there’s relatively little organized collaboration among fact-checkers to quickly respond to recycled claims. Collaborative projects in the international fact-checking community offer potential templates. Technology investments would help, too.

Who’s Getting Fact-Checked?

To examine the state of regional fact-checking, the Duke Reporters’ Lab identified 50 active and locally focused fact-checking projects from 25 states and the District of Columbia.4 That count was little changed from the national election years since 2016, when an average of 46 fact-checking projects were active at the state and local level.

The fact-checking came from a mix of TV news stations, newspaper companies, digital media sites and services, and two public radio stations. PolitiFact’s state news affiliates also include two university partnerships, one of them a student newspaper. (See the full report for a complete list and descriptions.)

Active Local Fact-Checking Outlets by Year

A bar chart showing growth in local fact-checking.

Journalists from those news organizations cranked out 976 fact-checks, verifying the accuracy of more than 1,300 claims from Jan. 1, 2022, to Election Day. 

But thousands more claims went unchecked. That became clear when we began to determine who was getting fact-checked.

As part of our research, we reviewed the fact-checkers’ output in text, video and audio formats. We defined a “claim” as a statement or image that served as the basis of a news report analyzing its accuracy against reliable evidence. That included a mix of political statements as well as other kinds of fact-checks — such as local issues, social trends and health concerns.

We excluded explanatory stories that did not analyze a specific claim or reach a conclusion. Of the more than 970 fact-checks we reviewed, about 13% examined multiple claims.

The Reporters’ Lab found that a vast majority of politicians at the state and local level elude the fact-checking process, from city council to statewide office. But elected officials and candidates in some places got more scrutiny than others. 

Some interesting findings:

The most-checked politician was Iowa Gov. Kim Reynolds, a Republican. Reynolds topped the list with 28 claims checked, largely because of two in-depth articles from the Gazette Fact Checker in Cedar Rapids, which covered 10 claims from her Condition of the State address in January 2022, and another 10 from her delivery of the Republican response to President Joe Biden’s State of the Union in March.

Other more frequently checked politicians included Michigan gubernatorial challenger Tudor Dixon, a Republican (18); Cindy Axne, a Democrat who lost her bid for reelection to a U.S. House seat in Iowa (16); and incumbent U.S. Sen. Ron Johnson of Wisconsin, a Republican (16).

Also near the top of the list were former President Trump, a Republican (15), who was sometimes checked on claims during local appearances; Michigan Gov. Gretchen Whitmer, a Democrat (15); Wisconsin Gov. Tony Evers, a Democrat (15); Evers’ Republican challenger Tim Michels (14); Arizona gubernatorial candidate Kari Lake, a Republican (13); and Florida’s Republican Gov. Ron DeSantis (12).

Most-Checked Politicians

A bar chart showing which politicians are most fact-checked.

Overall, individual claims by sitting governors were checked 130 times (10% of claims); by U.S. representatives 96 times (7%); by state legislators 77 times (6%); by U.S. senators 61 times (5%); and by mayors 11 times (1%).

Most-Checked Politicians By Office Held

A bar chart showing the distribution of fact-checks by office held.

For comparison, President Joe Biden’s claims were checked more than 100 times by national fact-checkers from PolitiFact, The Washington Post and others.

While these numbers focus on direct checking of the politicians themselves, fact-checkers also analyzed claims by other partisan sources, including deep-pocketed political organizations running attack ads in many races.

There was more checking of Republican/conservative politicians and political groups (553 claims, or 42%) than Democratic/progressive politicians and groups (382 claims, or 29%). If we look strictly at the 942 claims from claimants we identified as political, 59% were Republican/conservative and 41% were Democratic/progressive.


Read the full report here.

Our Recommendations

Fact-checking is a challenging type of journalism. It requires speed, meticulous research and a thick skin. It also requires a willingness to call things as they are, instead of hiding behind the misleading niceties of both-siderism. And yet, over the past decade, dozens of state and local news organizations have adopted this new type of journalism. 

The 50 fact-checking programs we examined during last year’s midterm election invested time, energy and money to combat political falsehoods and push back against other types of misinformation. Even at a time of upheaval in the local news business, we have seen TV news stations, newspaper companies, and nonprofit newsrooms embrace this mission.

But all this work is not enough. 

Misinformation and disinformation spread far, fast and at a scale that makes it almost impossible for news media fact-checkers to keep pace. If journalists aim to reestablish a common set of facts, we need to do more fact-checking.

Our recommendations for dramatically increasing local media’s capacity for fact-checking include: 

Invest in more fact-checking 

The challenge: Despite the diligent work of local fact-checking outlets in 25 states and the District of Columbia, only a relative handful of politicians and public officials were ever fact-checked. And in half the country, there was no active fact-checking at all.

The recommendation: It is clear that an investment in this vital journalism is sorely needed. Voters in “fact desert” states like Ohio and New Hampshire will be key to the 2024 elections. And those voters should be able to trust in local journalism to provide a check on the lies that politicians are sure to peddle in political ads, debates and other campaign events.

Even in states where local fact-checking efforts exist, they are severely outmatched by a tsunami of claims, as political organizations pump billions of dollars into campaign ads, and social media messages accelerate the spread of misinformation far and wide. The low numbers of claims checked locally in the 2022 Senate races in Arizona, Georgia, Nevada and Pennsylvania demonstrate that the journalists trying to keep up with the campaign cycle need additional staffing and financial resources.

One way to increase the volume of local fact-checking would be to incentivize projects like Gigafact and PolitiFact. These existing models can be replicated by other organizations and expanded into additional states. The Gigafact partners in Arizona, Nevada and Wisconsin produced dozens of 140-word “fact briefs” in the run-up to the 2022 election. These structured fact-checks, which answer yes/no questions, have proved popular with audiences. Dee J. Hall, managing editor at Wisconsin Watch, which participated in the Gigafact pilot in 2022, reported that eight of the organization’s ten most popular stories in November were fact briefs.

The journalism education community can also help. During the 2022 election, PolitiFact worked with the journalism department at West Virginia University and the student newspaper at the University of Iowa to produce fact-checks for voters in their states. Expanding that model, potentially in collaboration with other national fact-checkers, could transform most of the barren “fact deserts” we’ve described in time for the 2024 general election campaign. 

Elevate fact-checking

The challenge: Fact-checking is still a niche form of reporting. It shares DNA with explanatory and investigative journalism. But it is rarely discussed at major news media conferences. There are few forums for fact-checkers at the state and local level to compare their efforts, learn from one another and focus on their distinctive reporting problems. 

The recommendation: As we continue increasing the volume of local fact-checking, audiences and potential funders need to view fact-checking with the same importance as investigative work. Investigative reporting has been a cornerstone of local news outlets’ identity and public service mission for decades. Fact-checking should be equally revered. Both are vital forms of journalism that are closely related to each other.

Some local news outlets already take this approach, with their investigative teams also producing fact-checking of claims. For example, 4 Investigates Fact Check at KOB-TV in New Mexico is an offshoot of its 4 Investigates team, and FactFinder 12 Fact Check at KWCH-TV in Kansas uses a similar model.

Fact-checkers also can elevate their work by explaining it more forcefully — on-air, online and even in person. This is an essential way to promote trust in their work. We found that 17 state and local fact-checking efforts do not provide any explanation of their process or methodology to their audiences. Offering this kind of basic guidance does not require creating and maintaining separate dedicated “about” or methodology pages. Instead, some fact-checkers, such as ConnectiFact and the Gazette Fact Checker in Iowa, embed explanations directly within their fact-checks. In this mobile era, that in-line approach might well be more important. Likewise, as TV continues to play an increasing role in fact-checking, broadcasters also need to help their viewers understand what they’re seeing.

Embrace technology and collaborate

The challenge: Several national fact-checkers in the United States work closely with the Reporters’ Lab, as well as other academic researchers and independent developers, to test new approaches to their work. We’ve seen that same spirit of community in the International Fact-Checking Network at the Poynter Institute, which has fostered cross-border collaborations and technology initiatives. In contrast, few state and local news organizations in the United States have the capacity or technological know-how to experiment on their own. Fact-checking also has a low profile in journalism’s investigative and tech circles.

The recommendation: There is a critical need for more investment in technology to assist fact-checkers at the state and local level. As bad actors push misinformation on social media and politicians take advantage of new technologies to mislead voters, an equal effort must be made to boost the truth.

AI can help track the spread of misinformation by spotting repetitions of false talking points as they catch on and circulate around the country. A talking point tracker could help fact-checkers prioritize and respond to false claims that have already been fact-checked.
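As a rough illustration of the idea — not a description of any existing tool — the sketch below matches an incoming statement against a small archive of previously checked claims using simple text similarity from Python’s standard library. The claim texts and threshold are invented, and a production tracker would rely on more robust semantic matching.

```python
# Minimal sketch of a "talking point tracker": match a new political claim
# against previously fact-checked claims using simple text similarity.
# Illustration only; a real system would use embedding-based matching.
from difflib import SequenceMatcher

# Hypothetical archive of claims that fact-checkers have already reviewed.
CHECKED_CLAIMS = [
    "My opponent voted to raise taxes on gas and retirement accounts.",
    "The senator's plan would slash Social Security and Medicare.",
    "The governor took money from Medicare to pay for other programs.",
]

def find_similar_checks(new_claim: str, threshold: float = 0.6) -> list[tuple[float, str]]:
    """Return previously checked claims that closely resemble the new one."""
    matches = []
    for old_claim in CHECKED_CLAIMS:
        score = SequenceMatcher(None, new_claim.lower(), old_claim.lower()).ratio()
        if score >= threshold:
            matches.append((score, old_claim))
    return sorted(matches, reverse=True)

if __name__ == "__main__":
    incoming = "Her plan would slash Social Security and Medicare."
    for score, claim in find_similar_checks(incoming):
        print(f"{score:.2f}  {claim}")
```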

AI can also be leveraged to help with the debunking of false claims. Once a repeated talking point has been identified, a system using AI could then create the building blocks of a fact-check that a journalist could review and publish.

But none of these ideas will get very far unless journalists are willing to collaborate. Collaboration can cut down on duplication and allow more effort to be spent on fact-checking new claims. The use of technology would also have a greater impact if more organizations were willing to swap data and make use of each other’s research.

Make fact-checking easier to find

The challenge: Fact-checking in the United States has grown significantly since 2017. But fact-checks are still easy to miss on cluttered digital news feeds. Existing technology can help fact-checkers raise their profiles. But some state and local fact-checks don’t even have basic features that call attention to their reporting.

The recommendation: Nearly 180 fact-checking projects across the United States and around the world have embraced open-source systems designed to provide data that elevates their work in search results and on large social media and messaging services. State and local fact-checkers should adopt these systems as well.

The Reporters’ Lab joined with Google and Schema.org to develop a tagging system called ClaimReview. ClaimReview provides data that major digital platforms can use to recognize and suppress misinformation on their feeds. A second, related schema called MediaReview is generating similar data for visual misinformation. 
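For publishers wondering what adoption looks like in practice, here is a minimal sketch of the ClaimReview structured data a fact-check page might embed as JSON-LD. The outlet, URLs, claim text and rating below are hypothetical placeholders; Schema.org’s ClaimReview documentation defines the full set of properties.

```python
# Minimal sketch of ClaimReview structured data, emitted as JSON-LD.
# The fact-check URL, publisher, claim and rating are hypothetical
# placeholders, not real published fact-checks.
import json

claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example-factchecker.org/checks/road-funding-claim",  # hypothetical
    "datePublished": "2022-10-15",
    "claimReviewed": "The state cut road funding by 40% last year.",  # hypothetical
    "author": {"@type": "Organization", "name": "Example Fact Checker"},
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Example Candidate"},
        "datePublished": "2022-10-01",
    },
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": 2,
        "bestRating": 5,
        "worstRating": 1,
        "alternateName": "Mostly False",
    },
}

# Embed the output in the fact-check page inside a
# <script type="application/ld+json"> tag so platforms can read it.
print(json.dumps(claim_review, indent=2))
```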

ClaimReview has helped feed a prominent collection of recent fact-checks on the front of the Google News page in half a dozen countries, including the U.S. But so far, most state and local fact-checking projects are not using ClaimReview. 

Meanwhile, the regional fact-checkers have even more foundational work to do. It is disappointing that more than a quarter of the active fact-checkers (13 of 50) have no dedicated page or tag the public can use to find these stories. Inflexible publishing systems often make simple things hard. But all fact-checkers need to do more to showcase their work. Fact-checks have a long shelf life and enormous value to their audiences.


This project was a team effort. The report was written and led by Mark Stencel, co-director of the Duke Reporters’ Lab, and project manager Erica Ryan. Student researchers Sofia Bliss-Carrascosa and Belén Bricchi were significant contributors, as was Joel Luther, research and outreach coordinator for ClaimReview and MediaReview at the Reporters’ Lab.

Here’s how we decide which fact-checkers to include in the Reporters’ Lab database. The Lab continually collects new information about the fact-checkers it identifies, such as when they launched and how long they last. If you know of a fact-checking project we have missed, please contact the Reporters’ Lab.

Our thanks to Knight Foundation’s journalism program for supporting this research.

Disclosure: Stencel is an unpaid contributing editor to PolitiFact North Carolina.


1 https://nrcc.org/2022/08/31/fact-check-sykes-lies-to-oh-voters-in-first-tv-ad/; https://www.youtube.com/watch?v=pCQz6FCMMzo; https://congressionalleadershipfund.org/sykes-sided-with-criminals-over-public-safety/

2 https://host2.adimpact.com/admo/viewer/a9400662-bc20-4e34-9a44-42d478efa451/; https://dccc.org/dccc-releases-new-tv-ad-in-oh-13-wrong/

3 https://reporterslab.org/fact-deserts-leave-states-vulnerable-to-election-lies/

4 After an earlier report in November 2022, our Lab identified a few more election-year fact-checking efforts. That meant our total count for the year increased from 46 to 50. And the number of states that had fact-checking efforts in that period increased from 21 to 25.

 


Local fact-checking is hard to find when voters need it most

A lot of good fact-checking took place last year at the local level. But good luck finding it.

Regional fact-checkers are not using basic digital publishing practices — such as landing pages, tagging and social media — to promote their fact-checks, according to a report co-authored by Duke Reporters’ Lab co-director Mark Stencel and research coordinator Rebecca Iannucci.

The report, published by the Poynter Institute on Oct. 16, was derived from work by the Lab’s student researchers, who reviewed nearly 40 regional media outlets that fact-checked political claims during last year’s election cycle.

One of those outlets, The Topeka Capital-Journal, did have a landing page for its “Kansas Fact Meter.” But the landing page was inactive and did not showcase the majority of its fact-checks dating back to 2014, or the ones it published after January 2016.

Tim Carpenter, the Capital-Journal’s Statehouse Bureau Chief and the founder of the Kansas Fact Meter, said he did not know the fact-checking project had a landing page, but he admitted it might be beneficial to have one.

“I’ve never done a full accounting [of the Kansas Fact Meter],” Carpenter said in a phone interview with the Reporters’ Lab. “Having a page…people could go to directly — or a link to all of them — is a great idea.”

The Lab’s report noted that local news organizations that partnered with PolitiFact as one of the national fact-checker’s state affiliates got a boost from working with a website that was already structured in ways to help generate traffic. But half of the state and local fact-checking sites the Lab’s student researchers reviewed were more like the Kansas Fact Meter — a standalone project, often championed in the newsroom by a handful of journalists, like Carpenter.

Carpenter said he wished regional news organizations had more technological resources and research assistance at hand in order to make fact-checks easily accessible.

“It’s probably my fault for not hitting that designation when I file stories,” he said, referring to a specific tag on the Capital-Journal’s publishing platform that would easily group fact-checks on the site. “I will do better.”
