We rate this true: The DeWitt Wallace Center will offer a new course in the spring titled “Fact-Checking American Politics.”
The course, taught by Bill Adair and Mark Stencel, will examine the growth of political fact-checking by organizations such as the Washington Post, PolitiFact and FactCheck.org, along with dozens more across the United States and around the world. Students will learn advanced techniques for researching political claims by candidates and elected officials and will write fact-check articles.
Adair, the Knight Professor of the Practice of Journalism and Public Policy, is the founder of PolitiFact and the International Fact-Checking Network. He worked 24 years for the St. Petersburg Times (now Tampa Bay Times) and covered the White House, Congress and the U.S. Supreme Court. Adair and the PolitiFact team won the Pulitzer Prize for National Reporting in 2009.
Mark Stencel is co-director of the Reporters’ Lab at Duke, where he tracks the spread and impact of fact-checking and teaches courses in political journalism. His introduction to fact-checking was working as a political researcher at The Washington Post during the 1992 presidential campaign. He has been a senior editor and media executive at The Post, Congressional Quarterly (now CQ-Roll Call) and National Public Radio.
The course will focus on independent analysis and advanced research techniques, including the importance of obtaining original documents and relying on multiple sources. Students will learn how to analyze claims and determine ratings. They also will learn how to identify, track and rate campaign promises.
Adair and Stencel will emphasize clear, well-argued, persuasive writing and well-supported fact-check ratings. They also will examine the impact of fact-checking on politicians and political discourse.
The course is listed as PJMS 390S – Special Topics in Journalism. It is cross-listed as PUBPOL 290S.
This summer, three Duke computer science majors advanced the quest for what some computer scientists call the Holy Grail of fact-checking.
Caroline Wang, Ethan Holland and Lucas Fagan tackled major challenges in creating an automated system that can both detect factual claims as politicians speak and instantly provide fact-checks.
That required finding and customizing state-of-the-art computing tools that most journalists would not recognize. A collective fondness for that sort of challenge helped a lot.
“We had a lot of fun discussing all the different algorithms out there, and just learning what machine learning techniques had been applied to natural language processing,” said Wang, a junior also majoring in math.
Wang and her partners took on the assignment for a Data+ research project. Part of the Information Initiative at Duke, Data+ invites students and faculty to find data-driven solutions to research challenges confronting scholars on campus.
The fact-checking team convened in a Gross Hall conference room from 9 a.m. to 4 p.m. every weekday for 10 weeks to figure out how to achieve live fact-checking, a goal of Knight journalism professor Bill Adair and other practitioners of accountability journalism.
Their goal was to produce a “rough cut” of end-to-end automated fact-checking: convert a political speech to text, identify the most “checkable” sentences in the speech and then match them with previously published fact-checks.
The students concluded that the Google Cloud Speech-to-Text API was the best available tool to automate audio transcriptions. They then submitted the sentences to ClaimBuster, a project at the University of Texas at Arlington that the Duke Tech & Check Cooperative uses to identify statements that merit fact-checking. ClaimBuster acted as a helpful filter that reduced the number of claims submitted to the database, which in turn reduced processing time.
They chose Google Cloud Speech-to-Text because it can infer where punctuation belongs, Holland said. That yields text divided into complete thoughts. Google’s speech-to-text service also shares transcription results while it is still processing the audio, rather than waiting until the transcription is done. That speeds up how quickly new text can move to the next steps in the fact-checking pipeline.
“Google will say: This is my current take and this is my current confidence that take is right. That lets you cut down on the lag,” said Holland, a junior whose second major is statistics.
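A minimal sketch of that first leg of the pipeline, assuming the Google Cloud Speech-to-Text Python client and ClaimBuster’s public scoring API, might look like the following. It uses the simpler batch transcription call rather than the streaming interface Holland describes, and the helper names, API key placeholder, response fields and 0.5 threshold are illustrative assumptions, not the students’ actual code.

```python
# Sketch: transcribe a speech, then keep only the sentences ClaimBuster
# scores as worth checking. Endpoint, response shape and threshold are
# assumptions for illustration.
from urllib.parse import quote

import requests
from google.cloud import speech


def transcribe(audio_path: str) -> list[str]:
    """Transcribe speech audio into punctuated sentences."""
    client = speech.SpeechClient()
    with open(audio_path, "rb") as f:
        audio = speech.RecognitionAudio(content=f.read())
    config = speech.RecognitionConfig(
        language_code="en-US",
        enable_automatic_punctuation=True,  # yields complete thoughts
    )
    response = client.recognize(config=config, audio=audio)
    text = " ".join(r.alternatives[0].transcript for r in response.results)
    # Naive sentence split; good enough for a rough cut.
    return [s.strip() + "." for s in text.split(".") if s.strip()]


def checkworthy(sentences: list[str], threshold: float = 0.5) -> list[str]:
    """Keep only sentences ClaimBuster scores above the threshold."""
    keep = []
    for sentence in sentences:
        resp = requests.get(
            "https://idir.uta.edu/claimbuster/api/v2/score/text/" + quote(sentence),
            headers={"x-api-key": "YOUR_CLAIMBUSTER_KEY"},  # hypothetical key
        )
        score = resp.json()["results"][0]["score"]
        if score >= threshold:
            keep.append(sentence)
    return keep


if __name__ == "__main__":
    claims = checkworthy(transcribe("speech.wav"))
    print("\n".join(claims))
```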
Their next step was finding ways to match the claims from that speech with the database of fact-checks that came from the Lab’s Share the Facts project. (The database contains thousands of articles published by the Washington Post, FactCheck.org and PolitiFact, each checking an individual claim.)
To do that, the students adapted an algorithm that the open-source research outfit OpenAI released in June, after the students started working together. The algorithm builds on the Transformer, a new neural network architecture that Google researchers had published just six months earlier.
The architecture changes how computers make sense of written language. Instead of processing a sentence one word at a time, the Transformer weighs how much each word contributes to the meaning of every other word. Over time, that approach helps machines discern meaning in more sentences, more quickly.
“It’s a lot more like learning English. You grow up hearing it and you learn it,” said Fagan, a sophomore also majoring in math.
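One common way to frame that matching step is as a nearest-neighbor search over sentence embeddings. The sketch below is not the students’ OpenAI-based model; it uses the open-source sentence-transformers library and an invented three-entry “database” simply to illustrate scoring a new claim against previously checked claims. The model name and similarity threshold are assumptions.

```python
# Illustrative claim matching via sentence embeddings and cosine similarity.
# The model, threshold and sample database are stand-ins for the example.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# Toy stand-in for a database of previously checked claims.
fact_check_db = [
    "The unemployment rate is at a 50-year low.",
    "Crime has doubled in the last four years.",
    "The U.S. spends more on health care per person than any other country.",
]
db_embeddings = model.encode(fact_check_db, convert_to_tensor=True)


def match_claim(claim: str, threshold: float = 0.6):
    """Return the closest previously checked claim, or None if nothing is close."""
    query = model.encode(claim, convert_to_tensor=True)
    scores = util.cos_sim(query, db_embeddings)[0]
    best = int(scores.argmax())
    return fact_check_db[best] if float(scores[best]) >= threshold else None


print(match_claim("Unemployment has never been lower in half a century."))
```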
Work by Wang, Holland and Fagan is expected to help jumpstart a Bass Connections fact-checking team that started this fall. Students on that team will continue the hunt for better strategies to find statements that are good fact-check candidates, produce pop-up fact-checks and create apps to deliver this accountability journalism to more people.
Tech & Check has $1.2 million in funding from the John S. and James L. Knight Foundation, the Facebook Journalism Project and the Craig Newmark Foundation to tackle that job.
FactStream, our iPhone/iPad app, has a new feature that displays the latest fact-checks from FactCheck.org, PolitiFact and The Washington Post.
FactStream was conceived as an app for live fact-checking during debates and speeches. (We had a successful beta test during the State of the Union address in January.) But our new “daily stream” makes the app valuable every day. You can check it often to get summaries of the newest fact-checks and then click through to the full articles.
By viewing the work of the nation’s three largest fact-checkers in the same stream, you can spot trends, such as which statements and subjects are getting checked, or which politicians and organizations are getting their facts right or wrong.
The new version of the app includes custom notifications so users can get alerts for every new fact-check or every “worst” rating, such as Four Pinocchios from Washington Post Fact Checker Glenn Kessler, a False from FactCheck.org or a False or Pants on Fire from PolitiFact.
The new daily stream was suggested by Eugene Kiely, the director of FactCheck.org. The app was built by our lead technologist Christopher Guess and the Durham, N.C., design firm Registered Creative. It gets the fact-check summaries from ClaimReview, our partnership with Google that has created a global tagging system for fact-checking. We plan to expand the daily stream to include other fact-checkers in the future.
The app also allows users to search the latest fact-checks by the name of the person or group making the statement, by subject or keyword.
FactStream is part of the Duke Tech & Check Cooperative, a $1.2 million project to automate fact-checking supported by the John S. and James L. Knight Foundation, the Facebook Journalism Project and the Craig Newmark Foundation.
FactStream is available as a free download from the App Store.
The Duke Reporters’ Lab is launching a global effort to get more publishers to adopt ClaimReview, an open-standard schema.org tagging system, or “markup,” that search engines and other major digital platforms use to find and highlight fact-checking articles.
The ClaimReview project, funded by a $200,000 grant from the Google News Initiative, will include a partnership with the International Fact-Checking Network, the global alliance of fact-checking organizations based at the Poynter Institute.
The Reporters’ Lab will develop instructional materials about ClaimReview and, with help from Google and Data Commons, assist publishers in adopting a new tool that makes the markup easier to create. The Lab also will work to expand the number of publishers around the world that are using ClaimReview. The IFCN will produce webinars and conduct outreach and training sessions at fact-checking conferences around the world, including the group’s annual Global Fact conference.
ClaimReview was developed three years ago through a partnership of the Reporters’ Lab, Google, and schema.org. It provides a standard way for publishers of fact-checks to identify the claim being checked, the person or entity that made the claim, and the conclusion of the article. The standardization enables search engines and other platforms to highlight the fact-checks in search results.
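As an illustration, the markup a publisher embeds in a fact-check article identifies those three elements roughly as follows. Here it is expressed as a Python dict that would be serialized to JSON-LD; the claim, names, dates, rating and URL are invented for the example.

```python
# Illustrative ClaimReview markup as a Python dict, ready to serialize to
# JSON-LD and embed in a fact-check article. All values are invented.
import json

claim_review = {
    "@context": "https://schema.org",
    "@type": "ClaimReview",
    "url": "https://example.org/fact-checks/example-claim",  # the fact-check article
    "datePublished": "2018-10-01",
    "author": {"@type": "Organization", "name": "Example Fact-Check Org"},
    "claimReviewed": "The example politician said X happened.",  # the claim being checked
    "itemReviewed": {
        "@type": "Claim",
        "author": {"@type": "Person", "name": "Example Politician"},  # who made the claim
        "datePublished": "2018-09-15",
    },
    "reviewRating": {
        "@type": "Rating",
        "alternateName": "False",  # the article's conclusion
    },
}

print(json.dumps(claim_review, indent=2))
```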
Google and Bing, the Microsoft search engine, both use ClaimReview to highlight fact-checking articles in search results and their own news products. Facebook announced in the summer that it plans to use ClaimReview as part of its partnership with fact-checkers.
The Reporters’ Lab uses ClaimReview as a key element in the Tech & Check Cooperative, our ambitious effort to automate fact-checking. Projects such as the FactStream app for iPhone and iPad and a new app being developed for television rely on the markup.
“ClaimReview is one of the untold success stories of the fact-checking movement,” said Bill Adair, director of the Reporters’ Lab. “It’s helping people find the facts in search results and helping fact-checkers increase their audience and impact.”
Despite the success, the Reporters’ Lab team estimates that roughly half the fact-checkers in the world still are not using the tagging system. Some fact-checkers have found it cumbersome to create the ClaimReview markup in their own publishing systems, while others have been confused about the different options for making it.
The new project will help editors switch to the new Google / Data Commons markup tool, a simpler way of generating ClaimReview, and provide technical assistance when they need it.
“We think of this project as ClaimReview 2.0,” Adair said. “This should expand the number of publishers using it, which should broaden the audience for fact-checking around the world.”