Fact checking the fact checkers

There are various well-established and well-known fact checkers that many rely upon, for example PolitiFact, FactCheck.org, or the Washington Post’s Fact Checker. However, one has to wonder if anybody independently fact-checks the fact checkers.

I was curious about this, so I looked into whether anybody had seriously researched it. To be honest, I would have been astonished to discover that nobody had, and that is not how things panned out: it is indeed a topic that has attracted serious research interest.

The 2015 Study

Published in Critical Review: A Journal of Politics and Society is a 2015 paper by Michelle Amazeen entitled “Revisiting the Epistemology of Fact-Checking”. It is a response to a 2013 paper by Joseph E. Uscinski and Ryden W. Butler, which had argued that fact-checking was useless because the methods fact-checkers use to select statements, consider evidence, and render judgment fail to stand up to the rigors of scientific inquiry.

This 2015 paper successfully argues that the earlier paper was seriously flawed: its authors had sampled from sources that do not fact-check on a consistent basis, and many of the criticisms it raised were not actually backed up by any empirical evidence.

What exactly did the study do?

They looked at political ads from the 2008 and 2012 presidential elections …

political ads from the 2008 and 2012 presidential election serve as the framework within which fact-checking is examined. Because they offer a bounded source of planned, strategically crafted messaging by political figures, political ads minimize the likelihood of off-the-cuff remarks or gaffes. Other forms of political communication can be less predictable with interruptions or opportunities to misspeak as well as be less precise

They then evaluated three specific fact-checking organisations, eliminating others that either had no consistent methodology or were no longer fact-checking …

Ad claim accuracy is based upon the evaluations of FactCheck.org, PolitiFact.com, and the Washington Post’s Fact Checker. Other organizations such as Spinsanity and newspapers such as the New York Times have conducted fact-checking. However, Spinsanity disbanded prior to the 2008 election, and the political ad coverage of the Times did not use consistent methods to assess the accuracy of the ads

they are enduring fact-checking organizations on a national level that continue to operate beyond specific election cycles. Furthermore, they are considered the three elite, national fact-checkers

OK, so what exactly did they discover?

Well, this is what they examined …

Among the 491 collected television ads from both elections, the three fact-checkers published evaluations of 192 of them. Many of the ads, however, were only evaluated by a single fact-checker. Across both elections, only 65 ads drew evaluations from at least two of the fact-checkers. Within these 65 ads, 150 different claims were scrutinized.

… and the result of this analysis is that they all more or less agreed. To be precise, the level of agreement was in the range of 98-100%.

That is very encouraging.

They did not conspire to align; instead, using distinctly different fact-checking methodologies, they independently reached more or less the same conclusions.

This does not tell us that all fact-checkers get it right all the time. What it does tell us is that, when it comes to evaluating claims in political ads from presidential races, the leading U.S. fact-checkers are highly consistent in their assessments.
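
To make that agreement figure a little more concrete, here is a minimal sketch, in Python, of how pairwise agreement between two fact-checkers can be computed over the claims that both evaluated. The claims and verdicts below are invented purely for illustration and are not data from the study; the study also had to map the checkers’ differing rating scales onto comparable categories, a step this sketch sidesteps by assuming a shared vocabulary.

# Toy example: percentage agreement between two fact-checkers on the
# claims that both of them evaluated. All claims and verdicts are invented.
checker_a = {"claim_1": "false", "claim_2": "true", "claim_3": "half true"}
checker_b = {"claim_1": "false", "claim_2": "true", "claim_4": "false"}

shared = checker_a.keys() & checker_b.keys()   # claims rated by both checkers
matches = sum(checker_a[c] == checker_b[c] for c in shared)
print(f"{matches} of {len(shared)} shared claims agree "
      f"({100 * matches / len(shared):.0f}% agreement)")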

Who are the popular Fact-checkers today?

In a US context, while there are of course others, these are the mainstream, primary go-to sources …

FactCheck.org

This is a non-partisan, nonprofit website that describes itself as an “advocate for voters that aims to reduce the level of deception and confusion in U.S. politics.”

Fact Checker (The Washington Post)

A project of The Washington Post, known for grading politicians on the factual accuracy of their statements with one to four “Pinocchios.”

  • Created September 2007 by Post diplomatic writer Michael Dobbs specifically for the 2008 presidential campaign.
  • Ceased operation 4 November 2008.
  • Relaunched with a broader focus in January 2011, led by veteran Post diplomatic correspondent Glenn Kessler.
  • Rates statements by politicians, usually on a range of one to four Pinocchios—with one Pinocchio for minor shading of the facts and four Pinocchios for outright lies. If the statement is truthful, the person will get a rare “Geppetto.”

PolitiFact.com

A service of the Tampa Bay Times.

  • Created August 2007, it uses the “Truth-o-Meter” to rank the amount of truth in public persons’ statements.
  • 2009 Pulitzer Prize Winner.

Snopes.com

Focuses on, but is not limited to, validating and debunking urban legends and other stories in American popular culture.

TruthOrFiction.com

Validates and debunks urban legends, Internet rumors, e-mail forwards, and other stories of unknown or questionable origin.

Further Reading

Last year I wrote a posting in which I picked my personal selection of the top 5 fact-checking websites. It is not quite the list above, and it explained not only what each of them does, but also why I personally find them useful.

15 thoughts on “Fact checking the fact checkers”

  1. I don’t know if this comment was addressed to me (I got an email alert saying it was addressed to dave(somebody)). But I have an opinion about it, and since I do not view fact checkers as unbiased and ethical (I think FactCheck.org is typically ethical but not unbiased), it might as well have been directed at me.

    There is a Fallacious Argument often used (Causal Fallacy)…. “Tom lied last year, so therefore his statement today must be false…”

    Right. There is the genetic fallacy and the ad hominem fallacies that essentially follow that pattern.

    Your statement has just verified that you don’t see “fact-checkers” as being unbiased and ethical…

    I’ve made and continue to make such statements. But I don’t say that the fact checkers are biased and unethical therefore don’t believe anything they say. On the contrary, I’ve gone on the record a good number of times (most recently on Quora) to emphasize that biased doesn’t mean wrong. My message on the bias and ethical lapses in fact-checking is that people need to be aware of the bias and ethical failings and take those into account when reading fact checks (including mine, though I hope my ethical failings are less severe than theirs!).

    Also I would hope that conscientious fact-checkers would take those criticisms to heart and try to improve their products.

    Unfortunately their interest in doing so is so far below any baseline I would expect from ethical journalists that I can’t rule out bad faith on their part. The best argument against bad faith is the comical inconsistency they show, which argues for incompetence instead of a well-oiled conspiracy.

    This year’s latest update on the International Fact-Checking Network’s accountability follies:

    https://www.zebrafactcheck.com/fake-accountability-international-fact-checking-network/

    The next installment is in the works, summarizing what I know of the differences between what the IFCN says it does on accountability compared to what it actually appears to do.

  2. There is a Fallacious Argument often used (Causal Fallacy)…. “Tom lied last year, so therefore his statement today must be false…” Your statement has just verified that you don’t see “fact-checkers” as being unbiased and ethical… So, who do we believe now?

  3. 1) I assume the problems we have identified at PolitiFact are fixable. But we don’t see that PolitiFact has any interest in fixing them.

    2) There’s not enough data to judge fluctuation in the level of bias at PolitiFact (it’s not an easy thing to measure in the first place). The “Pants on Fire” bias changes, but normal variation (regression to the mean and the like) probably explains it as well as anything.

    3) When the International Fact-Checking Network came into being and assumed the role of gatekeeper for Facebook’s stable of fact-checking partners (IFCN verification is a prerequisite for becoming a partner), it instituted a system advertised as verifying compliance with various fact-checking principles. Since that effort started I have tried to work within that system while at the same time advocating improved transparency throughout it.

    When the IFCN started accepting complaints from the public, we took the opportunity to test the system. It appears to work very poorly at present. If the IFCN can be coaxed into improving that system to the point where it can pressure member organizations into improving their fact-checking, it may offer a route toward substantial improvements in mainstream fact-checking.

    There are a couple of things that seem to be standing in the way. The Poynter Institute owns the International Fact-Checking Network, which in turn is supposed to exercise oversight regarding Poynter-owned PolitiFact. The conflict of interest is obvious, but I’ve always hoped that would work to my advantage because it would pressure Poynter and the IFCN to exercise extreme care in dealing with complaints about PolitiFact. It’s safe to say experience has crushed that hope so far.

    You can read about that here (it’s four pages):

    https://www.zebrafactcheck.com/scandal-fact-checkers-fail-at-accountability/

    It’s not true, by the way, that PolitiFact Bias exists solely to discredit PolitiFact. The hope was always that criticism would help PolitiFact identify and fix problems. But it’s appropriate for PolitiFact to lose credit if it fails to respond to valid criticism with positive changes. Here’s one example of PolitiFact Bias updating an article to praise PolitiFact for making a positive change (one that we called for repeatedly over a period of years):

    https://www.politifactbias.com/2020/02/politifact-updates-its-website-makes.html

    Is that what a site that’s only interested in discrediting PolitiFact would do?

    Years ago, PolitiFact Bias received comments from readers saying it’s easy to criticize, why don’t you do better, that kind of thing. While our criticisms often pointed the way toward better fact-checking, I started Zebra Fact Check not too long after PolitiFact Bias to offer a model for improved fact-checking. Among the things we advocate:

    • transparent communications with experts
    • interpretation of claims using consistent application of the principle of charitable interpretation
    • rating system that relies on objective markers (we established a prototype) or else no rating system
    • prominent corrections (as soon as ZFC had a corrections page, it appeared prominently on the main menu)
    • use of politically polarized teams as a means of helping to neutralize biased blind spots such as confirmation bias (this one’s radioactive for most mainstream journalists, along with the next one)
    • staff transparent about ideology
    • serious attitude about seeking and executing corrections (every ZFC page features a “Report an Error” button). I’d develop this idea more fully if I had better programming skills (such as numbering complaint tickets and sending confirmation emails that recount the text of the complaint). A rough sketch of the idea follows below.
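
    By way of illustration only (this is not how any existing system actually works, and every name and detail here is made up), the ticket-numbering idea amounts to something like this:

import itertools

# Simple in-memory ticket counter; a real site would persist this somewhere.
_tickets = itertools.count(1)

def file_complaint(reporter_email: str, complaint_text: str) -> dict:
    """Assign the next ticket number and build a confirmation message
    that recounts the text of the complaint back to the reporter."""
    ticket = next(_tickets)
    confirmation = (
        f"To: {reporter_email}\n"
        f"Subject: Error report #{ticket} received\n\n"
        f"Thank you. You reported:\n\n{complaint_text}\n"
    )
    return {"ticket": ticket, "confirmation": confirmation}

print(file_complaint("reader@example.com", "The quoted figure is outdated.")["confirmation"])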

    Nobody pays me to do this. I do it because fact-checking needs improvement. And it’s important.

    • I appreciated the detailed and thoughtful response, Bryan; many thanks for taking the time to do that.

  4. When you described PolitiFact as a project of the Times you cited Wikipedia in support. It’s no longer a project of the Times, so your description was off.

    I’m guilty of assuming your description jibed with your source.

  5. Tried using my phone to leave a reply, but it didn’t appear; maybe it is sitting in a moderation queue? In any case, it was brief, so I’ll try to cover everything in this reply so there’s no need to publish both.

    1) The Wikipedia entry is actually out of date. PolitiFact is now owned directly by the Poynter Institute. The entry still mentions Eric Ostermeier’s criticism, right? I see it does.
    2) MB/FC uses a subjective system for rating websites. I don’t place any credence in it. But if you do, you can check out the assessment of Zebra Fact Check (me). High for factual content. Apparently it doesn’t bother Dave VZ that my factual content tends to undercut PolitiFact’s. I’d be looking for a way to resolve the discrepancy were I in his shoes.

    You’re dismissing my criticism of PolitiFact based on the genetic fallacy. You realize that, right?

    • Side Note: Comment delays happen because the caching system I use takes a bit of time to sync up. Only blatant spam (stuff that has lots of links) tends to go into a spam queue.

      Serious question (and perhaps going off on a bit of a tangent, but I’m genuinely curious) …

      1. Do you view PolitiFact as totally compromised and beyond redemption, or do you feel that the problem you perceive is fixable?
      2. Do you think the issue you perceive has remained consistent, is getting worse, or has improved?
      3. If indeed the problem you perceive is fixable, then what strategy do you think could be deployed to fix it?
  6. For starters, the Wikipedia entry you used is outdated. The Times transferred ownership of PolitiFact to the parent entity, the non-profit Poynter Institute, so it could likewise operate as a non-profit.

    It’s odd to find a skeptic judging things according to a heuristic instead of by looking at the specific evidence.

    More on that later when I have a bit more time.

    • For starters, the Wikipedia entry you used is outdated. The Times transferred ownership of PolitiFact to the parent entity, the non-profit Poynter Institute, so it could likewise operate as a non-profit….

      (Opens Wikipedia page and checks again). The very first line reads … “PolitiFact.com is a nonprofit project operated by the Poynter Institute” … the parent entity you refer to that owns both the Tampa Bay Times and also PolitiFact.

      Both are of course run by the same executive … but I guess you are familiar with all such details. To be honest, the transfer was long overdue; I personally think it should have happened a lot sooner.

      To sum up this comment, the Wikipedia article is up to date.

  7. The article Sam cited doesn’t do much to prove his point, but then again the Amazeen paper in your post really offers nothing to counter it. Sam’s not wrong. The liberal bias of the fact checkers comes through in claim selection and in slanted narratives. For example, each of the “elite three” fact checkers wrote fact checks downplaying or dismissing the idea that the Affordable Care Act cut Medicare. Why? Because the budgeted number did not go down year by year but instead decreased in relation to the projected rise without the law.

    When Trump administration budgets likewise slowed the growth of Medicaid (admittedly some rare individual years in some of the budgets showed a decline from a preceding year) the fact checkers suddenly developed confidence that cutting from a future baseline really was a cut.

    Maybe they’re not intentionally putting fingers on the scale. Maybe they’re just not significantly better than the rest of us at eliminating bias from our work. But actually looking at the evidence makes it hard to argue the fact checkers do not lean left.

    (before criticizing me for citing a biased website, consider that I’m the one who did the work. The work stands on its merits.)
    https://www.politifactbias.com/p/research.html

    • Hi Bryan … politifactbias vs politifact. Which to pick?

      Well, one is a project of the Tampa Bay Times, where multiple reporters and editors from the newspaper and its affiliated news media partners report on the accuracy of statements made by elected officials, candidates, their staffs, lobbyists, interest groups and others involved in U.S. politics.

      • Wikipedia documents it rather well – here
      • MediaBias ranks it very highly as the least biased – see here

      The other is your own personal blog with just one very specific agenda – discredit PolitiFact. Not peer reviewed, not academic, not neutral.

    • Indeed yes, fact checkers do have a rather strong bias against serial liars and gross dishonesty. Guilty as charged.

  8. You need more skepticism about Amazeen’s paper. Her research approach set an incredibly low bar for fact checker agreement. She simplified the fact checker ratings down to binary judgments and then used a value for Krippendorff’s alpha below the minimum level Prof. Krippendorff specified (justifying it with what looks like a ghost citation of Kathleen Hall Jamieson).
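
    As a toy illustration of that binarization concern (the ratings below are invented, not data from either study), collapsing a six-level scale such as PolitiFact’s Truth-O-Meter into a plain true/false judgment erases every disagreement that happens to fall on the same side of whatever cut-off is chosen:

# Invented example: two checkers rate the same five claims on a
# six-level scale. They never give the same rating, yet measured
# agreement jumps once the ratings are collapsed to true/false.
SCALE = ["pants on fire", "false", "mostly false",
         "half true", "mostly true", "true"]
checker_a = ["false", "mostly false", "half true", "mostly true", "true"]
checker_b = ["mostly false", "false", "mostly false", "half true", "mostly true"]

def agreement(a, b):
    return sum(x == y for x, y in zip(a, b)) / len(a)

def binarize(ratings, cutoff=3):
    # The cut-off (here, "half true" and below count as false) is
    # arbitrary and chosen purely for this illustration.
    return ["false" if SCALE.index(r) <= cutoff else "true" for r in ratings]

print(agreement(checker_a, checker_b))                      # 0.0 on the full scale
print(agreement(binarize(checker_a), binarize(checker_b)))  # 0.8 after collapsing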

    Chloe Lim’s later study was a modest improvement on Amazeen’s, and it reached very different conclusions.

    https://journals.sagepub.com/doi/full/10.1177/2053168018786848

