Friday, May 06, 2011

PolitiFact Truth Index, Part 3: Breakdown by State. Selection Bias?

My aim with the PolitiFact Truth Index is to sort out some objective measure of the truthiness of statements made by politicians, journalists, activists and advocacy groups. I have shown that statements made by Republicans tend to be rated false by PolitiFact and statements made by Democrats tend to be rated true. I have jumped to the conclusion that this is because Republicans lie more than Democrats do. But the whole process of rating statements on a six-point truth scale necessarily involves a selection bias: a subjective process that picks which statements to analyze and which to ignore.

There is a universe of statements out there composed of verifiable facts that are significant to some swath of Americans, newsworthy in some way, and likely to be repeated by others. PolitiFact sees its Truth-o-Meter™ as a somewhat-objective measure of this universe of statements, like dunking a test tube into a lake and analyzing the contents under a microscope back at the lab. This would be highly useful for settling political debates with scientific evidence about the relative truthiness of each party’s statements. But fact-checking isn’t like sampling a lake: each individual would probably have a different definition of what constitutes a newsworthy or significant statement. Even if one assumes that PolitiFact can objectively pass judgment on the truthiness of any given statement, it’s hard to make the case that the statements it selects are fully representative of all political discourse. The lake could be as big as an ocean, and your test tube may not hold a representative sample of the water in it.

This is a point made over and over again on the many sites that criticize PolitiFact, sites written mostly from a conservative perspective. One of the most thorough smackdowns of PolitiFact since the beginning of this year has come from the blog PolitiFact Bias, whose writers have a beef not only with the selection bias, but also with the rating process, the final rating itself, and the selection of the “lie of the year”. In one post, blogger Bryan White tries to make an objective case for partisan bias (“bolstered by methods of science”) by compiling a list of errors in PolitiFact ratings going back to 2008 and pointing out that the overwhelming majority of them are unfavorable to Republicans or favorable to Democrats. But how does he determine which PolitiFact ratings are errors? He used his own (conservative-leaning) judgment, fraught with even more partisan bias than PolitiFact.

The fact is that it’s pretty easy to claim that PolitiFact has a partisan bias based on anecdotal evidence, but it’s pretty much impossible to prove such a claim unless PolitiFact and a bunch of other fact-checking websites were to analyze a common control set of statements that we, the public, could use to determine who is more biased, who is more nitpicky, and so on. (Actually, that would be a great idea.)

But until that happens, one can only compare PolitiFact with itself, specifically with the affiliated state sites that are run by different newspaper editorial boards and may exhibit a degree of independence from the national PolitiFact site. I was interested in knowing which newspapers’ editors exhibited the most political bias toward or against one party or the other. So I averaged the PolitiFact Truth Index across each state by political party and came up with the following results.
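The averaging described above can be sketched in a few lines of Python. The rating data and the numeric scale below are invented placeholders for illustration, not the actual Truth Index data; the idea is just to average each state’s scores by party and flag states whose averages sit more than one standard deviation from the cross-state mean.

```python
from statistics import mean, stdev

# Hypothetical sample data: each rated statement mapped to a numeric
# truth score (here, True = +3 down to Pants on Fire = -3). The scale
# and the numbers are assumptions made for this sketch.
ratings = {
    ("Texas", "D"): [3, 2, 3, 1],
    ("Ohio", "D"): [3, 3, 2, 2],
    ("Wisconsin", "D"): [-1, 0, -2, 1],
    ("Georgia", "R"): [2, 1, 3, 0],
    ("Wisconsin", "R"): [-3, -2, -1, -2],
}

def state_outliers(ratings, party):
    """Average the scores per state for one party, then flag states
    whose average lies more than one standard deviation from the
    mean of the state averages."""
    avgs = {state: mean(scores)
            for (state, p), scores in ratings.items() if p == party}
    m, s = mean(avgs.values()), stdev(avgs.values())
    return {state: avg for state, avg in avgs.items()
            if abs(avg - m) > s}

print(state_outliers(ratings, "D"))  # → {'Wisconsin': -0.5}
```

With these made-up numbers, Wisconsin’s Democrats fall more than a standard deviation below the other states, which mirrors the kind of comparison made in the results below.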

PolitiFact Texas and PolitiFact Ohio both exhibited average Truth Index values more than one standard deviation above the mean for Democrats, while PolitiFact Wisconsin showed a negative value for its Democrats (the only state with more Democratic lies than truths since the end of February). It could mean that local and state Democratic politicians in Texas and Ohio really do tell the truth more often than their counterparts in other states, but it could also be a measure of partisan bias toward Democrats by the editorial boards of the Austin American-Statesman and the Cleveland Plain Dealer.

PolitiFact Georgia and PolitiFact Virginia both exhibited average Truth Index values more than one standard deviation above the mean for Republicans (the only states where Republicans have made more true statements than false ones), while Oregon’s and Wisconsin’s averages fall more than one standard deviation below it. Again, this could mean that Georgia’s and Virginia’s Republicans are more honest than those everywhere else in the nation. But it could also mean that the Atlanta Journal-Constitution and the Richmond Times-Dispatch employ fact-checkers with a Republican bias.

I thought it was interesting that both Wisconsin’s Democratic statements and Republican statements averaged more than one standard deviation below the statements made by fellow Democrats and Republicans in other states. I think this is at least partially due to the heated rhetoric surrounding the labor fight in that state, and specifically to the local and judicial elections that followed. As I showed in Part 2, statements about elections (workings of government) and personal attacks tend to be rated false across the board, no matter which party makes the statement.

How does the national version of PolitiFact rank against the states in terms of the Truth Index?

For Democrats, PolitiFact National averages very close to the state average. For Republicans, PolitiFact National scores much lower than the state average. Since the national branch of PolitiFact tends to analyze statements with a national policy scope more than the state branches do, one can look at this fact a couple of different ways. One is that Republicans with a national scope lie more often than their state and local counterparts do. Another is that PolitiFact National has an anti-Republican partisan bias. Or perhaps it is some other factor entirely: maybe Republican pants-on-fire ratings drive more traffic to the website through links on Facebook and liberal blogs, and PolitiFact’s editors could be motivated to sample more of these statements than other types.

The debate about PolitiFact won’t subside, but I do think that the website is a useful mythbusting tool even if it does have a selection bias.

Next: Politicians versus political commentators


Bryan said...

"He used his own (conservative-leaning) judgment, fraught with even more partisan bias than PolitiFact."

For example?

Concrete examples are the bread and butter of systematic analysis.

Bryan said...

Steve, I can't seem to find any evidence that you looked at any of the assessments of PolitiFact hosted at the Sublime Bloviations site. If you passed judgment on my writing without having read it then I'd like to see you add an explanatory comment so that your readers do not place undue faith in your assessment.

You'd certainly be fair in noting my selection bias, since I admit to it repeatedly and conspicuously. The difference is that I discourage readers from using my sample as though it is random. As you've noted, PolitiFact doesn't bother making any such disclaimer. You've also misrepresented my post by conflating the method I say is bolstered by the methods of science with my subsequent presentation of my list of assessments. I do not claim to have accomplished the method I described with that list. The list is a drop in the bucket. It's evidence, but relatively weak evidence.

I work under no illusion that the list represents definitive evidence of a systemic bias at PolitiFact. But the number of times PolitiFact's grades go easy on Democrats and tough on Republicans does count as important and legitimate evidence supporting (not definitively) the charge of bias.

Steve said...

Bryan, thanks for commenting! It’s good to know that there are other people in this world who have even more of an obsession with PolitiFact rankings than I do. :)

The subject of your post on PolitiFact Bias that I linked to was “anecdotes and their role in helping to show bias in a body of work.” The method that you said was “bolstered by the methods of science” involved counting the errors and seeing which political party is helped by the error and which one is hurt by it. And while you do repeatedly note that you have a selection bias when it comes to noticing anti-Republican and pro-Democratic errors, you never note that the judgment used to classify PolitiFact’s statements as errors is entirely subjective. Your own list could never be used as a drop in the bucket of evidence showing a partisan bias at PolitiFact because it is less a list of “errors” and more a list of “stuff you don’t like”.

As an example, the most recent entry from that list in March was a rebuke of PolitiFact’s assignment of a ranking of “true” to a statement made by Michael Moore that said that 400 Americans were wealthier than half of all Americans combined. Basically you said that this statement wasn’t true because Moore did not specify which half of Americans he was talking about, and that PolitiFact ought to hold the statement to a higher standard of specific wording. I think most people would understand that the 400 wealthiest Americans were being compared to the less wealthy half of all Americans, and only those who were looking for errors (such as you) could interpret that statement in any other way.

As another example, the second-most recent entry on PolitiFact Bias was about PolitiFact Ohio’s rating of “false” for a statement by Rob Portman that expanded oil exploration and drilling would “immediately reduce our dangerous dependence on foreign oil.” While Red State concluded that the term “immediately” wasn’t very important because it’s still good policy, you went one further and claimed that even if the term “immediately” was important, the claim should be rated “barely true” or higher because expanded drilling would immediately reduce the political power of OPEC. Whether OPEC’s power would be diminished immediately is arguable either way, but you certainly can’t claim that the United States would be able to use less foreign oil immediately because of a change in oil exploration policy, which is what PolitiFact was checking. And PolitiFact was able to check this claim because it involves a quantitative argument (the amount of foreign oil consumed) rather than a qualitative one (the amount of OPEC’s political power).

Bryan, I enjoy reading your site. Your writing is of a higher quality than mine, your thoughts are actually based on an amount of reason (even though it’s conservative-leaning), and you don’t actually bloviate nearly as much as most conservatives do. :)

Steve said...

And I should have said something more along the lines of "fraught with the potential for even more partisan bias than PolitiFact." Their site has independent editors, while yours doesn't (just like mine). But since I can't quantify the bias, I shouldn't assume.

Bryan said...

Meh. Stupid interface ate my first reply attempt.

Bryan said...

1) My list is a list of PolitiFact's errors.

2) The Moore example is an example of PolitiFact error via their uneven application of standards (ignoring their criteria for precision and granting Moore charitable interpretation where they fail to do so in other cases).

3) Portman specifically alluded to "dangerous" dependence on foreign oil. Dependence on Saudi oil isn't dangerous when the Saudis can increase production at will to prevent economy-damaging energy price spikes. It's very dangerous when the Saudis, as part of OPEC, decide we're over a barrel to the point that they can cut us off in order to obtain political concessions.

It's not my fault that PolitiFact decided to take what may have been a statement of opinion by Portman and filtered out the danger in order to turn it into something they could fact check. You ought to be asking how they justify communicating to their readers that the opinions of others are wrong while avoiding giving their own opinions.

Bryan said...

And I should have said something more along the lines of "fraught with the potential for even more partisan bias than PolitiFact."

Yeah, I caught that. ;-)

Good thing I'm better at fact checking than they are. :-)

Their site has independent editors, while yours doesn't (just like mine). But since I can't quantify the bias, I shouldn't assume.

What's an "independent editor"? Somebody who isn't the author?

If PolitiFact follows the procedures they claim to follow, then at least three editors read where Grover Norquist tweeted "Withheld union dues fund half of Dem (Democratic) campaigns in Florida" and all of them assumed him to mean that half the funding for all the Democratic campaigns in Florida came from withheld union dues. This suggests that we shouldn't be too impressed with independent editors. The fact is they live and breathe in a newsroom culture that makes them trust their writers a bit too much and makes their own outlook on things a bit too comfortably homogeneous.

Steve said...

1) Your list is actually a list of your opinion of PolitiFact's errors. I do not agree with your assessment that they are errors. To be an error, a statement has to be compared to an objectively true result, something everyone can agree about (the spelling of a name, the amount of the public share of federal debt, the number of barrels of oil imported into the United States last year, etc.)

2) The Moore example is an example of you using the strictest application of language when it suits your purposes. Contrast this to the Portman example, where you actually ignore part of the language in order to make your point. You're telling me that Moore deserves less credit because he didn't specify which half of Americans he was talking about, but Portman deserves more credit because he didn't really mean to say "immediately"?

3) Portman claimed that the dependence on foreign oil would be "immediately" reduced. Whether or not the dependence is "dangerous" is completely beside the point, and it was certainly not the claim being checked by PolitiFact. You ought to be asking yourself how you can call these things "errors" when you're not even looking at the right part of the claim being debated.

Steve said...

And I'll admit, I should just have said "editor" instead of "independent editor", which I meant as a way of saying "someone who's not the author".

See, I appreciated your stance on the Grover Norquist fact check. The language is too confusing to fact check.

Bryan said...

1) Inconsistency is an error. That means if PolitiFact does it one way one time and another way another time it is an error. Failure to follow one's own standards constitutes an error. You don't appear to take that into account.

2) I don't grade the statements myself. I simply note what we ought to conclude if PolitiFact's standards were applied evenly. PolitiFact gave Moore a "True" based on charitable interpretation and gave Portman a "False" while withholding charitable interpretation. Where charitable interpretation is not applied evenly in objective reporting, an error has occurred.

3) Why are we assuming that PolitiFact checked the right claim?

Bryan said...

Re: Norquist

Coincidentally or not, if one takes Norquist to refer to half of all Democratic campaigns receiving *some* support from union dues withheld through state policy, the claim checks out.

As I've said on more than one occasion, everyone deserves charitable interpretation (even Michael Moore).

Steve said...

1) I'm not even going to agree with you that PolitiFact is being inconsistent, at least not with the two examples I used earlier today. Context and intent guide the process in both regards.

2) "I don't grade the statements myself." Sometimes you do. In the Portman case, you offered that the statement should be at least "barely true". That sounds like a grade to me.

In neither the Moore case nor the Portman case was a "charitable interpretation" necessary. I think "common sense interpretation" is more of an accurate term. I don't have any doubt that Moore was talking about the least wealthy half of Americans, so I don't have to give him the benefit of the doubt to be able to interpret his statement. Similarly...

3) Why are you assuming that you checked the right claim? The Portman fact-check was less about Portman and more about a Republican talking point that gets repeated over and over, which is that expanded oil exploration would immediately reduce our need for foreign oil imports. I don't have any doubt that the "immediate" part of Portman's statement was part of the salesman-like pitch for drilling, which is why it is a necessary element to the claim.

I do have a doubt that Norquist was talking about unions putting up half the fundraising dollars of all Democrats in Florida combined, which is why I agree with your interpretation for that fact-check.

Bryan said...

1) Agree with me or not, you can't reasonably assess my objectivity without reaching some sort of determination on that point. If your determination is subjective then you undercut your argument about my lack of objectivity. I will eagerly discuss the issue in terms of evidence.

2) The grade I suggested for Portman agrees with my statement that I assess in terms of the PolitiFact system (which I think is terminally flawed). Were I offering my own grade I might well say Portman was "Absolutely Correct." Your statement on charitable interpretation is self-contradictory. I'll be happy to explain if you don't see it.

3) It's bad form to answer a question with a question. I grade PolitiFact on whether it operates consistently, and that includes taking them to task over things like arbitrarily deciding what fact to check in a given statement. Where the standard varies in objective reporting, an error has occurred. So, in answer to your question (with its fallacious complexity), I do not assume which fact is to be checked. I assess claims according to every interpretation I can think of and subsequently evaluate which represent likely intent and which are plausible/implausible. My methods are outstandingly consistent compared to PolitiFact's.

Steve said...

I am assessing your objectivity by stating that your judgment is subjective. Which is basically my whole point today, that you have an unacknowledged partisan bias when it comes to claiming errors in PolitiFact's methods. You acknowledge your partisan selection bias, but you don't acknowledge your bias when it comes to determining whether or not PolitiFact is in error. It's a determination that cannot be made objectively, and the fact that you can't acknowledge it is making me want to take back the good things I said earlier about your ability to reason.

Bryan said...

I am assessing your objectivity by stating that your judgment is subjective.

Sounds like your conclusion precedes the evidence, based on that statement.

you have an unacknowledged partisan bias when it comes to claiming errors in PolitiFact's methods.

I completely acknowledge my bias. What I do not acknowledge (and you give me no reason to do so) is that my bias significantly affects my fault-finding wrt PolitiFact.

Maybe if you could present evidence of it without flubbing up?

(Y)ou don't acknowledge your bias when it comes to determining whether or not PolitiFact is in error

Bias is irrelevant in determining whether PolitiFact errs. At that point it is an issue of whether or not I am correct, not whether I am biased. As I've already said, I am always eager to discuss that issue based on the facts.

It's a determination that cannot be made objectively

Why not?

(T)he fact that you can't acknowledge it is making me want to take back the good things I said earlier about your ability to reason.

That sounds reasonable. ;-)

Steve said...

Too bad Blogger ate the other ten comments, apparently.

Bryan said...


Sorry I missed your replies, and I miss my part of the conversation as well. Been busy, but I'll get around to reiterating a few things.

Bryan said...

Heh. I'm glad I dragged my feet on getting back here to restate my case. Blogger finally found those lost comments.

I look forward to your reply, Steve. Perhaps this post will bring your attention back to the discussion, particularly the question as to why PolitiFact errors cannot be detected objectively.

Bryan said...

So are you on vacation or what?