My attempt with the PolitiFact Truth Index is to work out some objective measure of the truthiness of statements made by politicians, journalists, activists, and advocacy groups. I have shown that statements made by Republicans tend to be rated false by PolitiFact and statements made by Democrats tend to be rated true, and I have jumped to the conclusion that this is because Republicans lie more than Democrats do. But the whole process of rating statements on a six-point truth scale necessarily involves a selection bias: a subjective process determines which statements are chosen for analysis and which are ignored.
There is a universe of statements out there composed of verifiable facts that are significant to some swath of Americans, newsworthy in some way, and likely to be repeated by others. PolitiFact sees its Truth-o-Meter™ as a somewhat-objective measure of this universe of statements, like dunking a test tube into a lake and analyzing the contents under a microscope back at the lab. This would be highly useful for settling political debates with scientific evidence about the relative truthiness of each party's statements. But fact-checking isn't like sampling a lake: each individual would probably have a different definition of what constitutes a newsworthy or significant statement. Even if one assumes that PolitiFact can objectively pass judgment on the truthiness of any given statement, it's hard to make the case that the statements it selects are fully representative of all political discourse. The lake could be as big as an ocean, and your test tube may not be a representative sample of the water in it.
This is a point made over and over again on the many sites that criticize PolitiFact, most of them written from a conservative perspective. One of the most thorough smackdowns of PolitiFact since the beginning of this year is the blog PolitiFact Bias, whose writers have a beef not only with the selection bias but also with the rating process, the final ratings themselves, and the selection of the "Lie of the Year."
In one post, blogger Bryan White tries to make an objective case for partisan bias ("bolstered by methods of science") by compiling a list of errors in PolitiFact ratings going back to 2008 and pointing out that the overwhelming majority of them are unfavorable to Republicans or favorable to Democrats. But how does he determine which PolitiFact ratings are errors? He uses his own (conservative-leaning) judgment, fraught with even more partisan bias than PolitiFact itself.
The fact is that it's pretty easy to claim that PolitiFact has a partisan bias based on anecdotal evidence, but it's nearly impossible to prove such a claim unless PolitiFact and a number of other fact-checking websites analyze the same control set of statements, which the public could then use to determine who is more biased, who is more nitpicky, and so on. (Actually, that would be a great idea.)
But until that happens, one can only compare PolitiFact with itself, specifically with the affiliated state sites, which are run by different newspaper editorial boards and may exhibit a degree of independence from the national PolitiFact site. I was interested in which newspapers' editors exhibited the most political bias toward or against one party or the other. So I averaged the PolitiFact Truth Index across each state by political party and came up with the following results.
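The state-by-state comparison described above can be sketched in a few lines of code. This is a minimal illustration, not my actual analysis pipeline: the data below is placeholder data, not the real PolitiFact ratings, and the numeric Truth Index values are invented for the example.

```python
import statistics
from collections import defaultdict

def state_averages(ratings):
    """Average the Truth Index per (party, state) pair."""
    buckets = defaultdict(list)
    for state, party, score in ratings:
        buckets[(party, state)].append(score)
    return {key: statistics.mean(vals) for key, vals in buckets.items()}

def flag_outliers(means):
    """Flag states more than one standard deviation from their party's mean."""
    by_party = defaultdict(dict)
    for (party, state), m in means.items():
        by_party[party][state] = m
    flags = {}
    for party, states in by_party.items():
        mu = statistics.mean(states.values())
        sigma = statistics.pstdev(states.values())
        for state, m in states.items():
            if m > mu + sigma:
                flags[(party, state)] = "high"
            elif m < mu - sigma:
                flags[(party, state)] = "low"
    return flags

# Illustrative placeholder data: (state, party, truth_index).
ratings = [
    ("Texas", "D", 1.4), ("Texas", "D", 1.0),
    ("Ohio", "D", 1.1), ("Ohio", "D", 1.3),
    ("Wisconsin", "D", -0.4), ("Wisconsin", "D", -0.2),
    ("Nebraska", "D", 0.5), ("Nebraska", "D", 0.3),
]

means = state_averages(ratings)
print(flag_outliers(means))
```

The same two steps (per-state averaging within each party, then flagging against the party-wide mean and standard deviation) are what produce the charts below.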
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjXpyShhAuASK1SHpbVjmLXxqvsnocK7ROh8PX4RoyNINABBH6VBVw7EjnKaAKoRgEURYnB981ui_xw1RdWVdtAAmhOHdpoCwcCBsJ2W7R8OEOtwCjCna7LggqO85lyvt3hMfoKAA/s1600/States-dems+only.jpg)
PolitiFact Texas and PolitiFact Ohio both show an average Truth Index more than one standard deviation above the mean for Democrats, while PolitiFact Wisconsin shows a negative value for its Democrats (the only state with more Democratic lies than truths since the end of February). This could mean that local and state Democratic politicians in Texas and Ohio really do tell the truth more often than their counterparts in other states, but it could also be a measure of partisan bias toward Democrats by the editorial boards of the Austin American-Statesman and the Cleveland Plain Dealer.
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSTyrHqp_G6GrOFOXijSPi7lb398pPiVKpXaV4IpwVrcnb9mKxe1T-Gwee6wQE0cb-Q9yA-vvLe6rQRwrzpiM7nefi1U3dK23-MRM1piHqavYap00n9w3lsrXtgS1UavHNeYLlLQ/s1600/States-GOP+only.jpg)
PolitiFact Georgia and PolitiFact Virginia both show an average Truth Index more than one standard deviation above the mean for Republicans (the only states where Republicans have made more true statements than false ones), while Oregon's and Wisconsin's averages fall more than one standard deviation below it. Again, this could mean that Georgia's and Virginia's Republicans are more honest than Republicans everywhere else in the nation. But it could also mean that the Atlanta Journal-Constitution and the Richmond Times-Dispatch employ fact-checkers with a Republican bias.
I thought it was interesting that both Wisconsin's Democratic and Republican statements averaged more than one standard deviation below those made by their fellow partisans in other states. I think this is at least partially due to the heated rhetoric surrounding the labor fight in that state, and specifically to the local and judicial elections that followed. As I showed in Part 2, statements about elections (the workings of government) and personal attacks tend to be rated false across the board, no matter which party makes them.
How does the national version of PolitiFact rank against the states in terms of the Truth Index?
![](https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhNR5WdL8VaHntxU2YIr2MbV0yRYdpXxuM-f9S3sqtIhO8X3H1v1j0UOOiNHqj4ABCOqKWHASnKEjmjqsmgJzhx1Q83cOSn_M6Rax1qumQjWZw6FBzM843tFolyWDAgceCkRptoSA/s1600/States-versus+national.jpg)
For Democrats, PolitiFact National averages very close to the state average. For Republicans, PolitiFact National scores much lower than the state average. Since the national branch of PolitiFact tends to analyze statements with a national policy scope more often than the state branches do, one can interpret this a few different ways. One is that Republicans with a national scope lie more often than their state and local counterparts do. Another is that PolitiFact National has an anti-Republican partisan bias. Or some other factor may be at work: perhaaps aside, perhaps Republican Pants-on-Fire ratings drive more traffic to the website through links on Facebook and liberal blogs, and PolitiFact's editors are motivated to sample more of those statements than other types.
The debate about PolitiFact won't subside, but I do think the website is a useful mythbusting tool even with its selection bias.
Next: Politicians versus political commentators