This week PolitiFact announced a new feature on their mobile phone app. They developed a chart of general truthiness against time so that trends could be seen more easily. In order to do this, they had to create some way of quantifying their own Truth-o-meter® ratings, so they assigned values to their ratings with positive numbers representing truths and negative numbers representing lies. Then they took the average of the last seven days and charted the results.
The feature is almost exactly like my PolitiFact Truth Index.
What did PolitiFact call this new feature? The PolitiFact Truth Index.
Coincidence? Who knows.
So now when you search for "politifact truth index" in Google, PolitiFact's site comes up first rather than mine. On the other hand, now that the PolitiFact Truth Index is actually a thing, maybe auxiliary search traffic to this site will increase.
Not that any of that actually matters.
There are differences between PolitiFact's method and mine, but most of them are arcane, with one big exception: PolitiFact's PolitiFact Truth Index doesn't break the numbers down by political party. This is probably an attempt to avoid any overt semblance of partisanship; otherwise PolitiFact would have to explain why Democrats have a higher Truth Index value than Republicans, which leads into the whole debate about selection bias versus actual lying, and from there to losing the non-partisan aura that is essential to their reputation as a source for the mainstream media.
I obviously don't have any of those qualms. I just charts 'em as I sees 'em, and I see Republicans lying (although recently the numbers have been evening out, as shown below).
Other differences between the Quibbling Potatoes PolitiFact Truth Index and the PolitiFact PolitiFact Truth Index have more to do with the calculation of points. My points for the six categories of statements, in descending truth order, are as follows: 1, 1, 0, 0, -1, -1. PolitiFact's are essentially as follows: 1, 0.5, 0, -0.5, -1, -1.5. They also multiply the result by 100 because it just looks cooler that way. And their average is calculated by the date on which they analyzed the statement, not by the date the statement was made. So even though PolitiFact claims to be tracking the "ups and downs of political discourse" with its Truth Index, they are really tracking the ups and downs of PolitiFact's choice of statements for the week. There are differences between those two methods, as you can see by comparing the chart below to the one above.
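The two scoring schemes can be sketched in a few lines of Python. The category names and the sample week of ratings are mine, made up purely for illustration; only the point values come from the discussion above:

```python
# Categories in descending truth order (standard Truth-o-meter labels).
CATEGORIES = ["True", "Mostly True", "Half True",
              "Mostly False", "False", "Pants on Fire"]

MY_POINTS = dict(zip(CATEGORIES, [1, 1, 0, 0, -1, -1]))
POLITIFACT_POINTS = dict(zip(CATEGORIES, [1, 0.5, 0, -0.5, -1, -1.5]))

def index(ratings, points, scale=1):
    """Average the point values of a batch of ratings, then scale.

    PolitiFact scales by 100; my version leaves the raw average.
    """
    return scale * sum(points[r] for r in ratings) / len(ratings)

# A made-up week of ratings, for illustration only:
week = ["True", "Half True", "False", "Pants on Fire", "Mostly False"]
my_value = index(week, MY_POINTS)               # (1+0-1-1+0)/5 = -0.2
pf_value = index(week, POLITIFACT_POINTS, 100)  # 100*(1+0-1-1.5-0.5)/5 = -40.0
```

The same ratings land on different spots on the two scales mostly because PolitiFact penalizes the bottom three categories an extra half point each.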
PolitiFact's method is so similar to one of the other quantification methods I tried, a 5-4-3-2-1-0 point system, that it was easy to recalculate my data PolitiFact's way and make a chart from it. So the chart below represents what PolitiFact's PolitiFact Truth Index would be if they broke their numbers down by political party.
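The conversion is easy because, as far as I can tell, the 5-4-3-2-1-0 scale is just a linear rescaling of PolitiFact's values: subtract 3 and halve. This mapping is my own observation, not anything PolitiFact publishes, but it checks out:

```python
# Map the old 5-4-3-2-1-0 points onto PolitiFact's published values.
old_scale = [5, 4, 3, 2, 1, 0]
politifact_scale = [(p - 3) / 2 for p in old_scale]
# -> [1.0, 0.5, 0.0, -0.5, -1.0, -1.5], matching PolitiFact's points
```

Since an average of linearly rescaled points is the same linear rescaling of the average, converting a whole week's index needs no per-statement rework.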
For the week ending July 1, 2011, Republicans had an Index value of -30.43 and Democrats had an Index value of -25.00.
PolitiFact's method makes everyone look a little worse, which is to be expected when three of the six categories are essentially downgraded by half a point each, so that a Pants-on-fire rating counts 50% more heavily on the bad side than a True rating does on the good side. But it still shows that, except for the week ending May 20, 2011, Democrats have scored better than Republicans in every week I've looked at so far.