Friday, May 20, 2011

Oklahoma Democratic Party Chair Election - With Maps!




It is a long time until we get to 2012. A long time before an electoral map junkie like me gets to wonk out about polls and predictions and electoral votes and whatnot. Fortunately for me, I was selected as a delegate to the Oklahoma Democratic Party's state convention last weekend at the Bricktown Hotel and Convention Center (conveniently located nowhere near Bricktown).

The contest for chair of the party provided that perfect combination of geography and politics that I so crave. Former state representative Wallace Collins (D-Norman) ended up winning the position, but it took three votes and approximately four hours to get him there. After the first vote, a problem with the credentials of the delegates was discovered, which caused a delay of a couple of hours and resulted in a huge amount of business for the hotel's bar. Then once everything was finally worked out and everyone was re-credentialed, a second vote was held, but by then two of the candidates had dropped out. After those votes were tallied, a final runoff was then held between Wallace Collins and Dana Orwig, a favorite of a constituency of delegates calling themselves True Blue Democrats. It was initially announced erroneously that the runoff would be between Collins and Mannix Barnes, a candidate with strong support from the southeastern Oklahoma region known as "Little Dixie", an area that traditionally leans Democratic in state politics.

I kept track of the vote tallies as each county official announced them for all three votes. Then I made my own electoral cartogram of the state, with the size of each county roughly proportional to the number of delegates allotted to it. I say "roughly" because the presence of ex-officio delegates and party officials made it impossible to find exactly how many delegates each county was allotted, so the size of each county in my maps is determined by the larger of two numbers: the number of delegates allotted according to the formula in the Oklahoma Democratic Party's constitution (which is based on votes cast in statewide elections), or the largest number of that county's delegates to vote in any round of voting for the chair. There are plenty of holes in the map because not every delegate showed up to the convention, including whole counties like Mayes and Sequoyah and the entire panhandle region.
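For the curious, the sizing rule boils down to a simple max() of two numbers per county. Here's a minimal sketch; the formula divisor and all the county figures below are made-up placeholders, not the party's actual numbers:

```python
# Sketch of the county-sizing rule for the cartogram.
# The divisor and all example figures are hypothetical placeholders,
# not the Oklahoma Democratic Party's actual formula.

def allotted_delegates(statewide_votes, divisor=2000):
    """Delegates per a simplified stand-in for the party's formula:
    one delegate per `divisor` statewide Democratic votes, minimum 1."""
    return max(1, statewide_votes // divisor)

def county_size(statewide_votes, round_turnouts):
    """Cartogram size: the larger of the formula allotment and the
    biggest number of delegates to vote in any round."""
    return max(allotted_delegates(statewide_votes), max(round_turnouts))

# Made-up example: formula gives 20, best turnout was 15, so size = 20.
print(county_size(41_000, [12, 15, 14]))  # 20
```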

Anyways, here are the maps.







Friday, May 13, 2011

PolitiFact Truth Index, Part 4: Breakdown by Occupation




PolitiFact came into existence prior to the 2008 presidential campaign. Its sole purpose back then was to serve as a check on the claims made by presidential candidates. After the election, PolitiFact changed focus. Now they fact-check policy statements made not only by politicians but also by journalists, rabble-rousers, and anyone whose statements catch the eyes and ears of PolitiFact's writers and editors. Despite branching out into statements made by people who cannot be thrown out of office, the overwhelming majority of claims analyzed still lies in the domain of politicians.




Most of these claims follow the trend that I have compiled here, here and here; that is, Republicans tend to lie more than Democrats do. Below is a chart showing the breakdown of the PolitiFact Truth Index by occupation of the speaker.




Maybe it's not surprising that PolitiFact, an organization of journalists, generally gives fellow journalists good marks (the term "journalist" is used loosely to incorporate not just reporters but also regular opinion columnists and the personalities of the cable news networks). Republican activists tend to lie more than other types of Republicans. And party boosters (basically any organization with "Democrat" or "Republican" in its name, or such an organization's spokesman), whose sole function is to get folks to elect Democrats or Republicans, seem to lie at a shockingly high rate independent of party affiliation.

The biggest surprise is in the category of advocacy groups. Since advocacy groups are basically organizations composed of another category, activists, I expected their numbers to be similar. But they're not. Since March, Democratic advocacy groups have mostly lied, while Republican advocacy groups have mostly made factual statements. I am interested in seeing whether this number changes as we get closer to the 2012 campaign.

As for politicians, PolitiFact grades Democratic politicians a little bit higher than the overall Democratic average. The explanation does not lie in the states: there is no real difference between the Truth Index grade for Democratic state legislators and that for Democrats overall. And the governor numbers carry little weight on the Democratic side, because most of the PolitiFact states have Republican governors; my limited database has 35 Republican gubernatorial claims but only one Democratic one, and an isolated claim carries very little weight.




The difference between Democratic politicians and Democrats as a whole lies in the numbers for the U.S. Senate. Of the 19 statements made by Democratic senators analyzed by PolitiFact in my database, only three were rated as something other than "true" or "mostly true". That doesn't sound right at all to me.



A large part of this has to do with PolitiFact Ohio's ratings of things that Sherrod Brown says. Sherrod Brown is the second most frequently analyzed Democrat after Barack Obama, and the most frequently analyzed member of the U.S. Senate, with six statements (2nd place: Rand Paul (4); 3rd place: a tie between Saxby Chambliss (3) and Marco Rubio (3)). It's not like Sherrod Brown is one of the more important members of the U.S. Senate; he just happens to represent a state with a PolitiFact branch. Of those six statements, five were rated by PolitiFact Ohio (known to be friendly to Democrats), and all five were rated "true" or "mostly true". The sixth was rated by PolitiFact National as "barely true".

And how do the presidential candidates stack up to Barack Obama? Not great.



But on the whole, at least they're not as bad as Donald Trump (-0.60 on 10 statements).
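For readers wondering what a number like -0.60 means: the index is an average of per-statement scores on a scale from -1 to +1. Here's a minimal sketch of that kind of computation; the rating-to-score weights below are illustrative placeholders, not necessarily the exact weights I use:

```python
# Sketch of a Truth Index: map each Truth-o-Meter rating to a score
# on a -1 to +1 scale and average them. These weights are illustrative.
RATING_SCORES = {
    "true": 1.0,
    "mostly true": 0.6,
    "half true": 0.2,
    "barely true": -0.2,
    "false": -0.6,
    "pants on fire": -1.0,
}

def truth_index(ratings):
    """Average score over a speaker's rated statements."""
    return sum(RATING_SCORES[r.lower()] for r in ratings) / len(ratings)

sample = ["true", "mostly true", "false", "half true"]
print(round(truth_index(sample), 2))  # 0.3
```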

Next: Whatever graphs I have left over!

Friday, May 06, 2011

PolitiFact Truth Index, Part 3: Breakdown by State. Selection Bias?



My attempt with the PolitiFact Truth Index is to sort out some objective measure of the truthiness of statements made by politicians, journalists, activists and advocacy groups. I have shown that the statements made by Republicans tend to be rated false by PolitiFact and the statements made by Democrats tend to be rated true by PolitiFact. I have jumped to the conclusion that this is because Republicans lie more than Democrats do. But the whole process of rating statements on a six-point truth scale necessarily involves a selection bias, a subjective process that picks which statements to select for analysis and which to ignore.

There is a universe of statements out there composed of verifiable facts that are significant to some swath of Americans, newsworthy in some way, and likely to be repeated by others. PolitiFact sees its Truth-o-Meter™ as a somewhat-objective measure of this universe of statements, like dunking a test tube into a lake and analyzing the contents under a microscope back at the lab. This would be highly useful to settle political debates with scientific evidence about the relative truthiness of each party’s statements. But fact-checking isn’t like sampling a lake: each individual would probably have a different definition about what constitutes a newsworthy or significant statement. Even if one assumes that PolitiFact can objectively pass judgment on the truthiness of any given statement, it’s hard to make the case that their statements are fully representative of all political discourse. The lake could be as big as an ocean, and your test tube may not be a representative sample of the water in it.

This is a point made over and over again on the many sites that criticize PolitiFact, sites written mostly from a conservative perspective. One of the most thorough smackdowns of PolitiFact since the beginning of this year is the blog PolitiFact Bias, whose writers have a beef not only with the selection bias, but also with the rating process, the final ratings themselves, and the selection of the "lie of the year". In one post, blogger Bryan White tries to make an objective case for partisan bias ("bolstered by methods of science") by compiling a list of errors in PolitiFact ratings going back to 2008 and pointing out that the overwhelming majority of them are unfavorable to Republicans or favorable to Democrats. But how does he determine which PolitiFact ratings are errors? He uses his own (conservative-leaning) judgment, fraught with even more partisan bias than PolitiFact's.

The fact is that it's pretty easy to claim that PolitiFact has a partisan bias based on anecdotal evidence, but it's pretty much impossible to prove such a claim unless PolitiFact and a bunch of other fact-checking websites were to analyze a given control set of statements, which we the public could then use to determine who is more biased, who is more nitpicky, and so on. (Actually, this would be a great idea.)

But until that happens, one can only compare PolitiFact with itself: specifically, with the affiliated state sites, which are run by different newspaper editorial boards and may exhibit a degree of independence from the national PolitiFact site. I was interested in knowing which newspapers' editors exhibit the most political bias toward or against one party or the other, so I averaged the PolitiFact Truth Index across each state by political party and came up with the following results.



PolitiFact Texas and PolitiFact Ohio both show an average Truth Index for Democrats more than one standard deviation above the cross-state mean, while PolitiFact Wisconsin shows a negative value for its Democrats (the only state with more Democratic lies than truths since the end of February). It could mean that local and state Democratic politicians in Texas and Ohio really do tell the truth more often than their counterparts in other states, but it could also be a measure of partisan bias toward Democrats by the editorial boards of the Austin American-Statesman and the Cleveland Plain Dealer.



PolitiFact Georgia and PolitiFact Virginia both show an average Truth Index for Republicans more than one standard deviation above the cross-state mean (the only states where Republicans have made more true statements than false ones), while Oregon's and Wisconsin's averages fall more than one standard deviation below it. Again, this could mean that Georgia's and Virginia's Republicans are more honest than Republicans everywhere else in the nation. But it could also mean that the Atlanta Journal-Constitution and the Richmond Times-Dispatch employ fact-checkers with a Republican bias.

I thought it was interesting that both Wisconsin's Democratic and Republican statements average more than one standard deviation below those of their fellow partisans in other states. I think this is at least partially due to the heated rhetoric surrounding the labor fight in that state, and specifically the local and judicial elections that followed. As I showed in Part 2, statements about elections (workings of government) and personal attacks tend to be rated false across the board, no matter which party makes them.
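The state comparison above boils down to a group-by average plus a one-standard-deviation screen. Here's a sketch of that computation with made-up statement scores (none of these are the real state averages):

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical per-statement records: (state site, party, statement score).
records = [
    ("Texas", "D", 0.8), ("Texas", "D", 0.6), ("Ohio", "D", 0.7),
    ("Wisconsin", "D", -0.2), ("Georgia", "R", 0.4),
    ("Virginia", "R", 0.3), ("Oregon", "R", -0.8),
]

def state_averages(records, party):
    """Average statement score per state site for one party."""
    by_state = defaultdict(list)
    for state, p, score in records:
        if p == party:
            by_state[state].append(score)
    return {s: mean(scores) for s, scores in by_state.items()}

def outliers(averages):
    """States more than one standard deviation from the cross-state mean."""
    vals = list(averages.values())
    m, sd = mean(vals), stdev(vals)
    return {s: a for s, a in averages.items() if abs(a - m) > sd}

dem = state_averages(records, "D")
print(outliers(dem))  # only Wisconsin stands out in this toy data
```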

How does the national version of PolitiFact rank against the states in terms of the Truth Index?



For Democrats, PolitiFact National averages very close to the state average. For Republicans, PolitiFact National scores much lower than the state average. Since the National branch of PolitiFact tends to analyze statements with a national policy scope more than the state branches do, one can look at this fact a couple of different ways. One is that Republicans with a national scope lie more often than their state and local counterparts do. Or, PolitiFact National has an anti-Republican partisan bias. Or perhaps it could be due to some other factor. Maybe Republican pants-on-fire ratings drive more traffic to the website through links on Facebook and liberal blogs, and PolitiFact’s editors could be motivated to sample more of these statements than other types.

The debate about PolitiFact won’t subside, but I do think that the website is a useful mythbusting tool even if it does have a selection bias.

Next: Politicians versus political commentators

Tuesday, May 03, 2011

PolitiFact Truth Index, Part 2: Breakdown by Subject



Republicans lie. On average, more than Democrats. At least according to the statements analyzed by PolitiFact.com. But do they lie at a uniform rate across any given policy statement? Are there some subject matters where Democrats are more likely to stretch the truth than Republicans are? I had to find out.

I categorized each statement into a fairly broad policy subject based not only on the words of the statement, but also on the point that the statement is trying to make. If the politician is saying something about Medicare, for instance, the intent of the comment might be to urge hospitals to improve care (in which case the statement would be in the Health Care category), or the intent might be to reduce the cost to the government (in which case the statement would be placed in the Government Spending category).

My PolitiFact database breaks each statement down into one of twenty subjects. The distribution of these subjects is shown in the pie chart below.



Recently the biggest subject of discussion has been issues related to government spending as both sides have waged budget battles over the amount of federal money distributed to government programs. Similar subjects have also been popular such as debt/deficit, taxes and the economy. (If a statement refers to both taxes and spending, it is counted as a statement about the debt/deficit).

Labor issues are also popular subjects in this database due to the recent protests in Wisconsin (and also due to the fact that Wisconsin is one of the eight states where PolitiFact partners with a local newspaper for state and local fact-checking). Labor issues incorporate collective bargaining issues, statements made about the benefits of public employees, and statements made about unions.

Which types of statements are usually true across all parties? Government spending issues. Both Republicans and Democrats score highly in the PolitiFact Truth Index when it comes to issues relating to government spending, which is great because government spending issues are the largest category of statements. Politicians can be generally trusted when it comes to statements made about government spending.

Second Amendment issues also score highly across both parties, but only nine statements (five by Republicans, three by Democrats) have been recorded in my database. The sample isn't large, so the numbers are subject to wild swings. Public safety issues (mostly crime statistics) also score highly, although there has been only one identifiably Republican statement on the subject. Most public safety statements are made by local police officers or other non-partisan figures.

Which types of statements are usually lies? Personal attacks or boasts are more often false than true no matter which party makes the claim. Personal attacks are most often made in the heat of campaigns and encompass claims about an opponent's background or some statement he or she made in the past. It is the category of all the birther claims, but also of the accusations that certain politicians said they wanted to get rid of Medicare or other government programs.

Statements about the workings of government are also usually lies across all parties. This category includes statements about elections, including polls, demography, gerrymandering, vote counting and allegations of fraud; and it also includes statements about congressional rules and bureaucracy. Don't trust a politician when he/she says something like a recent poll shows him/her statistically tied with Obama for the 2012 presidency.

Which types of statements show Republicans lying as Democrats tell the truth? Labor issues, taxes, Obamacare, social issues, and environmental issues. Democrats can point to these subjects as areas where Democrats are on the right side of the truth and Republicans are on the wrong side, usually.

Are there any subjects where Democrats lie as Republicans tell the truth? So far just one: transportation. And this cannot be relied on as there have only been two Republican and four Democratic statements made.

Here's the chart:



Here's the main point: Republicans do score better than Democrats in some sizeable subjects, namely government spending and education. But Democrats do better in almost everything else: the economy, labor issues, the workings of government, debt/deficit, taxes, personal attacks, Obamacare, social issues, energy issues, environmental issues, and health care. The difference is vast when it comes to social issues (abortion, gay rights), environmental issues, Obamacare and taxes. On these policy issues, not only can one not trust Republicans, one can usually trust Democrats.

Next: PolitiFact's state newspaper affiliates: are there biases?