Friday, February 10, 2012

Income Tax Cuts - Mary Fallin's War on the Poor


It has been an absolutely wacky first week of the Oklahoma legislature's 2012 session. Foods containing aborted fetuses? Personhood definitions? This cake featuring Ronald Reagan's face and an extra "W"? This declaration that "any action in which a man ejaculates or otherwise deposits semen anywhere but in a woman's vagina" would be "an action against an unborn child"? This isn't even counting the stuff going down at Night Trips' newest location, the insurance commissioner's office.

But maybe the most important news from this week was officially unveiled at Governor Mary Fallin's State of the State address. Fallin introduced a plan to reduce the Oklahoma state income tax drastically, from a current top rate of 5.25% down to a rate of 3.5% for individuals making more than $35,000 per year or couples making more than $70,000 per year. Fallin said that a family making $40,000 per year would pay 37% less in taxes in 2013. Amazing, right? So, how will we pay for it?

The official word is that the $100,000,000 revenue shortfall in 2013 and the $300,000,000 shortfall in 2014 would be made up by eliminating "loopholes", "credits", "carve-outs", and other words for things that unscrupulous businesses and rich people with personal accountants take advantage of. But, as it turns out, eliminating just the transferable tax credits still leaves the state "woefully short" of making up the lost revenue. So that means eliminating a bunch of other tax credit programs that normal people use, like the personal exemption (claimed by 83% of taxpayers), the earned income tax credit, the child care tax credit, a tax credit that offsets the sales tax on groceries for poor people, et cetera.

As it turns out, Mary Fallin's tax reform proposal would actually result in tax increases for the bottom 55% of income earners, according to the Oklahoma Policy Institute. Now that's broadening the base!

The other way to pay for Fallin's tax reform proposal, according to Fallin, is growth: shovel fat stacks of Benjamins into the pockets of the Aubrey McClendons of the state, and the state's economy will grow fast enough to overcome the revenue shortfall. "New jobs and increased investments in Oklahoma will lead to more revenue and increased collections in sales tax, corporate tax, excise tax, and more." Tax cuts pay for themselves, y'all. If this smells exactly like an Arthur Laffer supply-side voodoo economics argument-turd, it's because Arthur Laffer's consulting firm helped create the plan with the right-wing think tank the Oklahoma Council of Public Affairs.

Oklahoma's economy has grown quite fast since the 2008-2009 recession in part due to our booming energy industry and in part because, well, nowhere to go but up, right? This should mean that bond ratings agencies would look more favorably at the credit rating of the state and its ability to pay back its bonds. But today Moody's announced that it was keeping the credit rating of the state steady, partly because of the tax reform plan, but mostly because our constitution makes it really hard to raise taxes.

From the Tulsa World:

The Moody's report says the state's financial situation is made stronger by a strong state constitutional balanced-budget requirement, a healthy state balance sheet, and substantial oil and gas reserves.

But the agency expressed concerns about the state's past and future plans to cut income taxes.

"The trend of decreasing income tax rates combined with the difficulty in increasing taxes constricts future financial flexibility," the Moody's report says.


So when bad economic times come again, tax reform will cripple the state, and we will be unable to afford many of the functions of government that people depend on, including the very people who will be paying higher taxes because of said tax reform.

Thursday, July 07, 2011

PolitiFact Truth Index, Part 5: That Other PolitiFact Truth Index

This week PolitiFact announced a new feature on their mobile phone app. They developed a chart of general truthiness against time so that trends could be seen more easily. In order to do this, they had to create some way of quantifying their own Truth-o-meter® ratings, so they assigned values to their ratings with positive numbers representing truths and negative numbers representing lies. Then they took the average of the last seven days and charted the results.
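That mechanism is easy to sketch. A minimal version in Python, assuming a simple plus-one/zero/minus-one point scale (the labels and values here are placeholders, not PolitiFact's actual ones):

```python
from datetime import date, timedelta

# Hypothetical point values: positive numbers for truths, negative for lies.
POINTS = {
    "True": 1, "Mostly True": 1, "Half True": 0,
    "Barely True": 0, "False": -1, "Pants on Fire": -1,
}

def truth_index(ratings, as_of, window_days=7):
    """Average the point values of ratings from the last `window_days` days.

    `ratings` is a list of (date, rating_label) pairs.
    Returns None if no ratings fall in the window.
    """
    cutoff = as_of - timedelta(days=window_days)
    scores = [POINTS[label] for d, label in ratings if cutoff < d <= as_of]
    return sum(scores) / len(scores) if scores else None

ratings = [
    (date(2011, 6, 27), "True"),
    (date(2011, 6, 29), "Pants on Fire"),
    (date(2011, 7, 1), "Half True"),
]
print(truth_index(ratings, as_of=date(2011, 7, 1)))  # (1 - 1 + 0) / 3 = 0.0
```

A real version would pull the (date, rating) pairs from PolitiFact's published ratings and recompute the average each day to get the trend line.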

The feature is almost exactly like my PolitiFact Truth Index.

What did PolitiFact call this new feature? The PolitiFact Truth Index.

Coincidence? Who knows.

So now when you search for "politifact truth index" in Google, PolitiFact's site comes up first rather than mine. On the other hand, now that the PolitiFact Truth Index is actually a thing, maybe auxiliary search traffic to this site will increase.

Not that any of that actually matters.

There are differences between PolitiFact's method and mine, but most of them are arcane, with one big exception: PolitiFact's PolitiFact Truth Index doesn't break the numbers down by political party. This is probably an attempt to avoid any overt semblance of partisanship. Otherwise PolitiFact would have to defend why Democrats have a higher Truth Index value than Republicans, which would drag them into the whole debate about selection bias versus actual lying, and they would lose the non-partisan aura that is essential to their reputation as a source for the mainstream media.

I obviously don't have any of those qualms. I just charts 'em as I sees 'em, and I see Republicans lying (although recently the numbers have been evening out, as shown below).




Other differences between the Quibbling Potatoes PolitiFact Truth Index and the PolitiFact PolitiFact Truth Index have more to do with the calculation of points. My points for the six categories of statements, in descending truth order, are 1, 1, 0, 0, -1, -1. PolitiFact's are essentially 1, 0.5, 0, -0.5, -1, -1.5, multiplied by 100 because it just looks cooler that way. Their average is also calculated by the date they analyzed the statement, not the date the statement was made. So even though PolitiFact claims to be tracking the "ups and downs of political discourse" with its Truth Index, they are really tracking the ups and downs of PolitiFact's choice of statements for the week. You can see the difference between those two methods by comparing the chart below to the one above.
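The two point schemes are easy to compare directly. A sketch in Python, with a made-up sample of ratings (the label spellings are my own shorthand):

```python
# The two point schemes described above, keyed by rating in descending
# truth order.
RATINGS = ["True", "Mostly True", "Half True", "Barely True",
           "False", "Pants on Fire"]
MY_POINTS = dict(zip(RATINGS, [1, 1, 0, 0, -1, -1]))
PF_POINTS = dict(zip(RATINGS, [1, 0.5, 0, -0.5, -1, -1.5]))

def index(points, labels, scale=1):
    """Average the point values of the given rating labels."""
    return scale * sum(points[l] for l in labels) / len(labels)

sample = ["True", "Mostly True", "Barely True", "Pants on Fire"]
print(index(MY_POINTS, sample))             # (1 + 1 + 0 - 1) / 4 = 0.25
print(index(PF_POINTS, sample, scale=100))  # 100 * (-0.5 / 4) = -12.5

# PolitiFact's values are just a 5-4-3-2-1-0 scale halved and shifted
# down by 1.5, i.e. a linear rescaling of that system.
assert all(PF_POINTS[r] == n / 2 - 1.5
           for r, n in zip(RATINGS, [5, 4, 3, 2, 1, 0]))
```

The same four ratings come out mildly positive under my scheme and negative under PolitiFact's, which is the everyone-looks-a-little-worse effect discussed below.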



PolitiFact's method is so similar to mine that it was easy to calculate and chart one of the other quantification methods I tried, a 5-4-3-2-1-0 point system (PolitiFact's point values are just that scale halved and shifted down by 1.5). So the chart below represents what PolitiFact's PolitiFact Truth Index would be if they broke their numbers down by political party.



For the week ending July 1, 2011, Republicans had an Index value of -30.43 and Democrats had an Index value of -25.00.

PolitiFact's method makes everyone look a little worse, which is to be expected when three of the six categories are downgraded by half a point, so that Pants-on-fire ratings are 50% worse on the bad side than True ratings are good on the good side. But it still shows that, except for the week ending May 20, 2011, Democrats have scored better than Republicans in every week I've looked at so far.

Monday, June 27, 2011

Michele Bachmann's No Flake, Unfortunately


With a mere 16 months to go before the general election in 2012, the field of Republican presidential candidates is coalescing into a smorgasbord of eight or nine hopefuls who will achieve outsized media attention for the next six months before six or seven of them drop out of the race. One of the strongest candidates, at least according to a poll in Iowa, is Minnesota Representative Michele Bachmann. But despite the recent media praise and favorable poll numbers, Chris Wallace of Fox News asked her on his Sunday show if she was "a flake". The prelude to this question was that Bachmann had a well-known reputation for, let's say, stretching the truth into a fact-free porridge of inventive talking points. But my main issue is with the definition of the term "flake", and why an insane-yet-committed Republican would be thought of as a flake.

Bachmann took immediate issue with Chris Wallace's question, calling it "insulting" and later stating how serious she was. And, for once, I have no reason to mistrust Bachmann. The Republican Party has been splitting for the last two years between moderate conservatives and the more right-wing Tea Party conservatives. With the ascendance of the Tea Party, the power center of the Republican Party as a whole has continued to veer right. Ten years ago, someone who was known for speculation that the dollar would be replaced by a multinational currency, who speculated that the Carter administration may have been responsible for a swine flu outbreak, who spread paranoia about the U.S. census, and who was responsible for some of the most egregious lies about the health care system may not have been able to speak for the Republican Party. But the times, they are a-changin'.

But this is all beside the point. Isn't a "flake" someone who sometimes doesn't show up to events they are expected to be at? Given a choice between candidate A who is spending her first official day on the campaign trail in Iowa, New Hampshire and South Carolina and made appearances on two national Sunday talk shows the day before her announcement; or candidate B, who is going to skip out on the Iowa caucuses and took a pass on the first real Republican debate: wouldn't candidate B look like the huge flake? How about candidate C who, despite being the assumed front-runner and who was the wealthiest candidate in 2008, will skip the Iowa straw poll in August because it is "an expensive proposition"?

But then, maybe I'm wrong in my interpretation of the term "flake". I was listening to NPR at lunch, and on Here and Now, Jay Newton-Small, congressional correspondent for Time Magazine, stated that Michele Bachmann was going to try to "get away from that bomb-throwing, that flakey type of reputation that she has." Wait, since when does bomb-throwing have anything to do with being a flake? The only connection that comes to my mind was H. Ross Perot in 1992, a bomb-thrower who later went on to quit his campaign despite pretty good poll numbers for a third-party candidate. And unfortunately, I think Michele Bachmann has more staying power than H. Ross Perot.

So now I'm confused. What does "flake" mean again? It's time I turned to that most esteemed and venerated source of terminology: Urban Dictionary. Aha! A flake is "an unreliable person; someone who agrees to do something, but never follows through." I thought so!

Chris Wallace later apologized for his "flake" question, saying it was too insulting. He did not, however, apologize for misinterpreting what a flake is. I'm still waiting, Mr. Wallace.

Friday, May 20, 2011

Oklahoma Democratic Party Chair Election - With Maps!




It is a long time until we get to 2012. A long time before an electoral map junkie like me gets to wonk out about polls and predictions and electoral votes and whatnot. Fortunately for me, I was selected as a delegate to the Oklahoma Democratic Party's state convention last weekend at the Bricktown Hotel and Convention Center (conveniently located nowhere near Bricktown).

The contest for chair of the party provided that perfect combination of geography and politics that I so crave. Former state representative Wallace Collins (D-Norman) ended up winning the position, but it took three votes and approximately four hours to get him there. After the first vote, a problem with the credentials of the delegates was discovered, which caused a delay of a couple of hours and resulted in a huge amount of business for the hotel's bar. Once everything was finally worked out and everyone was re-credentialed, a second vote was held, but by then two of the candidates had dropped out. After those votes were tallied, a final runoff was held between Wallace Collins and Dana Orwig, a favorite of a constituency of delegates calling themselves True Blue Democrats. It had initially been announced, erroneously, that the runoff would be between Collins and Mannix Barnes, a candidate with strong support from the southeastern Oklahoma region known as "Little Dixie", an area that traditionally leans Democratic in state politics.

I kept track of the vote tallies as each county official announced them for all three votes. Then I made my own electoral cartogram of the state, with the size of each county roughly proportional to the number of delegates allotted to it. I say "roughly" because I couldn't find exactly how many delegates each county was allotted, thanks to the presence of ex-officio delegates and party officials. So the size of each county in my maps is the larger of two numbers: the number of delegates allotted according to the formula in the Oklahoma Democratic Party's constitution (which is based on votes cast in statewide elections), or the largest number of delegates from that county to vote in any round of the chair election. There are plenty of holes in the map because not every delegate showed up to the convention, including whole counties like Mayes and Sequoyah and the entire panhandle region.
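The sizing rule amounts to taking a maximum. A rough sketch, with county names and delegate numbers invented for illustration (not the real convention tallies):

```python
def county_size(formula_allotment, votes_by_round):
    """A county's map size: the larger of its formula-based delegate
    allotment and the most delegates it cast in any round of voting."""
    return max(formula_allotment, max(votes_by_round, default=0))

# Made-up example counties: (formula allotment, delegates voting per round).
counties = {
    "Cleveland": (14, [12, 13, 15]),  # turnout in round 3 beats the formula
    "Tulsa": (20, [17, 18, 16]),      # the formula allotment wins
    "Mayes": (3, []),                 # nobody showed up; a hole in the map
}
for name, (allotment, rounds) in counties.items():
    print(name, county_size(allotment, rounds))
```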

Anyways, here are the maps.







Friday, May 13, 2011

PolitiFact Truth Index, Part 4: Breakdown by Occupation




PolitiFact came into existence prior to the 2008 presidential campaign. Its sole purpose back then was to serve as a check on the claims made by presidential candidates. After the election, PolitiFact changed focus. Now they fact-check policy statements made not only by politicians but also by journalists, rabble-rousers, and anyone whose statements catch the eyes and ears of PolitiFact's writers and editors. Despite branching out into statements made by people who cannot be thrown out of office, the overwhelming majority of claims analyzed still lies in the domain of politicians.




Most of these claims follow the trend that I have compiled here, here and here; that is, Republicans tend to lie more than Democrats do. Below is a chart showing the breakdown of the PolitiFact Truth Index by occupation of the speaker.




Maybe it's not surprising that PolitiFact, an organization of journalists, generally gives fellow journalists good marks (the term "journalist" is used loosely to incorporate not just reporters but also regular opinion columnists and the personalities of the cable news networks). Republican activists tend to lie more than other types of Republicans. And party boosters (basically any organization with "Democrat" or "Republican" in its name, or such an organization's spokesman), whose sole function is to get folks to elect Democrats or Republicans, seem to lie at a shockingly high rate independent of party affiliation.

The biggest surprise is in the category of advocacy group. Since advocacy groups are basically organizations composed of another category, activists, I expected their numbers would be similar. But they're not. Since March Democratic advocacy groups have mostly lied, while Republican advocacy groups have mostly made factual statements. I am interested in seeing if this number changes as we get closer and closer to the 2012 campaign.

As for politicians, PolitiFact grades Democrats a little bit higher than the overall average. The answer does not lie in the states: there is no real difference between the Truth Index grade for Democratic state legislators and Democrats overall. And the Democratic governor number is based on only a single statement; most of the PolitiFact states have Republican governors, so my limited database has 35 Republican gubernatorial claims but only one Democratic one, and an isolated claim carries very little weight.




The difference between Democratic politicians and Democrats as a whole lies in the numbers for the U.S. Senate. Of the 19 statements made by Democratic senators analyzed by PolitiFact in my database, only three were rated as something other than "true" or "mostly true". That doesn't sound right at all to me.



A large part of this has to do with PolitiFact Ohio's rating of things that Sherrod Brown says. Sherrod Brown is the second most frequently analyzed Democrat after Barack Obama, and the most frequently analyzed member of the U.S. Senate with six statements (2nd place: Rand Paul (4); 3rd place: tied, Saxby Chambliss (3) and Marco Rubio (3)). It's not like Sherrod Brown is one of the more important members of the U.S. Senate; he just happens to represent a state with a PolitiFact branch. Of those six statements, five were rated by PolitiFact Ohio (known to be friendly to Democrats), and all five were rated "true" or "mostly true". The sixth statement was rated by PolitiFact National as "barely true".

And how do the presidential candidates stack up to Barack Obama? Not great.



But on the whole, at least they're not as bad as Donald Trump (-0.60 on 10 statements).

Next: Whatever graphs I have left over!

Friday, May 06, 2011

PolitiFact Truth Index, Part 3: Breakdown by State. Selection Bias?



My attempt with the PolitiFact Truth Index is to sort out some objective measure of the truthiness of statements made by politicians, journalists, activists and advocacy groups. I have shown that the statements made by Republicans tend to be rated false by PolitiFact and the statements made by Democrats tend to be rated true by PolitiFact. I have jumped to the conclusion that this is because Republicans lie more than Democrats do. But the whole process of rating statements on a six-point truth scale necessarily involves a selection bias, a subjective process that picks which statements to select for analysis and which to ignore.

There is a universe of statements out there composed of verifiable facts that are significant to some swath of Americans, newsworthy in some way, and likely to be repeated by others. PolitiFact sees its Truth-o-Meter™ as a somewhat-objective measure of this universe of statements, like dunking a test tube into a lake and analyzing the contents under a microscope back at the lab. This would be highly useful to settle political debates with scientific evidence about the relative truthiness of each party’s statements. But fact-checking isn’t like sampling a lake: each individual would probably have a different definition about what constitutes a newsworthy or significant statement. Even if one assumes that PolitiFact can objectively pass judgment on the truthiness of any given statement, it’s hard to make the case that their statements are fully representative of all political discourse. The lake could be as big as an ocean, and your test tube may not be a representative sample of the water in it.

This is a point made over and over again on the many sites that criticize PolitiFact, sites written mostly from a conservative perspective. One of the most thorough smackdowns of PolitiFact since the beginning of this year is the blog PolitiFact Bias, whose writers have a beef not only with the selection bias, but also with the rating process, the final rating itself, and the selection of the “lie of the year”. In one post, blogger Bryan White tries to make an objective case for partisan bias (“bolstered by methods of science”) by compiling a list of errors in PolitiFact ratings going back to 2008 and pointing out that the overwhelming majority of them are unfavorable to Republicans or favorable to Democrats. But how does he determine which PolitiFact ratings are errors? He uses his own (conservative-leaning) judgment, fraught with even more partisan bias than PolitiFact’s.

The fact is that it’s pretty easy to claim that PolitiFact has a partisan bias based on anecdotal evidence, but it’s pretty much impossible to prove such a claim unless PolitiFact and a bunch of other fact checking websites analyze a given control set of statements that we the public could use to determine who is more biased, who is more nitpicky, etc. (actually, this would be a great idea).

But until that happens, one can only compare PolitiFact with itself, specifically with the affiliated state sites, which are run by different newspaper editorial boards and may exhibit a degree of independence from the national PolitiFact site. I was interested in knowing which newspapers’ editors exhibited the most political bias toward or against one party or the other. So I averaged the PolitiFact Truth Index across each state by political party and came up with the following results.



PolitiFact Texas and PolitiFact Ohio both exhibited an average Truth Index value for Democrats more than one standard deviation above the mean, while PolitiFact Wisconsin showed a negative value for its Democrats (the only state with more Democratic lies than truths since the end of February). It could mean that local and state Democratic politicians in Texas and Ohio really do tell the truth more often than their counterparts in other states, but it could also be a measure of partisan bias toward Democrats by the editorial boards of the Austin American-Statesman and the Cleveland Plain Dealer.



PolitiFact Georgia and PolitiFact Virginia both exhibited an average Truth Index value for Republicans more than one standard deviation above the mean (the only states where Republicans have made more true statements than false ones), while Oregon’s and Wisconsin’s averages fell more than one standard deviation below it. Again, this could mean that Georgia’s and Virginia’s Republicans are more honest than Republicans everywhere else in the nation. But it could also mean that the Atlanta Journal-Constitution and the Richmond Times-Dispatch employ fact-checkers with a Republican bias.

I thought it was interesting that both Wisconsin’s Democratic statements and Republican statements averaged more than one standard deviation below the statements made by fellow Democrats and Republicans in other states. I think this is at least partially due to the heated rhetoric of the labor fight in that state, and specifically to the local and judicial elections that followed. As I showed in Part 2, statements about elections (workings of government) and personal attacks tend to be rated false across the board, no matter which party makes the statement.
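For what it’s worth, the averaging and one-standard-deviation flagging can be sketched like this, with made-up scores standing in for the real database:

```python
from statistics import mean, stdev

def flag_outliers(by_state):
    """Return states whose average score is more than one standard
    deviation from the mean of the state averages."""
    mu, sigma = mean(by_state.values()), stdev(by_state.values())
    return {s: round(x, 2) for s, x in by_state.items() if abs(x - mu) > sigma}

# Hypothetical per-statement scores grouped by (state, party); the real
# numbers would come from the PolitiFact database described above.
scores = {
    ("Texas", "D"): [1, 1, 0, 1], ("Ohio", "D"): [1, 0, 1],
    ("Wisconsin", "D"): [-1, 0, -1],
    ("Georgia", "R"): [1, 0, 1], ("Oregon", "R"): [-1, -1, 0],
    ("Virginia", "R"): [1, 1, 0],
}
for party in ("D", "R"):
    by_state = {s: mean(v) for (s, p), v in scores.items() if p == party}
    print(party, flag_outliers(by_state))
```

With only a handful of states per party, one standard deviation is a loose threshold, so the flagged states are suggestive rather than conclusive.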

How does the national version of PolitiFact rank against the states in terms of the Truth Index?



For Democrats, PolitiFact National averages very close to the state average. For Republicans, PolitiFact National scores much lower than the state average. Since the National branch of PolitiFact tends to analyze statements with a national policy scope more than the state branches do, one can look at this fact a couple of different ways. One is that Republicans with a national scope lie more often than their state and local counterparts do. Or, PolitiFact National has an anti-Republican partisan bias. Or perhaps it could be due to some other factor. Maybe Republican pants-on-fire ratings drive more traffic to the website through links on Facebook and liberal blogs, and PolitiFact’s editors could be motivated to sample more of these statements than other types.

The debate about PolitiFact won’t subside, but I do think that the website is a useful mythbusting tool even if it does have a selection bias.

Next: Politicians versus political commentators

Tuesday, May 03, 2011

PolitiFact Truth Index, Part 2: Breakdown by Subject



Republicans lie. On average, more than Democrats. At least according to the statements analyzed by PolitiFact.com. But do they lie at a uniform rate across any given policy statement? Are there some subject matters where Democrats are more likely to stretch the truth than Republicans are? I had to find out.

I categorized each statement into a fairly broad policy subject based not only on the words of the statement, but also on the point that the statement is trying to make. If the politician is saying something about Medicare, for instance, the intent of the comment might be to urge hospitals to improve care (in which case the statement would be in the Health Care category), or the intent might be to reduce the cost to the government (in which case the statement would be placed in the Government Spending category).

My PolitiFact database breaks each statement down into one of twenty subjects. The distribution of these subjects is shown in the pie chart below.



Recently the biggest subject of discussion has been issues related to government spending as both sides have waged budget battles over the amount of federal money distributed to government programs. Similar subjects have also been popular such as debt/deficit, taxes and the economy. (If a statement refers to both taxes and spending, it is counted as a statement about the debt/deficit).

Labor issues are also popular subjects in this database due to the recent protests in Wisconsin (and also due to the fact that Wisconsin is one of the eight states where PolitiFact partners with a local newspaper for state and local fact-checking). Labor issues incorporate collective bargaining issues, statements made about the benefits of public employees, and statements made about unions.

Which types of statements are usually true across all parties? Government spending issues. Both Republicans and Democrats score highly in the PolitiFact Truth Index when it comes to issues relating to government spending, which is great because government spending issues are the largest category of statements. Politicians can be generally trusted when it comes to statements made about government spending.

Second Amendment issues also score highly across both parties, but only nine statements (five by Republicans, three by Democrats) have been recorded in my database. The sample size isn't large, so the numbers are subject to wild swings. Public safety issues (crime statistics mostly) also score highly, although there has only been one identifiably Republican statement made in regard to this subject. Most public safety statements are made by local police officers or other non-partisan figures.

Which types of statements are usually lies? Personal attacks or boasts are more often false than true no matter which party makes the claim. Personal attacks are most often made in the heat of campaigns and encompass claims about an opponent's background or some statement he or she made in the past. It is the category of all the birther claims, but also of the accusations that certain politicians said they wanted to get rid of Medicare or other government programs.

Statements about the workings of government are also usually lies across all parties. This category includes statements about elections, including polls, demography, gerrymandering, vote counting and allegations of fraud; and it also includes statements about congressional rules and bureaucracy. Don't trust a politician when he/she says something like a recent poll shows him/her statistically tied with Obama for the 2012 presidency.

Which types of statements show Republicans lying as Democrats tell the truth? Labor issues, taxes, Obamacare, social issues, and environmental issues. Democrats can point to these subjects as areas where Democrats are on the right side of the truth and Republicans are on the wrong side, usually.

Are there any subjects where Democrats lie as Republicans tell the truth? So far just one: transportation. And this cannot be relied on as there have only been two Republican and four Democratic statements made.

Here's the chart:



Here's the main point: Republicans do score better than Democrats in some sizeable subjects, namely government spending and education. But Democrats do better in almost everything else: the economy, labor issues, the workings of government, debt/deficit, taxes, personal attacks, Obamacare, social issues, energy issues, environmental issues, and health care issues. The difference is vast when it comes to social issues (abortion, gay rights), environmental issues, Obamacare and taxes. When it comes to these policy issues, not only can one not trust Republicans, one can usually trust Democrats.

Next: PolitiFact's state newspaper affiliates: are there biases?