
Mainstream Media Has It Wrong: Voter Registration Rates For People of Color Are Not Falling

The Washington Post and others are using bad statistics. If anything, Hispanics are slightly up from 2006 and African-Americans are significantly up.
 
 
The Washington Post reports that voter registration is down among Blacks and Hispanics, a decline that could pose a "serious challenge" to the Obama campaign.

Unfortunately, it is The Washington Post's statistics that are seriously challenged.

The source of this information is the Census Bureau's Current Population Survey, also called the CPS. The CPS is a very reliable survey. The federal government uses the CPS to calculate the unemployment rate -- among many other important uses -- and expends considerable resources to ensure that it is accurate. The CPS has a huge sample size, an impressive response rate, and is meticulously scrutinized by the world's best survey researchers.

It might appear that, in disputing The Washington Post's reporting, I am leveling a very serious allegation: that the CPS is flawed.

However, I do not have a problem with the CPS; I have a problem with how the Census Bureau reports voting and registration rates from the survey.

The CPS registration and voting statistics are reported from a limited number of questions asked on the CPS questionnaire in November of an election year. These questions are very useful to those who are interested in elections because the CPS's large sample size allows fine-grained analyses of sub-populations, such as minorities or the disabled, that are simply not possible with typical smaller-sample election surveys.

To understand my contention that The Washington Post's analysis is flawed, I must explain how the voting and registration questions are asked.

The CPS asks a single person in each household to report whether each citizen age 18 and older living there voted:

"In any election, some people are not able to vote because they are sick or busy or have some other reason, and others do not want to vote. Did (you/name) vote in the election held on ____?"

The permitted responses are a simple "Yes" and "No." However, the CPS reports additional response categories for those who don't know, refuse to answer the question, or do not provide a response. The Census Bureau treats these three additional response categories as a "No." This is problematic for a few reasons.

  • Perhaps someone does not wish to reveal whether they voted, even if they did, and simply refuses to answer the question.
  • Perhaps someone reporting for another household member truly does not know whether that person voted.
  • And finally, "no response" literally means that the voting and registration supplemental questionnaire was not administered to a household member.

 

It is thus more appropriate to treat the three additional response categories (don't know, refused, and no response) as missing responses, since we do not really know whether these respondents voted.

I present three turnout rates in Table 1. The actual voting-eligible turnout rate, which I calculate from official administrative records, is widely considered by academics, the media, and policy makers to be the most accurate turnout rate. The second is the official CPS turnout rate, as reported by the Census Bureau, which counts the missing responses as "No." The third, which I calculate from the Census Bureau's data, excludes the missing responses from the calculation.
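To make the difference concrete, here is a minimal sketch in Python using invented response counts (the numbers are hypothetical, not taken from the CPS). It contrasts the Census Bureau's treatment, which folds don't know, refused, and no response into "No," with the alternative that drops those categories as missing.

```python
# Hypothetical CPS-style response counts, for illustration only.
responses = {
    "yes": 620,         # reported voting
    "no": 280,          # reported not voting
    "dont_know": 30,    # proxy respondent did not know
    "refused": 20,      # declined to answer the question
    "no_response": 50,  # supplement was never administered
}

missing = responses["dont_know"] + responses["refused"] + responses["no_response"]
total = sum(responses.values())

# Census Bureau treatment: the three missing categories are counted as "No".
official_rate = responses["yes"] / total

# Alternative treatment: the missing categories are dropped from the denominator.
adjusted_rate = responses["yes"] / (total - missing)

print(f"Turnout, missing counted as 'No': {official_rate:.1%}")  # 62.0%
print(f"Turnout, missing excluded:        {adjusted_rate:.1%}")  # 68.9%
```

Because folding the missing categories into "No" enlarges the denominator without adding any "Yes" responses, it can only push the reported turnout rate down relative to a rate computed on actual answers.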

[Table 1: Actual voting-eligible turnout rate, official CPS turnout rate, and CPS turnout rate excluding missing responses]

Table 1 reveals why many scholars and others have trusted the CPS as the best source for turnout rates. All surveys suffer from what is known as "over-report bias," the difference between the turnout rate reported in the survey and the actual turnout rate. The official CPS turnout rate has an exceptionally small over-report bias when the missing responses are counted as "No."
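Continuing with invented figures (not the Table 1 values), over-report bias is simply the survey-based rate minus the administrative rate:

```python
# Hypothetical rates, for illustration only.
actual_turnout = 0.58   # from administrative records
survey_turnout = 0.62   # as reported by a survey

over_report_bias = survey_turnout - actual_turnout
print(f"Over-report bias: {over_report_bias:.1%}")  # 4.0%
```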