CPS Vote Over-Report and Non-Response Bias Correction

Administrative records of who voted contain little demographic information. For a demographic profile of the electorate, we must turn to surveys.1 This page describes the methodology used here to reweight the Census Bureau's Current Population Survey, November Voting and Registration Supplement (or CPS for short) to account for non-response and vote over-report bias.

Files

The following Stata (a statistical program) command files are provided to assist those who may wish to make their own weighting corrections to the CPS data, as recommended by Hur and Achen in a 2013 Public Opinion Quarterly article.

What is the CPS?

Among the most widely cited surveys is the CPS, a large survey used primarily to calculate the nation's unemployment rate. In November of an election year, the Census Bureau asks a limited number of voting and registration questions in a supplemental questionnaire. When cross-tabulated with the survey's extensive demographics, the CPS provides a comprehensive snapshot of electoral participation among various demographic groups, nationally and within states.

The CPS and Vote Over-Report Bias

Another attractive feature of the CPS is its low vote over-report bias. Pollsters have long noted that poll respondents overstate their voting participation. A reason -- although not the only one -- is that people like to think of themselves in a favorable light within social norms, in this case presenting themselves to interviewers as voters even if they did not vote. While some election surveys' vote over-report bias can be ten percentage points or higher, the CPS's is quite low. For example, the Census Bureau reported a 2012 CPS turnout rate of 61.8%, whereas the 2012 VEP turnout rate was 58.6%, a difference of 3.2 percentage points.2 In comparison, the venerable American National Election Study reported a 2012 turnout rate of 78%.
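Stated as a formula (the notation here is ours, not the Census Bureau's), the over-report bias used throughout this page is simply a survey's estimated turnout rate minus the VEP turnout rate:

    \text{over-report bias} = \hat{p}_{\text{survey}} - p_{\text{VEP}},
    \quad \text{e.g., for the 2012 CPS: } 61.8\% - 58.6\% = 3.2\ \text{percentage points}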

The CPS and Non-Response

CPS respondents complete the lengthy employment questionnaire, but they are less willing to answer the voting and registration questions. The entire trend over the history of the CPS is presented in Figure 1. For example, in 2012, 12.8% of CPS respondents either were never asked the voting and registration supplement questions (No Response), refused to answer the questions (Refused), or did not know an answer to the voting question (Don't Know).
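As a concrete illustration, this missing-data share can be tabulated directly from a CPS November extract. The sketch below assumes the public-use variable names pes1 (the vote question, where 1 = voted and 2 = did not vote) and pwsswgt (the supplement person weight) and a hypothetical file named cps_nov2012.dta; the negative pes1 codes marking Don't Know, Refused, and No Response vary by extract, so check the codebook.

    * Sketch: share of the 2012 voting supplement that is missing.
    * Valid answers are pes1 = 1 (voted) or 2 (did not vote); anything else
    * (Don't Know, Refused, No Response/Not in Universe) is treated as missing.
    use cps_nov2012.dta, clear

    gen byte vote_missing = !inlist(pes1, 1, 2)

    tabulate vote_missing                         // unweighted share
    tabulate vote_missing [aweight = pwsswgt]     // weighted share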

Pollsters typically treat these data as missing and remove them from their calculations, but the Census Bureau does not. The Census Bureau counts all respondents with missing data as people who did not vote. To underscore how poor a practice this is: the Census Bureau counts persons who were never even asked the voting question as having not voted.

The low vote over-report bias of the CPS, one of its primary virtues, is thus largely an artifact of the Census Bureau's non-standard practice of counting respondents with missing data as not voting. When one follows the standard practice of excluding the missing data, the CPS 2012 turnout rate is 70.7%, a difference of 12.1 percentage points from the 2012 VEP turnout rate.
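The gap between the two conventions is straightforward to reproduce from the microdata. A minimal sketch, under the same variable-name assumptions as above plus prtage (age) and prcitshp (citizenship status), restricted to the citizen voting-age population:

    * Sketch: 2012 CPS turnout under the Census Bureau's convention versus
    * the standard survey practice of excluding missing responses.
    use cps_nov2012.dta, clear
    keep if prtage >= 18 & inrange(prcitshp, 1, 4)   // citizen voting-age population

    gen byte voted = (pes1 == 1)

    * Census Bureau convention: missing responses counted as non-voters
    summarize voted [aweight = pwsswgt]
    display "Census-style turnout rate (%):      " %4.1f 100 * r(mean)

    * Standard practice: missing responses excluded
    summarize voted [aweight = pwsswgt] if inlist(pes1, 1, 2)
    display "Standard-practice turnout rate (%): " %4.1f 100 * r(mean)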

The CPS and Vote Over-Report Bias Among States

The CPS's over-report bias is not constant across states. Some states have higher levels than others, as measured by subtracting the 2012 VEP Turnout Rates from the CPS's Citizen Voting-Age Population turnout rates, as shown in the figure below. These errors are systematic over several years, in that the same states tend to have the highest and lowest levels of vote over-report bias.
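The state-level comparison in the figure can be built the same way: compute the weighted CPS rate by state and subtract the VEP rate. Another sketch with assumed names, where gestfips is the state FIPS code and vep_2012.dta is a hypothetical file of state VEP turnout rates (stored as proportions in vep_turnout) prepared from the rates on this site:

    * Sketch: state-level over-report bias = CPS CVAP turnout rate - VEP rate.
    use cps_nov2012.dta, clear
    keep if prtage >= 18 & inrange(prcitshp, 1, 4)
    keep if inlist(pes1, 1, 2)   // drop missing responses; omit this line to mimic the Census convention
    gen byte voted = (pes1 == 1)

    collapse (mean) cps_rate=voted [aweight = pwsswgt], by(gestfips)
    merge 1:1 gestfips using vep_2012.dta, keep(match) nogenerate

    gen double overreport_bias = cps_rate - vep_turnout
    list gestfips cps_rate vep_turnout overreport_bias, clean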

These errors have important consequences; consider two examples. First, the Census Bureau reports that Mississippi had the highest 2012 turnout rate, at 74.5%. This is implausible on its face and is not supported by the actual election results. Second, when Chief Justice Roberts wrote in Shelby County v. Holder, the decision overturning key components of the Voting Rights Act, that Mississippi African-Americans have the same registration and participation levels as those in Massachusetts, that comparison was similarly an artifact of systematic self-reporting errors in the CPS.

What to Do?

In a 2013 article in Public Opinion Quarterly, the top survey research journal, Aram Hur and Christopher Achen suggest excluding the non-responses to the CPS and reweighting the resulting CPS state-level voter turnout rates to the VEP turnout rates disseminated on this website. They did not provide their weights. To assist the survey research community in making this correction, the associated do-it-yourself Stata files are provided at the top of this page. Selected statistics using these corrected weights are reported here.
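The posted files implement this correction; its core logic can be sketched as follows. After the missing responses are dropped, each state's weights for self-reported voters are scaled by the ratio of the VEP turnout rate to the state's weighted CPS rate, and the weights for self-reported non-voters by the corresponding ratio of the non-voting shares, so that the reweighted turnout matches the VEP rate. The variable names below are the same assumptions used in the earlier sketches and are not necessarily those in the posted files.

    * Sketch of the Hur-Achen (2013) correction: drop supplement non-response,
    * then ratio-adjust weights so each state's weighted turnout equals its VEP rate.
    use cps_nov2012.dta, clear
    keep if prtage >= 18 & inrange(prcitshp, 1, 4)   // citizen voting-age population
    keep if inlist(pes1, 1, 2)                       // drop Don't Know / Refused / No Response
    gen byte voted = (pes1 == 1)

    * Weighted CPS turnout rate by state
    bysort gestfips: egen double num = total(cond(voted, pwsswgt, 0))
    bysort gestfips: egen double den = total(pwsswgt)
    gen double cps_rate = num / den

    * State VEP turnout rates (hypothetical file built from the rates on this site)
    merge m:1 gestfips using vep_2012.dta, keep(match) nogenerate

    * Ratio adjustment of the supplement weight
    gen double hur_achen_wgt = pwsswgt * cond(voted, vep_turnout / cps_rate, ///
        (1 - vep_turnout) / (1 - cps_rate))

    * Check: the reweighted turnout now equals the VEP rate in every state
    tabstat voted [aweight = hur_achen_wgt], by(gestfips) statistics(mean)

Subsequent tabulations would then use hur_achen_wgt in place of the original supplement weight.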

While these corrections are most likely superior to uncorrected CPS weights, research on the sources of vote over-report bias in the CPS is ongoing and may yield further recommended improvements.

Footnotes

1 Some states' voter registration files contain limited demographic information -- age, gender and race -- that enables the calculation of citizen voting-age population turnout rates by these demographic categories.

2 There are some differences between the CPS and the VEP that should not greatly affect this statistic. The CPS sample frame is the non-institutional resident citizen voting-age population of the United States, whereas the VEP is the total citizen voting-age population adjusted for ineligible felons and overseas citizens.