CREAT: Census Research Exploration and Analysis Tool

Papers Containing Keyword(s): 'survey income'

The following papers contain search terms that you selected. From the papers listed below, you can navigate to the PDF, the profile page for that working paper, or see all the working papers written by an author. You can also explore tags, keywords, and authors that occur frequently within these papers.

Viewing papers 11 through 18 of 18


  • Working Paper

    A METHOD OF CORRECTING FOR MISREPORTING APPLIED TO THE FOOD STAMP PROGRAM

    May 2013

    Authors: Nikolas Mittag

    Working Paper Number:

    CES-13-28

    Survey misreporting is known to be pervasive and to bias common statistical analyses. In this paper, I first use administrative data on SNAP receipt and amounts linked to American Community Survey data from New York State to show that survey data can misrepresent the program in important ways. For example, more than 1.4 billion dollars received are not reported in New York State alone. 46 percent of dollars received by households with annual income above the poverty line are not reported in the survey data, while only 19 percent are missing below the poverty line. Standard corrections for measurement error cannot remove these biases. I then develop a method to obtain consistent estimates by combining parameter estimates from the linked data with publicly available data. This conditional density method recovers the correct estimates using public use data only, which solves the problem that access to linked administrative data is usually restricted. I examine the degree to which this approach can be used to extrapolate across time and geography, in order to solve the problem that validation data are often based on a convenience sample. I present evidence from within New York State that the extent of heterogeneity is small enough to make extrapolation work well across both time and geography. Extrapolation to the entire U.S. yields substantive differences from survey data and reduces deviations from official aggregates by a factor of 4 to 9 compared to survey aggregates.
    View Full Paper PDF
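    The two-step logic of the conditional density method described above can be sketched on simulated data: estimate P(true receipt | reported receipt, covariates) in a linked validation sample, then average those conditional probabilities over public-use records. Everything below is a stylized illustration, not the paper's estimator: the data, reporting rates, and single poverty-status covariate are invented, and the actual method conditions on a much richer covariate set.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Simulated "linked" validation sample: admin truth plus survey report ---
# (hypothetical numbers; the paper links SNAP records to ACS data)
n = 20_000
poor = rng.binomial(1, 0.3, n)                        # covariate: below poverty line
true_receipt = rng.binomial(1, np.where(poor == 1, 0.5, 0.1))
# under-reporting is worse above the poverty line, as in the abstract
report = true_receipt * rng.binomial(1, np.where(poor == 1, 0.81, 0.54))

# Step 1: estimate P(true receipt | report, covariate) from the linked data
table = {}
for r in (0, 1):
    for x in (0, 1):
        cell = (report == r) & (poor == x)
        table[(r, x)] = true_receipt[cell].mean()

# Step 2: apply the estimated conditional probabilities to a public-use
# sample in which only (report, covariate) are observed
m = 50_000
poor_pub = rng.binomial(1, 0.3, m)
true_pub = rng.binomial(1, np.where(poor_pub == 1, 0.5, 0.1))
report_pub = true_pub * rng.binomial(1, np.where(poor_pub == 1, 0.81, 0.54))

corrected = np.mean([table[(r, x)] for r, x in zip(report_pub, poor_pub)])
print(f"naive: {report_pub.mean():.3f}  corrected: {corrected:.3f}  "
      f"truth: {true_pub.mean():.3f}")
```

    The corrected estimate recovers the true participation rate from public-use-style data alone, while the naive survey rate is badly understated.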
  • Working Paper

    BIAS IN FOOD STAMPS PARTICIPATION ESTIMATES IN THE PRESENCE OF MISREPORTING ERROR

    March 2013

    Authors: Cathleen Li

    Working Paper Number:

    CES-13-13

    This paper focuses on how survey misreporting of food stamp receipt can bias demographic estimation of program participation. The Food Stamp Program is a federally funded program that subsidizes the nutrition of low-income households. In order to improve the reach of this program, studies of how program participation varies across demographic groups have been conducted using census data. Census data are subject to substantial misreporting error, both underreporting and overreporting, which can bias these estimates. The impact of misreporting error on estimate bias is examined by calculating food stamp participation rates, misreporting rates, and bias for select household characteristics (covariates).
    View Full Paper PDF
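    The rate and bias calculations the abstract describes reduce to simple arithmetic on a matched admin-by-survey cross-tabulation. The counts below are hypothetical, chosen only to show the mechanics:

```python
# Hypothetical matched counts (administrative truth x survey report)
true_yes_reported = 700      # true recipients who report receipt
true_yes_missed   = 300      # underreports (false negatives)
true_no_reported  = 50       # overreports (false positives)
true_no_silent    = 3_950    # true non-recipients who report nothing
n = true_yes_reported + true_yes_missed + true_no_reported + true_no_silent

true_rate     = (true_yes_reported + true_yes_missed) / n       # 0.20
reported_rate = (true_yes_reported + true_no_reported) / n      # 0.15
underreport   = true_yes_missed / (true_yes_reported + true_yes_missed)  # 0.30
overreport    = true_no_reported / (true_no_reported + true_no_silent)   # 0.0125
bias          = reported_rate - true_rate                        # -0.05

print(true_rate, reported_rate, underreport, overreport, bias)
```

    With these invented counts, underreporting dominates overreporting, so the survey-based participation rate is biased downward by 5 percentage points.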
  • Working Paper

    Occupation Inflation in the Current Population Survey

    September 2012

    Working Paper Number:

    CES-12-26

    A common caveat accompanying results based on household surveys concerns respondent error. A body of research uses independent, presumably error-free administrative data to estimate the extent of error in survey data, its correlates, and potential corrections. We investigate measurement error in occupation in the Current Population Survey (CPS), using the panel component of the CPS to identify those who incorrectly report changing occupation. We find evidence that individuals inflate their occupation to higher-skilled and higher-paying occupations than the ones they actually perform. Occupation inflation biases the education and race coefficients in standard Mincer equation results estimated within occupations.
    View Full Paper PDF
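    The direction of this bias can be reproduced in a small simulation: if some workers report a higher-skill occupation than they actually hold, a within-occupation wage regression attributes part of the occupation premium to education. All parameter values below are invented for illustration and are not estimates from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

educ = rng.integers(10, 17, n).astype(float)              # years of schooling
occ_true = (educ + rng.normal(0, 2, n) > 14).astype(int)  # 1 = high-skill job
# wages follow a stylized Mincer equation with a true occupation premium
log_wage = 1.0 + 0.08 * educ + 0.30 * occ_true + rng.normal(0, 0.2, n)

# occupation inflation: 25% of low-skill workers report a high-skill occupation
occ_reported = occ_true.copy()
inflate = (occ_true == 0) & (rng.random(n) < 0.25)
occ_reported[inflate] = 1

def educ_slope(mask):
    """OLS slope of log wage on education within an occupation cell."""
    return np.polyfit(educ[mask], log_wage[mask], 1)[0]

clean  = educ_slope(occ_true == 1)       # close to the true 0.08
biased = educ_slope(occ_reported == 1)   # inflated: education proxies occupation
print(f"within true occupation: {clean:.3f}, within reported: {biased:.3f}")
```

    Because the misclassified workers have both lower education and lower wages than genuine high-skill workers, the education coefficient in the reported-occupation cell absorbs part of the occupation premium.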
  • Working Paper

    Estimating Measurement Error in SIPP Annual Job Earnings: A Comparison of Census Bureau Survey and SSA Administrative Data

    July 2011

    Working Paper Number:

    CES-11-20

    We quantify sources of variation in annual job earnings data collected by the Survey of Income and Program Participation (SIPP) to determine how much of the variation is the result of measurement error. Jobs reported in the SIPP are linked to jobs reported in an administrative database, the Detailed Earnings Records (DER) drawn from the Social Security Administration's Master Earnings File, a universe file of all earnings reported on W-2 tax forms. As a result of the match, each job potentially has two earnings observations per year: survey and administrative. Unlike previous validation studies, both of these earnings measures are viewed as noisy measures of some underlying true amount of annual earnings. While the existence of survey error resulting from respondent mistakes or misinterpretation is widely accepted, the idea that administrative data are also error-prone is new. Possible sources of employer reporting error, employee under-reporting of compensation such as tips, and general differences between how earnings may be reported on tax forms and in surveys necessitate discarding the assumption that administrative data are a true measure of the quantity that the survey was designed to collect. In addition, errors in matching SIPP and DER jobs, a necessary task in any use of administrative data, also contribute to measurement error in both earnings variables. We begin by comparing SIPP and DER earnings for different demographic and education groups of SIPP respondents. We also calculate different measures of changes in earnings for individuals switching jobs. We estimate a standard earnings equation model using SIPP and DER earnings and compare the resulting coefficients.
Finally, exploiting the presence of individuals with multiple jobs and shared employers over time, we estimate an econometric model that includes random person and firm effects, a common error component shared by SIPP and DER earnings, and two independent error components that represent the variation unique to each earnings measure. We compare the variance components from this model and consider how the DER and SIPP differ across unobservable components.
    View Full Paper PDF
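    In a stripped-down version of this design, two error-ridden measures of the same true earnings identify the shared and unique variance components from second moments alone: the covariance of the two measures recovers the common component, and each measure's excess variance is its unique error. The sketch below drops the person and firm effects from the paper's model, assumes independent errors, and uses invented parameter values:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000

true_earn = rng.normal(10.5, 0.7, n)        # latent true log annual earnings
sipp = true_earn + rng.normal(0, 0.30, n)   # survey measure + unique error
der  = true_earn + rng.normal(0, 0.15, n)   # admin measure + unique error

shared_var = np.cov(sipp, der)[0, 1]        # -> Var(true) if errors independent
sipp_err_var = sipp.var(ddof=1) - shared_var
der_err_var  = der.var(ddof=1) - shared_var

print(f"shared: {shared_var:.3f}  sipp error: {sipp_err_var:.3f}  "
      f"der error: {der_err_var:.3f}")
```

    The moment estimates recover the simulated components (0.49, 0.09, and 0.0225 respectively), illustrating why having two linked measures lets neither be assumed error-free.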
  • Working Paper

    Errors in Survey Reporting and Imputation and Their Effects on Estimates of Food Stamp Program Participation

    April 2011

    Working Paper Number:

    CES-11-14

    Benefit receipt in major household surveys is often underreported. This misreporting leads to biased estimates of the economic circumstances of disadvantaged populations, program takeup, the distributional effects of government programs, and other program effects. We use administrative data on Food Stamp Program (FSP) participation matched to American Community Survey (ACS) and Current Population Survey (CPS) household data. We show that nearly thirty-five percent of true recipient households do not report receipt in the ACS and fifty percent do not report receipt in the CPS. Misreporting, both false negatives and false positives, varies with individual characteristics, leading to complicated biases in FSP analyses. We then directly examine the determinants of program receipt using our combined administrative and survey data. The combined data allow us to examine accurate participation using individual characteristics missing in administrative data. Our results differ from conventional estimates using only survey data, as such estimates understate participation by single parents, non-whites, low income households, and other groups. To evaluate the use of Census Bureau imputed ACS and CPS data, we also examine whether our estimates using survey data alone are closer to those using the accurate combined data when imputed survey observations are excluded. Interestingly, excluding the imputed observations leads to worse ACS estimates, but has less effect on the CPS estimates.
    View Full Paper PDF
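    The abstract's central point, that misreporting varies with household characteristics and therefore distorts group comparisons, can be illustrated with two hypothetical groups whose reporting rates differ. All rates below are invented, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

single_parent = rng.binomial(1, 0.2, n)
# true participation (hypothetical): 40% for single parents, 10% otherwise
true_part = rng.binomial(1, np.where(single_parent == 1, 0.40, 0.10))
# single parents underreport more, so more of their receipt goes missing
report = true_part * rng.binomial(1, np.where(single_parent == 1, 0.55, 0.75))

for g, label in ((1, "single parents"), (0, "other households")):
    grp = single_parent == g
    print(f"{label}: survey {report[grp].mean():.3f} "
          f"vs admin {true_part[grp].mean():.3f}")
```

    Because the group with higher true participation also underreports more, the survey-only comparison shrinks the gap between groups, in line with the abstract's finding that survey estimates understate participation by single parents and other groups.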
  • Working Paper

    Lessons for Targeted Program Evaluation: A Personal and Professional History of the Survey of Program Dynamics

    August 2007

    Authors: Daniel Weinberg

    Working Paper Number:

    CES-07-24

    The Survey of Program Dynamics (SPD) was created by the 1996 welfare reform legislation to facilitate its evaluation. This paper describes the evolution of that survey, discusses its implementation, and draws lessons for future evaluation. Large-scale surveys can be an important part of a portfolio of evaluation methods, but sufficient time must be given to data collection agencies if a high-quality longitudinal survey is expected. Such a survey must have both internal (agency) and external (policy analyst) buy-in. Greater investment in data analysis by agency staff, which was downplayed in favor of larger sample sizes under a fixed budget, might have contributed to broader external acceptance. More attention up-front to reducing the potentially deleterious effects of attrition in longitudinal surveys, such as through the use of monetary incentives, might have been worthwhile. Given the problems encountered by the Census Bureau in producing the SPD, I argue that ongoing multi-purpose longitudinal surveys like the Survey of Income and Program Participation are potentially more valuable than episodic special-purpose surveys.
    View Full Paper PDF
  • Working Paper

    Using Administrative Earnings Records to Assess Wage Data Quality in the March Current Population Survey and the Survey of Income and Program Participation

    November 2002

    Authors: Marc Roemer

    Working Paper Number:

    tp-2002-22

    The March Current Population Survey (CPS) and the Survey of Income and Program Participation (SIPP) produce different aggregates and distributions of annual wages. An excess of high wages and a shortage of low wages occurs in the March CPS. SIPP shows the opposite: an excess of low wages and a shortage of high wages. Exactly-matched Detailed Earnings Records (DER) from the Social Security Administration allow March CPS and SIPP respondents' wages to be compared using data independent of the surveys. Findings include the following. March CPS and SIPP respondents differ little in their true wage characteristics. March CPS and SIPP represent a worker's percentile rank better than the dollar amount of wages. Workers with one job and low work effort have underestimated March CPS wages. March CPS has a higher level of "underground" wages than SIPP, and increasingly so in the 1990s. March CPS has a higher level of self-employment income "misclassified" as wages than SIPP, and increasingly so in the 1990s. These trends may explain one-third of March CPS's 6-percentage-point increase in aggregate wages relative to independent estimates from 1993 to 1995. Finally, the paper delineates March CPS occupations disproportionately likely to be absent from the administrative data entirely or to "misclassify" self-employment income as wages.
    View Full Paper PDF
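    The claim that the surveys "represent a worker's percentile rank better than the dollar amount of wages" can be checked on simulated data by comparing a rank correlation with a dollar-level correlation between survey and administrative wages. The wage process and error size below are invented; with multiplicative reporting error and a right-skewed wage distribution, ranks survive the noise better than dollar amounts:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

log_true = rng.normal(10.0, 0.7, n)
admin_wage  = np.exp(log_true)                          # DER-style measure
survey_wage = np.exp(log_true + rng.normal(0, 0.5, n))  # noisy survey report

def ranks(x):
    """Position of each observation in the sorted order (percentile rank * n)."""
    r = np.empty(len(x))
    r[np.argsort(x)] = np.arange(len(x))
    return r

dollar_corr = np.corrcoef(survey_wage, admin_wage)[0, 1]
rank_corr = np.corrcoef(ranks(survey_wage), ranks(admin_wage))[0, 1]
print(f"dollar correlation: {dollar_corr:.3f}, rank correlation: {rank_corr:.3f}")
```

    The dollar-level correlation is dragged down by the heavy right tail, while the rank correlation is not, which is one mechanism consistent with the finding.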
  • Working Paper

    Estimating Measurement Error in SIPP Annual Job Earnings: A Comparison of Census Survey and SSA Administrative Data

    September 2002

    Authors: Martha Stinson

    Working Paper Number:

    tp-2002-24

    The third chapter investigates measurement error in SIPP annual job earnings data linked to SSA administrative earnings data. The multiple earnings measures provided by the survey and administrative data enable the identification of components of true variation and of variation due to measurement error. We find that 18% of the variation in SIPP annual job earnings can be attributed to measurement error. We also find that in both the SIPP and the DER, measurement error is persistent over time. A lower level of autocorrelation in the SIPP measurement error than in the economic error component leads to a lower reliability ratio of 0.62 for first-differenced earnings.
    View Full Paper PDF
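    The link between persistent measurement error and a lower first-difference reliability ratio follows directly from the variance arithmetic. The sketch below takes the 18% error share from the abstract but assumes the two autocorrelations, which the abstract does not report, purely for illustration:

```python
# Reliability ratio of earnings in levels vs first differences.
# sigma2_x, sigma2_u: variance shares of true earnings and measurement error
# (normalized to sum to 1; 18% measurement error as in the abstract)
sigma2_u = 0.18
sigma2_x = 1.0 - sigma2_u
# year-to-year autocorrelations: illustrative assumptions, not paper estimates
rho_x, rho_u = 0.80, 0.44

reliability_levels = sigma2_x / (sigma2_x + sigma2_u)

# Differencing scales each component's variance by 2*(1 - its autocorrelation),
# so the less persistent component shrinks less and gains relative weight.
var_dx = 2 * sigma2_x * (1 - rho_x)
var_du = 2 * sigma2_u * (1 - rho_u)
reliability_diff = var_dx / (var_dx + var_du)

print(f"levels: {reliability_levels:.2f}, first differences: {reliability_diff:.2f}")
```

    Because the economic component is more autocorrelated than the measurement error, differencing removes proportionally more signal than noise, pushing reliability from 0.82 in levels down toward the 0.62 the abstract reports.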