CREAT: Census Research Exploration and Analysis Tool

Papers Containing Keyword(s): 'population survey'

The following papers contain the search term(s) you selected. From the papers listed below, you can navigate to the PDF or the profile page for each working paper, or see all working papers by an author. You can also explore tags, keywords, and authors that occur frequently within these papers.

Viewing papers 1 through 10 of 10


  • Working Paper

    The Design of Sampling Strata for the National Household Food Acquisition and Purchase Survey

    February 2025

    Working Paper Number:

    CES-25-13

    The National Household Food Acquisition and Purchase Survey (FoodAPS), sponsored by the United States Department of Agriculture's (USDA) Economic Research Service (ERS) and Food and Nutrition Service (FNS), examines the food purchasing behavior of various subgroups of the U.S. population. These subgroups include participants in the Supplemental Nutrition Assistance Program (SNAP) and the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC), as well as households that are eligible for but do not participate in these programs. Participants in these social protection programs constitute small proportions of the U.S. population; obtaining an adequate number of such participants in a survey would be challenging absent stratified sampling to target SNAP- and WIC-participating households. This document describes how the U.S. Census Bureau (which is planning to conduct future versions of the FoodAPS survey on behalf of USDA) created sampling strata to flag the FoodAPS targeted subpopulations using machine learning applications in linked survey and administrative data. We describe the data, the modeling techniques, and how well the sampling flags target low-income households and households receiving WIC and SNAP benefits. We additionally situate these efforts in the nascent literature on the use of big data and machine learning for the improvement of survey efficiency.
    View Full Paper PDF
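    The stratum-flagging idea can be illustrated as turning model-predicted participation probabilities into sampling strata. The probability cutoffs, income threshold, and field names below are illustrative assumptions, not the paper's actual model:

```python
# Hypothetical sketch: assign a household to a sampling stratum using
# model-predicted probabilities of SNAP/WIC participation and household
# income relative to the poverty line. All cutoffs are illustrative.

def assign_stratum(p_snap, p_wic, income_to_poverty):
    """Return a stratum label for one household."""
    if p_snap >= 0.5:
        return "likely_snap"
    if p_wic >= 0.5:
        return "likely_wic"
    if income_to_poverty <= 1.85:  # assumed eligibility-style cutoff
        return "low_income_nonparticipant"
    return "general_population"

# Targeted strata would then be sampled at higher rates than the
# general-population stratum.
print(assign_stratum(0.7, 0.1, 0.9))  # likely_snap
```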
  • Working Paper

    Nonresponse and Coverage Bias in the Household Pulse Survey: Evidence from Administrative Data

    October 2024

    Working Paper Number:

    CES-24-60

    The Household Pulse Survey (HPS) conducted by the U.S. Census Bureau is a unique survey that provided timely data on the effects of the COVID-19 pandemic on American households and continues to provide data on other emergent social and economic issues. Because the survey has a response rate in the single digits and only has an online response mode, there are concerns about nonresponse and coverage bias. In this paper, we match administrative data from government agencies and third-party data to HPS respondents to examine how representative they are of the U.S. population. For comparison, we create a benchmark of American Community Survey (ACS) respondents and nonrespondents and include the ACS respondents as another point of reference. Overall, we find that the HPS is less representative of the U.S. population than the ACS. However, performance varies across administrative variables, and the existing weighting adjustments appear to greatly improve the representativeness of the HPS. Additionally, we examine household characteristics by email domain to assess the coverage effects of a 2023 change that limited email messages to contact-frame addresses with at least 90% deliverability rates; we find no clear change in the representativeness of the HPS afterwards.
    View Full Paper PDF
  • Working Paper

    Incorporating Administrative Data in Survey Weights for the Basic Monthly Current Population Survey

    January 2024

    Working Paper Number:

    CES-24-02

    Response rates to the Current Population Survey (CPS) have declined over time, raising the potential for nonresponse bias in key population statistics. A potential solution is to leverage administrative data from government agencies and third-party data providers when constructing survey weights. In this paper, we take two approaches. First, we use administrative data to build a non-parametric nonresponse adjustment step while leaving the calibration to population estimates unchanged. Second, we use administratively linked data in the calibration process, matching income data from the Internal Revenue Service and state agencies, demographic data from the Social Security Administration and the decennial census, and industry data from the Census Bureau's Business Register to both responding and nonresponding households. We use the matched data in the household nonresponse adjustment of the CPS weighting algorithm, which changes the weights of respondents to account for differential nonresponse rates among subpopulations. After running the experimental weighting algorithm, we compare estimates of the unemployment rate and labor force participation rate between the experimental weights and the production weights. Before March 2020, estimates of the labor force participation rate using the experimental weights are 0.2 percentage points higher than the original estimates, with minimal effect on the unemployment rate. After March 2020, the new labor force participation rates are similar, but the unemployment rate is about 0.2 percentage points higher in some months during the height of COVID-related interviewing restrictions. These results suggest that if there is any nonresponse bias present in the CPS, its magnitude is comparable to the typical margin of error of the unemployment rate estimate. Additionally, the results are overall similar across demographic groups and states, as well as under alternative weighting methodologies. Finally, we discuss how our estimates compare to those from earlier papers that calculate estimates of bias in key CPS labor force statistics. This paper is for research purposes only. No changes to production are being implemented at this time.
    View Full Paper PDF
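    A household nonresponse adjustment of this kind can be illustrated as cell-based reweighting: within each adjustment cell, respondents absorb the base weight of nonrespondents via the inverse of the cell's weighted response rate. The cell labels and record layout here are hypothetical, not the CPS production algorithm:

```python
from collections import defaultdict

# Illustrative cell-based nonresponse adjustment. Each household record
# carries an adjustment cell (e.g. an administrative income bracket),
# a base weight, and a response indicator.

def nonresponse_adjust(households):
    """Return responding households with nonresponse-adjusted weights."""
    totals = defaultdict(float)  # sum of base weights per cell
    resp = defaultdict(float)    # sum of responding base weights per cell
    for h in households:
        totals[h["cell"]] += h["base_weight"]
        if h["responded"]:
            resp[h["cell"]] += h["base_weight"]
    adjusted = []
    for h in households:
        if h["responded"]:
            # Inverse of the cell's weighted response rate
            factor = totals[h["cell"]] / resp[h["cell"]]
            adjusted.append({**h, "weight": h["base_weight"] * factor})
    return adjusted
```

    Linking administrative data, as the paper describes, amounts to defining the cells with variables observed for both respondents and nonrespondents.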
  • Working Paper

    Coverage of Children in the American Community Survey Based on California Birth Records

    September 2023

    Authors: Gloria G. Aldana

    Working Paper Number:

    CES-23-46

    The U.S. Census Bureau's American Community Survey (ACS) collects information on individuals and households. The ACS provides survey-based estimates of children drawn from a sample of the U.S. population. However, survey responses may not match administrative records, such as birth records. Birth records should provide a complete account of all births, along with child-parent relationships and demographic characteristics. California is a state that has both a large population of children and a high undercount of young children. This paper uses California as a case study to examine differences between reported and unreported children in the ACS based on state birth records. Child reporting rates were lower in more recent data years, for younger children, for Black and Hispanic mothers, and for more complex households. Child reporting rates were higher for more educated mothers and for households above the poverty line. Using mother's race and Hispanic ethnicity from the birth records combined with poverty indices from the ACS, this analysis also finds that child reporting does not uniformly vary with poverty status across all race and ethnicity groups. This research builds support for the utility of state birth records in analyzing the undercount of children.
    View Full Paper PDF
  • Working Paper

    Small Business Pulse Survey Estimates by Owner Characteristics and Rural/Urban Designation

    September 2021

    Working Paper Number:

    CES-21-24

    In response to requests from policymakers for additional context for Small Business Pulse Survey (SBPS) measures of the impact of COVID-19 on small businesses, we explored developing estimates by owner characteristics and rural/urban location. Leveraging geographic coding on the Business Register, we create estimates of the effect of the pandemic on small businesses by urban and rural designation. A more challenging exercise entails linking micro-level data from the SBPS with ownership data from the Annual Business Survey (ABS) to create estimates of the effect of the pandemic on small businesses by owner race, sex, ethnicity, and veteran status. Given important differences in survey design and concerns about nonresponse bias, we face significant challenges in producing estimates for owner demographics. We discuss our attempts to meet these challenges and the caution that must be used in interpreting the results. The estimates produced for this paper are available for download. Reflecting the Census Bureau's commitment to scientific inquiry and transparency, the microdata from the SBPS will be available to qualified researchers on approved projects in the Federal Statistical Research Data Center network.
    View Full Paper PDF
  • Working Paper

    Reporting of Indian Health Service Coverage in the American Community Survey

    May 2018

    Working Paper Number:

    carra-2018-04

    Response error in surveys affects the quality of data which are relied on for numerous research and policy purposes. We use linked survey and administrative records data to examine reporting of a particular item in the American Community Survey (ACS) - health coverage among American Indians and Alaska Natives (AIANs) through the Indian Health Service (IHS). We compare responses to the IHS portion of the 2014 ACS health insurance question to whether or not individuals are in the 2014 IHS Patient Registration data. We evaluate the extent to which individuals misreport their IHS coverage in the ACS as well as the characteristics associated with misreporting. We also assess whether the ACS estimates of AIANs with IHS coverage represent an undercount. Our results will be of interest to researchers who rely on survey responses in general and specifically the ACS health insurance question. Moreover, our analysis contributes to the literature on using administrative records to measure components of survey error.
    View Full Paper PDF
  • Working Paper

    Within and Across County Variation in SNAP Misreporting: Evidence from Linked ACS and Administrative Records

    July 2014

    Working Paper Number:

    carra-2014-05

    This paper examines sub-state spatial and temporal variation in misreporting of participation in the Supplemental Nutrition Assistance Program (SNAP) using several years of the American Community Survey linked to SNAP administrative records from New York (2008-2010) and Texas (2006-2009). I calculate county false-negative (FN) and false-positive (FP) rates for each year of observation and find that, within a given state and year, there is substantial heterogeneity in FN rates across counties. In addition, I find evidence that FN rates (but not FP rates) persist over time within counties. This persistence in FN rates is strongest among more populous counties, suggesting that when noise from sampling variation is not an issue, some counties have consistently high FN rates while others have consistently low FN rates. This finding is important for understanding how misreporting might bias estimates of sub-state SNAP participation rates, changes in those participation rates, and effects of program participation. This presentation was given at the CARRA Seminar, June 27, 2013.
    View Full Paper PDF
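    County rates of this kind follow standard definitions: the false-negative (FN) rate is the share of administrative SNAP participants who report non-participation, and the false-positive (FP) rate is the share of administrative non-participants who report participation. A small sketch with an assumed record layout:

```python
# Sketch of FN/FP rate computation for one county from linked records.
# Each record pairs the administrative SNAP status with the survey
# response; the layout is assumed for illustration.

def misreporting_rates(records):
    """records: iterable of (admin_snap, reported_snap) booleans."""
    fn_num = fn_den = fp_num = fp_den = 0
    for admin, reported in records:
        if admin:
            fn_den += 1              # administrative participant
            if not reported:
                fn_num += 1          # ...who reports non-participation
        else:
            fp_den += 1              # administrative non-participant
            if reported:
                fp_num += 1          # ...who reports participation
    fn_rate = fn_num / fn_den if fn_den else float("nan")
    fp_rate = fp_num / fp_den if fp_den else float("nan")
    return fn_rate, fp_rate
```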
  • Working Paper

    Comparison of Survey, Federal, and Commercial Address Data Quality

    June 2014

    Authors: Quentin Brummet

    Working Paper Number:

    carra-2014-06

    This report summarizes matching of survey, commercial, and administrative records housing units to the Census Bureau Master Address File (MAF). We document overall MAF match rates in each data set and evaluate differences in match rates across a variety of housing characteristics. Results show that over 90 percent of records in survey data from the American Housing Survey (AHS) match to the MAF. Commercial data from CoreLogic matches at much lower rates, in part due to missing address information and poor match rates for multi-unit buildings. MAF match rates for administrative records from the Department of Housing and Urban Development are also high, and open the possibility of using this information in surveys such as the AHS.
    View Full Paper PDF
  • Working Paper

    Occupation Inflation in the Current Population Survey

    September 2012

    Working Paper Number:

    CES-12-26

    A common caveat accompanying results from household surveys concerns respondent error. There is research using independent, presumably error-free administrative data to estimate the extent of error in the data, its correlates, and potential corrections. We investigate measurement error in occupation in the Current Population Survey (CPS), using the panel component of the CPS to identify those who incorrectly report changing occupation. We find evidence that individuals inflate their occupation to higher-skilled and higher-paying occupations than the ones they actually perform. Occupation inflation biases the education and race coefficients in standard Mincer equation results within occupations.
    View Full Paper PDF
  • Working Paper

    Measuring Labor Earnings Inequality Using Public-Use March Current Population Survey Data: The Value of Including Variances and Cell Means When Imputing Topcoded Values

    November 2008

    Working Paper Number:

    CES-08-38

    Using the Census Bureau's internal March Current Population Survey (CPS) file, we construct and make available variances and cell means for all topcoded income values in the public-use version of these data. We then provide a procedure that allows researchers with access only to the public-use March CPS data to take advantage of this added information when imputing its topcoded income values. As an example of its value, we show how our new procedure improves on existing imputation methods in the labor earnings inequality literature.
    View Full Paper PDF
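    The cell-mean piece of such a procedure can be illustrated as replacing each topcoded value with the mean of the internal (untopcoded) values in its demographic cell. The cells and dollar figures below are invented for illustration, not taken from the paper:

```python
# Illustrative cell-mean imputation for topcoded earnings. In practice
# the cell means would be computed from the internal CPS file; these
# values and cell definitions are made up.

cell_means = {
    ("male", "full_time"): 195_000.0,
    ("female", "full_time"): 178_000.0,
}

def impute_topcoded(earnings, topcode, cell):
    """Replace a topcoded value with its cell mean; pass others through."""
    if earnings >= topcode:
        return cell_means[cell]
    return earnings

print(impute_topcoded(200_000, 150_000, ("male", "full_time")))  # 195000.0
```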