Papers written by Moises Yi
-
Working Paper: Revisions to the LEHD Establishment Imputation Procedure and Applications to Administrative Jobs Frame
September 2024
Working Paper Number:
CES-24-51
The Census Bureau is developing a 'jobs frame' to provide detailed job-level employment data across the U.S. through linked administrative records such as unemployment insurance and IRS W-2 filings. This working paper summarizes the research conducted by the jobs frame development team on modifying and extending the LEHD Unit-to-Worker (U2W) imputation procedure for the jobs frame prototype. It provides a conceptual overview of the U2W imputation method, highlighting key challenges and tradeoffs in its current application. The paper then presents four imputation methodologies and evaluates their performance in areas such as establishment assignment accuracy, establishment size matching, and job separation rates. The results show that all methodologies perform similarly in assigning workers to the correct establishment. Non-spell-based methodologies excel in matching establishment sizes, while spell-based methodologies perform better in accurately tracking separation rates.
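The paper does not reproduce the imputation code, but the general unit-to-worker idea it builds on, assigning each worker reported at the employer (UI account) level to one of that employer's establishments with probability proportional to establishment employment, can be sketched roughly as follows. All names, fields, and data in this example are hypothetical; it illustrates the concept, not the LEHD procedure.

```python
import random

def impute_establishments(worker_ids, establishments, seed=0):
    """Assign each worker of a multi-establishment employer to one establishment,
    drawn with probability proportional to establishment employment.
    Hypothetical illustration only; not the LEHD U2W code."""
    rng = random.Random(seed)
    weights = [est["employment"] for est in establishments]
    return {
        wid: rng.choices(establishments, weights=weights, k=1)[0]["estab_id"]
        for wid in worker_ids
    }

# Example: one employer with three establishments and five workers on its UI record.
estabs = [
    {"estab_id": "E1", "employment": 50},
    {"estab_id": "E2", "employment": 30},
    {"estab_id": "E3", "employment": 20},
]
print(impute_establishments(["w1", "w2", "w3", "w4", "w5"], estabs))
```

A spell-based variant would presumably draw the establishment once per continuous job spell rather than independently each period, which is the tradeoff behind the separation-rate versus size-matching results summarized above.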
-
Working Paper: Industry Wage Differentials: A Firm-Based Approach
August 2023
Working Paper Number:
CES-23-40
We revisit the estimation of industry wage differentials using linked employer-employee data from the U.S. LEHD program. Building on recent advances in the measurement of employer wage premiums, we define the industry wage effect as the employment-weighted average workplace premium in that industry. We show that cross-sectional estimates of industry differentials overstate the pay premiums due to unmeasured worker heterogeneity. Conversely, estimates based on industry movers understate the true premiums, due to unmeasured heterogeneity in pay premiums within industries. Industry movers who switch to higher-premium industries tend to leave firms in the origin sector that pay above-average premiums and move to firms in the destination sector with below-average premiums (and vice versa), attenuating the measured industry effects. Our preferred estimates reveal substantial heterogeneity in narrowly defined industry premiums, with a standard deviation of 12%. On average, workers in higher-paying industries have higher observed and unobserved skills, widening between-industry wage inequality. There are also small but systematic differences in industry premiums across cities, with a wider distribution of pay premiums and more worker sorting in cities with more high-premium firms and high-skilled workers.
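As a minimal sketch of the firm-based definition in the abstract, the industry effect as the employment-weighted average of estimated firm (workplace) premiums, the snippet below uses hypothetical job-level data and column names; it is not the paper's estimation code.

```python
import pandas as pd

# Hypothetical job-level data: one row per job, with the worker's industry and the
# firm's estimated pay premium (e.g., a firm effect from an AKM-style wage model).
jobs = pd.DataFrame({
    "industry":     ["3110", "3110", "3110", "5411", "5411"],
    "firm_premium": [0.05, 0.02, 0.02, 0.12, 0.08],
})

# Averaging over jobs (rather than over distinct firms) weights each firm's premium
# by its employment in the industry, i.e., an employment-weighted workplace premium.
industry_effects = jobs.groupby("industry")["firm_premium"].mean()
print(industry_effects)
```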
-
Working Paper: Estimating the U.S. Citizen Voting-Age Population (CVAP) Using Blended Survey Data, Administrative Record Data, and Modeling: Technical Report
April 2023
Working Paper Number:
CES-23-21
This report develops a method using administrative records (AR) to fill in responses for nonresponding American Community Survey (ACS) housing units rather than adjusting survey weights to account for selection of a subset of nonresponding housing units for follow-up interviews and for nonresponse bias. The method also inserts AR and modeling in place of edits and imputations for ACS survey citizenship item nonresponses. We produce Citizen Voting-Age Population (CVAP) tabulations using this enhanced CVAP method and compare them to published estimates. The enhanced CVAP method produces a 0.74 percentage point lower citizen share, and it is 3.05 percentage points lower for voting-age Hispanics. The latter result can be partly explained by omissions of voting-age Hispanic noncitizens with unknown legal status from ACS household responses. Weight adjustments may be less effective at addressing nonresponse bias under those conditions.
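A minimal sketch of the item-level piece of the blending idea, assuming hypothetical data: where the ACS citizenship response is missing, substitute the administrative-record value rather than an edit or imputation. The report's method also fills in entire nonresponding housing units and uses modeling, which this toy example does not attempt.

```python
import numpy as np
import pandas as pd

# Hypothetical person-level records: NaN marks an ACS citizenship item nonresponse.
people = pd.DataFrame({
    "acs_citizen": [1, 1, np.nan, 0, np.nan, 1],
    "ar_citizen":  [1, 1, 1, 0, 0, 1],   # value from linked administrative records
})

# Blend: keep the survey response where it exists, use the AR value where it is missing.
people["blended_citizen"] = people["acs_citizen"].fillna(people["ar_citizen"])

print("Survey-only citizen share:", people["acs_citizen"].mean())   # mean over respondents
print("Blended citizen share:   ", people["blended_citizen"].mean())
```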
-
Working Paper: Location, Location, Location
October 2021
Working Paper Number:
CES-21-32R
We use data from the Longitudinal Employer-Household Dynamics program to study the causal effects of location on earnings. Starting from a model with employer and employee fixed effects, we estimate the average earnings premiums associated with jobs in different commuting zones (CZs) and different CZ-industry pairs. About half of the variation in mean wages across CZs is attributable to differences in worker ability (as measured by their fixed effects); the other half is attributable to place effects. We show that the place effects from a richly specified cross-sectional wage model overstate the causal effects of place (due to unobserved worker ability), while those from a model that simply adds person fixed effects understate the causal effects (due to unobserved heterogeneity in the premiums paid by different firms in the same CZ). Local industry agglomerations are associated with higher wages, but overall differences in industry composition and in CZ-specific returns to industries explain only a small fraction of average place effects. Estimating separate place effects for college and non-college workers, we find that the college wage gap is bigger in larger and higher-wage places, but that two-thirds of this variation is attributable to differences in the relative skills of the two groups in different places. Most of the remaining variation reflects the enhanced sorting of more educated workers to higher-paying industries in larger and higher-wage CZs. Finally, we find that local housing costs at least fully offset local pay premiums, implying that workers who move to larger CZs have no higher net-of-housing consumption.
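A stylized sketch of the decomposition behind the "half worker ability, half place effects" finding, assuming the person and place effects have already been estimated from a two-way fixed effects model; the data, magnitudes, and the covariance-share decomposition used here are illustrative, not the paper's exact procedure.

```python
import pandas as pd

# Hypothetical job-level estimates from a wage model with person and place (CZ) effects.
df = pd.DataFrame({
    "cz":            ["A", "A", "B", "B", "C", "C"],
    "person_effect": [0.10, 0.20, 0.05, 0.00, -0.05, -0.10],
    "place_effect":  [0.08, 0.08, 0.02, 0.02, -0.04, -0.04],
})

# Mean components by commuting zone, and their sum as the CZ mean wage component.
cz = df.groupby("cz")[["person_effect", "place_effect"]].mean()
cz["cz_mean"] = cz["person_effect"] + cz["place_effect"]

# Covariance-based shares of cross-CZ wage variation attributable to each component;
# by construction the two shares sum to one.
total_var = cz["cz_mean"].var()
share_ability = cz["person_effect"].cov(cz["cz_mean"]) / total_var
share_place = cz["place_effect"].cov(cz["cz_mean"]) / total_var
print(f"worker-ability share: {share_ability:.2f}, place-effect share: {share_place:.2f}")
```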
-
Working Paper: Determination of the 2020 U.S. Citizen Voting-Age Population (CVAP) Using Administrative Records and Statistical Methodology Technical Report
October 2020
Working Paper Number:
CES-20-33
This report documents the efforts of the Census Bureau's Citizen Voting-Age Population (CVAP) Internal Expert Panel (IEP) and Technical Working Group (TWG) toward the use of multiple data sources to produce block-level statistics on the citizen voting-age population for use in enforcing the Voting Rights Act. It describes the administrative, survey, and census data sources used, and the four approaches developed for combining these data to produce CVAP estimates. It also discusses other aspects of the estimation process, including how records were linked across the multiple data sources, and the measures taken to protect the confidentiality of the data.
-
Working Paper: Predicting the Effect of Adding a Citizenship Question to the 2020 Census
June 2019
Working Paper Number:
CES-19-18
The addition of a citizenship question to the 2020 census could affect the self-response rate, a key driver of the cost and quality of a census. We find that citizenship question response patterns in the American Community Survey (ACS) suggest that it is a sensitive question when asked about administrative record noncitizens but not when asked about administrative record citizens. ACS respondents who were administrative record noncitizens in 2017 frequently choose to skip the question or answer that the person is a citizen. We predict the effect on self-response to the entire survey by comparing mail response rates in the 2010 ACS, which included a citizenship question, with those of the 2010 census, which did not have a citizenship question, among households in both surveys. We compare the actual ACS-census difference in response rates for households that may contain noncitizens (more sensitive to the question) with the difference for households containing only U.S. citizens. We estimate that the addition of a citizenship question will have an 8.0 percentage point larger effect on self-response rates in households that may have noncitizens relative to those with only U.S. citizens. Assuming that the citizenship question does not affect unit self-response in all-citizen households and applying the 8.0 percentage point drop to the 28.1% of housing units potentially having at least one noncitizen would predict an overall 2.2 percentage point drop in self-response in the 2020 census, increasing costs and reducing the quality of the population count.
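The headline 2.2 percentage point figure is a back-of-the-envelope product of two numbers reported in the abstract, as the sketch below shows; it assumes, as the authors note, no effect on unit self-response in all-citizen households.

```python
# Differential drop in self-response for households that may contain noncitizens,
# applied to that group's share of housing units (both figures from the abstract).
drop_in_noncitizen_households = 8.0    # percentage points
share_possible_noncitizen = 0.281      # share of housing units

overall_drop = drop_in_noncitizen_households * share_possible_noncitizen
print(f"Predicted overall drop in self-response: {overall_drop:.1f} percentage points")  # ~2.2
```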
-
Working Paper: Understanding the Quality of Alternative Citizenship Data Sources for the 2020 Census
August 2018
Working Paper Number:
CES-18-38R
This paper examines the quality of citizenship data in self-reported survey responses compared to administrative records and evaluates options for constructing an accurate count of resident U.S. citizens. Person-level discrepancies between survey-collected citizenship data and administrative records are more pervasive than previously reported in studies comparing survey and administrative data aggregates. Our results imply that survey-sourced citizenship data produce significantly lower estimates of the noncitizen share of the population than would be produced from currently available administrative records; both the survey-sourced and administrative data have shortcomings that could contribute to this difference. Our evidence is consistent with noncitizen respondents misreporting their own citizenship status and failing to report that of other household members. At the same time, currently available administrative records may miss some naturalizations and capture others with a delay. The evidence in this paper also suggests that adding a citizenship question to the 2020 Census would lead to lower self-response rates in households potentially containing noncitizens, resulting in higher fieldwork costs and a lower-quality population count.
-
Working Paper: A Comparison of Training Modules for Administrative Records Use in Nonresponse Followup Operations: The 2010 Census and the American Community Survey
January 2017
Working Paper Number:
CES-17-47
While modeling work in preparation for the 2020 Census has shown that administrative records can be predictive of Nonresponse Followup (NRFU) enumeration outcomes, there is scope to examine the robustness of the models by using more recent training data. The models deployed for workload removal from the 2015 and 2016 Census Tests were based on associations of the 2010 Census with administrative records. Training the same models with more recent data from the American Community Survey (ACS) can identify any changes in parameter associations over time that might reduce the accuracy of model predictions. Furthermore, more recent training data would allow for the incorporation of new administrative record sources not available in 2010. However, differences in ACS methodology and the smaller sample size may limit its applicability. This paper replicates earlier results and examines model predictions based on the ACS in comparison with NRFU outcomes. The evaluation consists of a comparison of predicted counts and household compositions with actual 2015 NRFU outcomes. The main finding is an overall validation of the methodology using independent data.
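As a rough, hypothetical illustration of the retraining exercise (not the Census Bureau's production models), the same specification can be fit on an older training source and on more recent data, and the estimated associations compared for drift; everything below, including the predictors and coefficients, is simulated.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def simulate(coefs, n=1000):
    """Simulate administrative-record predictors and a binary enumeration outcome."""
    X = rng.normal(size=(n, len(coefs)))
    y = (X @ coefs + rng.logistic(size=n)) > 0
    return X, y

# Older training data (2010-Census-like associations) vs. newer data (ACS-like).
X_old, y_old = simulate(np.array([1.0, -0.5, 0.2]))
X_new, y_new = simulate(np.array([0.9, -0.6, 0.1]))

model_old = LogisticRegression().fit(X_old, y_old)
model_new = LogisticRegression().fit(X_new, y_new)

# Compare coefficients to see whether parameter associations have shifted over time.
print("old:", np.round(model_old.coef_, 2))
print("new:", np.round(model_new.coef_, 2))
```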