-
Productivity Growth Patterns in U.S. Food Manufacturing: Case of Dairy Products Industry
May 2004
Working Paper Number:
CES-04-08
A panel constructed from the Census Bureau's Longitudinal Research Database is used to measure total factor productivity (TFP) growth at the plant level and to analyze the multifactor bias of technical change at the three-digit product group level, covering five four-digit sub-group categories, for the U.S. dairy products industry from 1972 through 1995. In the TFP growth decomposition, analyzing growth and its components by quartile rank shows that the scale effect is the most significant element of TFP growth, except for plants in the third quartile, where technical change dominates throughout the time periods. The exogenous input bias results show that, across the time periods, technical change is 1) capital-using; 2) labor-using after 1980; 3) material-saving except in the 1981-1985 period; and 4) energy-using except in the 1981-1985 and 1991-1995 periods. Plant productivity analysis indicates that fewer than 50% of plants in the dairy products industry stay in the same category, indicating considerable movement between productivity rank categories. Investment analysis indicates that plant-level investments are quite lumpy, since a relatively small percentage of observations accounts for a disproportionate share of overall investment. Productivity growth is found to be positively correlated with recent investment spikes for plants with TFP rankings in the middle two quartiles and uncorrelated for plants in the lowest and highest quartiles. Similarly, past TFP growth rates show no significant correlation with future investment spikes for plants in any quartile.
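As a rough illustration of the TFP growth decomposition described above, the sketch below computes TFP growth as output growth minus cost-share-weighted input growth and splits it into a scale effect and a technical-change residual. This is a minimal textbook sketch, not the paper's method: the functional form, the scale-elasticity treatment, and every number are illustrative assumptions.

```python
# Hypothetical sketch of a TFP growth decomposition:
#   dln(TFP) = dln(Y) - sum_j s_j * dln(X_j),
# split into a scale effect, (e - 1) * share-weighted input growth,
# and a technical-change residual. All numbers are illustrative.

def tfp_growth(dlnY, dlnX, shares):
    """Output growth minus cost-share-weighted input growth."""
    return dlnY - sum(s * dx for s, dx in zip(shares, dlnX))

def decompose(dlnY, dlnX, shares, scale_elasticity):
    """Split TFP growth into a scale effect and technical change."""
    dlnX_agg = sum(s * dx for s, dx in zip(shares, dlnX))
    tfp = dlnY - dlnX_agg
    scale_effect = (scale_elasticity - 1.0) * dlnX_agg
    technical_change = tfp - scale_effect
    return tfp, scale_effect, technical_change

# One hypothetical plant-year: 4% output growth, and input growth
# rates for capital, labor, materials, and energy.
dlnY = 0.04
dlnX = [0.02, 0.01, 0.03, 0.00]
shares = [0.25, 0.20, 0.45, 0.10]   # cost shares summing to 1

tfp, scale, tech = decompose(dlnY, dlnX, shares, scale_elasticity=1.1)
print(round(tfp, 4), round(scale, 4), round(tech, 4))
```

Under increasing returns (elasticity above 1), part of measured TFP growth reflects the scale effect rather than technical change, which is the distinction the quartile analysis above turns on.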
-
Location, Location, Location: The 3L Approach to House Price Determination
May 2004
Working Paper Number:
CES-04-06
The immobility of houses means that their location affects their values. This explains the common belief that three things determine the price of a house: location, location, and location. We use this notion to develop the 3L Approach to house price determination: prices are determined by the Metropolitan Statistical Area (MSA), town, and street where the house is located. This study creates a unique data set based on the American Housing Survey (AHS), consisting of small 'clusters' of housing units with information on their housing and resident characteristics, merged with census tract-level attributes. We use these data to verify the 3L Approach: we find that all three levels of location are significant when estimating the house price hedonic equation. This indicates that individuals care about their local neighborhood, i.e., the general upkeep of their street and possibly their neighbors' characteristics (cluster variables); a broader area, such as the school district and/or the town (tract variables), that accounts for school quality and crime rates; and the particular amenities found in their MSA.
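A minimal sketch of the hedonic idea, with one synthetic stand-in variable per level of location: a cluster-level street-upkeep index, a tract-level school-quality index, and an MSA-level amenity index. All variable names, coefficients, and data below are illustrative assumptions, not the paper's specification or estimates.

```python
# Hypothetical 3-level hedonic regression on synthetic data:
# log price = f(house characteristics, cluster, tract, MSA variables).
import numpy as np

rng = np.random.default_rng(1)
n = 500
sqft = rng.uniform(800, 3000, n)          # house characteristic
street_upkeep = rng.normal(0, 1, n)       # cluster-level variable
school_quality = rng.normal(0, 1, n)      # tract-level variable
msa_amenity = rng.normal(0, 1, n)         # MSA-level variable

# Log price responds to the characteristic and to all three levels
logp = (10 + 0.0004 * sqft + 0.05 * street_upkeep
        + 0.08 * school_quality + 0.10 * msa_amenity
        + rng.normal(0, 0.05, n))

X = np.column_stack([np.ones(n), sqft, street_upkeep,
                     school_quality, msa_amenity])
beta, *_ = np.linalg.lstsq(X, logp, rcond=None)
print(np.round(beta[2:], 3))  # cluster, tract, and MSA coefficients
```

When all three location levels genuinely enter prices, each level's coefficient is recovered, which is the pattern of joint significance the paper reports.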
-
Do Tax Incentives Affect Local Economic Growth? What Mean Impacts Miss in the Analysis of Enterprise Zone Policies
September 2003
Working Paper Number:
CES-03-17
Geographically-targeted tax incentives remain popular initiatives in response to deteriorating economic conditions of urban and industrial areas. This paper exploits the exogenous variations of the U.S. state Enterprise Zone programs to estimate the impact of various incentive features on a number of dimensions of local economic growth. The econometric analysis uses plant level data to sort out growth outcomes into gross flows separately accounted for by new, existing, and vanishing businesses in the target areas. Results offer empirical evidence to support a number of specific policy recommendations and show that the impact of the incentives has more complex dynamics than those revealed by the null mean impact estimates obtained from analyzing net growth outcomes.
-
Pollution Abatement Expenditures and Plant-Level Productivity: A Production Function Approach
August 2003
Working Paper Number:
CES-03-16
In this paper, we investigate the impact of environmental regulation on productivity using a Cobb-Douglas production function framework. Estimating the effects of regulation on productivity can be done with a top-down approach using data for broad sectors of the economy, or a more disaggregated bottom-up approach. Our study follows a bottom-up approach using data from the U.S. paper, steel, and oil industries. We measure environmental regulation using plant-level information on pollution abatement expenditures, which allows us to distinguish between productive and abatement expenditures on each input. We use annual Census Bureau information (1979-1990) on output, labor, capital, and material inputs, and pollution abatement operating costs and capital expenditures for 68 pulp and paper mills, 55 oil refineries, and 27 steel mills. We find that pollution abatement inputs generally contribute little or nothing to output, especially when compared to their 'productive' equivalents. Adding an aggregate pollution abatement cost measure to a Cobb-Douglas production function, we find that a $1 increase in pollution abatement costs leads to an estimated productivity decline of $3.11, $1.80, and $5.98 in the paper, oil, and steel industries respectively. These findings imply substantial differences across industries in their sensitivity to pollution abatement costs, arguing for a bottom-up approach that can capture these differences. Further differentiating plants by their production technology, we find substantial differences in the impact of pollution abatement costs even within industries, with higher marginal costs at plants with more polluting technologies. Finally, in all three industries, plants concentrating on change-in-production-process abatement techniques have higher productivity than plants doing predominantly end-of-line abatement, but also seem to be more affected by pollution abatement operating costs.
Overall, our results point to the importance of using detailed, disaggregated analyses, even below the industry level, when trying to model the costs of forcing plants to reduce their emissions.
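The mechanism behind the productive-versus-abatement distinction can be sketched numerically: when part of each input is diverted to abatement, a Cobb-Douglas function produces less output. This is a stylized illustration only; the input split, elasticities, and quantities below are hypothetical, not the paper's estimates.

```python
# Hypothetical Cobb-Douglas sketch: output forgone when a slice of
# each input is diverted from production to pollution abatement.
# All coefficients and quantities are illustrative.

def output(A, inputs, elasticities):
    """Cobb-Douglas: Y = A * prod(X_j ** alpha_j)."""
    y = A
    for x, a in zip(inputs, elasticities):
        y *= x ** a
    return y

# Total capital and labor, and the slices diverted to abatement
capital, labor = 100.0, 50.0
abate_k, abate_l = 5.0, 2.0
alpha = [0.3, 0.7]   # output elasticities summing to 1

y_with = output(1.0, [capital - abate_k, labor - abate_l], alpha)
y_without = output(1.0, [capital, labor], alpha)
# Output forgone because abatement inputs contribute little to output
print(round(y_without - y_with, 3))
```

Estimating the output loss per dollar of abatement cost, as the paper does, then reveals whether reported abatement costs understate the true burden.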
-
The Role of Technological and Industrial Heterogeneity In Technology Diffusion: a Markovian Approach
February 2003
Working Paper Number:
CES-03-07
Recent empirical studies have established the importance of intra- and inter-industry heterogeneity in investment in innovation and other outcomes. This paper examines the role of industry and technology heterogeneity in the diffusion of advanced manufacturing technologies using a simple Markovian approach. Using the Maximum Entropy estimator, I estimate transition probabilities and corresponding half-lives, look for outliers in technology and industry diffusion patterns, and seek explanations for their unusual behavior in idiosyncratic technology and industry characteristics. A consistent industry-level pattern emerges that relates consumer demand to production processes. In industries where hand-made products signal quality to the customer, technology spreads very slowly. In contrast, in industries where demand for sophisticated, high-precision goods is high, or where demand-driven product specifications vary rapidly over relatively short periods, advanced technologies diffuse much more rapidly.
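For intuition on the half-lives mentioned above, the sketch below works out the standard half-life calculation for a two-state Markov chain of adoption (non-adopter, adopter). The paper's estimated chains may be richer; the transition probabilities here are illustrative assumptions, not estimates.

```python
# Hypothetical half-life of convergence for a 2-state Markov chain.
# Convergence speed is governed by the second eigenvalue of the
# transition matrix, lam = 1 - p_adopt - p_abandon.
import math

def half_life(p_adopt, p_abandon):
    """Periods until the distance to the stationary distribution halves."""
    lam = 1.0 - p_adopt - p_abandon
    return math.log(0.5) / math.log(abs(lam))

# A technology adopted with prob 0.15 per period, abandoned with 0.05
print(round(half_life(0.15, 0.05), 2))
```

Slow-diffusing technologies correspond to a second eigenvalue near one (tiny per-period transition probabilities), which produces long half-lives of exactly the kind the outlier analysis above looks for.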
-
The Survival of Industrial Plants
October 2002
Working Paper Number:
CES-02-25
The study seeks to explain the attrition rate of new manufacturing plants in the United States in terms of three vectors of variables. The first explains how survival of the fittest proceeds through learning by firms (plants) about their own relative efficiency. The second explains how efficiency systematically changes over time and what augments or diminishes it. The third captures the opportunity cost of resources employed in a plant. The model is tested using maximum-likelihood probit analysis with very large samples for successive census years in the 1967-97 period. One sample consists of an unbalanced panel of about three-fourths of a million plants of single and multi-unit firms, or alternatively of about 300,000 plants if only the most reliable data are considered. The second is restricted to the plants of multi-unit firms in the same time span and consists of an unbalanced panel of more than 100,000 plants. The empirical analysis strongly confirms the predictions of the model.
-
Agent Heterogeneity and Learning: An Application to Labor Markets
October 2002
Working Paper Number:
tp-2002-20
I develop a matching model with heterogeneous workers, firms, and worker-firm matches, and apply it to longitudinal linked data on employers and employees. Workers vary in their marginal product when employed and their value of leisure when unemployed. Firms vary in their marginal product and cost of maintaining a vacancy. The marginal product of a worker-firm match also depends on a match-specific interaction between worker and firm that I call match quality. Agents have complete information about worker and firm heterogeneity, and symmetric but incomplete information about match quality. They learn its value slowly by observing production outcomes. There are two key results. First, under a Nash bargain, the equilibrium wage is linear in a person-specific component, a firm-specific component, and the posterior mean of beliefs about match quality. Second, in each period the separation decision depends only on the posterior mean of beliefs and person and firm characteristics. These results have several implications for an empirical model of earnings with person and firm effects. The first implies that residuals within a worker-firm match are a martingale; the second implies the distribution of earnings is truncated.
I test predictions from the matching model using data from the Longitudinal Employer-Household Dynamics (LEHD) Program at the U.S. Census Bureau. I present both fixed and mixed model specifications of the equilibrium wage function, taking account of structural aspects implied by the learning process. In the most general specification, earnings residuals have a completely unstructured covariance within a worker-firm match. I estimate and test a variety of more parsimonious error structures, including the martingale structure implied by the learning process. I find considerable support for the matching model in these data.
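The martingale property of beliefs can be illustrated with standard normal-normal learning: the posterior mean of match quality is a martingale, so its increments are unpredictable on average. This is a generic Bayesian-learning sketch under assumed normality, not the paper's model; all parameters are illustrative.

```python
# Hypothetical sketch: under normal learning about match quality, the
# posterior mean is a martingale, so one-step increments average ~0.
# All parameters are illustrative.
import random
random.seed(3)

def posterior_means(mu0, tau0, sigma, quality, periods):
    """Normal-normal updating of beliefs about match quality."""
    mean, prec = mu0, 1.0 / tau0**2
    path = [mean]
    for _ in range(periods):
        signal = quality + random.gauss(0.0, sigma)  # noisy output
        prec_s = 1.0 / sigma**2
        mean = (prec * mean + prec_s * signal) / (prec + prec_s)
        prec += prec_s
        path.append(mean)
    return path

# Average one-step belief increment across many simulated matches
incs = []
for _ in range(2000):
    quality = random.gauss(0.0, 1.0)
    path = posterior_means(0.0, 1.0, 1.0, quality, 5)
    incs.extend(path[t + 1] - path[t] for t in range(5))
print(abs(sum(incs) / len(incs)) < 0.03)
```

Because belief increments are unpredictable, wage residuals within a match inherit the martingale structure the empirical error specifications above are testing for.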
-
Estimating Measurement Error in SIPP Annual Job Earnings: A Comparison of Census Survey and SSA Administrative Data
September 2002
Working Paper Number:
tp-2002-24
The third chapter investigates measurement error in SIPP annual job earnings data linked to SSA administrative earnings data. The multiple earnings measures provided by the survey and administrative data enable the identification of components of true variation and variation due to measurement error. We find that 18% of the variation in SIPP annual job earnings can be attributed to measurement error. We also find that in both the SIPP and the DER, measurement error is persistent over time. A lower level of autocorrelation in the SIPP measurement error than in the economic error component leads to a lower reliability ratio of .62 for first-differenced earnings.
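The mechanism behind the lower first-difference reliability can be sketched arithmetically: differencing shrinks a persistent signal more than a weakly persistent error. The 18% error share comes from the abstract; the autocorrelations below are illustrative assumptions, so the sketch shows the direction of the effect rather than reproducing the .62 figure.

```python
# Hypothetical sketch of why first-differencing lowers reliability
# when measurement error is less autocorrelated than the signal.
# Variances are normalized so total variance = 1.

def reliability_levels(err_share):
    """Share of observed variance that is true variation."""
    return 1.0 - err_share

def reliability_diff(err_share, rho_signal, rho_err):
    """Reliability of first-differenced earnings.

    var of a differenced AR-like component is 2 * var * (1 - rho);
    the common factor of 2 cancels in the ratio.
    """
    sig = (1.0 - err_share) * (1.0 - rho_signal)
    err = err_share * (1.0 - rho_err)
    return sig / (sig + err)

print(round(reliability_levels(0.18), 2))          # matches the 18% error share
print(round(reliability_diff(0.18, 0.7, 0.3), 2))  # lower in differences
```

With a persistent signal (rho 0.7) and weakly persistent error (rho 0.3), differencing removes much of the signal but little of the error, dragging reliability below its level counterpart.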
-
The Mis-Measurement of Permanent Earnings: New Evidence from Social Security Earnings Data
May 2002
Working Paper Number:
CES-02-12
This study investigates the reliability of using short-term averages of earnings as a proxy for permanent earnings in empirical research. An earnings dynamics model is estimated on a large sample of men covering the period from 1983 to 1997 following the cohort-based methodology of Baker and Solon (1999). The analysis uses a unique dataset that matches men in the 1984, 1990 and 1996 Surveys of Income and Program Participation (SIPP) to the Social Security Administration's Summary Earnings Records (SER). The results confirm that using a short-term average of earnings can lead to spurious estimates of the effect of lifetime earnings on a particular outcome. In addition, the transitory variance appears to vary considerably over the lifecycle. The share of earnings variance due to transitory factors is higher among blacks and the persistence of transitory shocks appears to be greater for this group as well. Finally, the transitory variance appears to be a more important factor in explaining the overall earnings variance of college educated men than those without college.
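The core attenuation problem can be made concrete with the textbook errors-in-variables formula: a T-year earnings average proxies permanent earnings with reliability var_p / (var_p + var_t / T). This simple sketch assumes serially uncorrelated transitory shocks, which the paper shows is too optimistic when shocks are persistent; the variances below are illustrative, not estimates.

```python
# Hypothetical sketch: reliability of a T-year earnings average as a
# proxy for permanent earnings, assuming i.i.d. transitory shocks.
# Variances are illustrative.

def proxy_reliability(var_perm, var_trans, years):
    """Reliability of a `years`-long average as a permanent-earnings proxy."""
    return var_perm / (var_perm + var_trans / years)

for t in (1, 3, 5, 15):
    print(t, round(proxy_reliability(0.30, 0.15, t), 3))
```

Short averages carry substantial transitory variance, so coefficients on "permanent earnings" built from them are attenuated, which is the spurious-estimate problem the abstract describes; persistent shocks slow the improvement from lengthening the average.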
-
The Measurement of Human Capital in the U.S. Economy
April 2002
Working Paper Number:
tp-2002-09
We develop a new approach to measuring human capital that permits the distinction of both observable and unobservable dimensions of skill by associating human capital with the portable part of an individual's wage rate. Using new large-scale, integrated employer-employee data containing information on 68 million individuals and 3.6 million firms, we explain a very large proportion (84%) of the total variation in wage rates and attribute substantial variation to both individual and employer heterogeneity. While the wage distribution remained largely unchanged between 1992 and 1997, we document a pronounced right shift in the overall distribution of human capital. Most workers entering our sample, while less experienced, were otherwise more highly skilled, a difference which can be attributed almost exclusively to unobservables. Nevertheless, compared to exiters and continuers, entrants exhibited a greater tendency to match to firms paying below-average internal wages. Firms reduced employment shares of low-skilled workers and increased employment shares of high-skilled workers in virtually every industry. Our results strongly suggest that the distribution of human capital will continue to shift to the right, implying a continuing up-skilling of the employed labor force.
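The decomposition behind a "portable" wage component can be sketched as a two-way fixed-effects regression: log wage = person effect + firm effect + noise, with the person effect serving as the skill measure. This is a stylized sketch on synthetic data (the study itself uses 68 million individuals); the simulation design and all parameters are illustrative assumptions.

```python
# Hypothetical sketch: recovering portable person effects from a
# two-way (person and firm) fixed-effects wage regression on
# synthetic data. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(2)
n_workers, n_firms, n_obs = 50, 10, 600
worker = rng.integers(0, n_workers, n_obs)
firm = rng.integers(0, n_firms, n_obs)
theta = rng.normal(0.0, 0.4, n_workers)   # person effects (portable skill)
psi = rng.normal(0.0, 0.2, n_firms)       # firm pay premia
logw = theta[worker] + psi[firm] + rng.normal(0.0, 0.05, n_obs)

# Least squares on worker and firm dummies; one firm dummy is dropped
# for identification, so estimated person effects are shifted by a
# constant, which leaves correlations intact.
D_worker = (worker[:, None] == np.arange(n_workers)).astype(float)
D_firm = (firm[:, None] == np.arange(1, n_firms)).astype(float)
coef, *_ = np.linalg.lstsq(np.column_stack([D_worker, D_firm]), logw, rcond=None)
theta_hat = coef[:n_workers]

# Recovered person effects track the true portable component
print(round(float(np.corrcoef(theta_hat, theta)[0, 1]), 2))
```

The person effect travels with the worker across employers, which is what makes it a candidate measure of human capital, while the firm effect captures employer pay policy.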