SSRN - CFDP Series

Estimating the Production Function for Human Capital: Results from a Randomized Controlled Trial in Colombia

April 27, 2017 - 12:56pm
Orazio Attanasio, Sarah Cattan, Emla Fitzsimons, Costas Meghir and Marta Rubio-Codina
We examine the channels through which a randomized early childhood intervention in Colombia led to significant gains in cognitive and socio-emotional skills among a sample of disadvantaged children aged 12 to 24 months at baseline. We estimate the determinants of material and time investments in these children and evaluate the impact of the treatment on such investments. We then estimate the production functions for cognitive and socio-emotional skills. The effects of the program can be explained by increases in parental investments, which have strong effects on outcomes and are complementary to both maternal skills and child’s baseline skills.
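As a minimal sketch of how complementarity between investments and baseline skills can be expressed, the toy simulation below estimates a translog-style skill production function with an interaction term. All coefficients and the data-generating process are hypothetical, and the OLS step ignores the endogeneity of investment that the paper's estimation strategy addresses.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Stylized translog production function for skills (hypothetical coefficients,
# not the paper's estimates):
#   ln(skill_1) = a0 + a1*ln(invest) + a2*ln(skill_0)
#               + a3*ln(invest)*ln(skill_0) + noise
# a3 > 0 encodes complementarity between investment and baseline skill.
log_skill0 = rng.normal(0.0, 0.5, n)
log_invest = 0.3 * log_skill0 + rng.normal(0.0, 0.5, n)  # investment responds to skill
a = np.array([0.1, 0.4, 0.5, 0.15])
X = np.column_stack([np.ones(n), log_invest, log_skill0, log_invest * log_skill0])
log_skill1 = X @ a + rng.normal(0.0, 0.2, n)

# OLS recovers the coefficients in this toy DGP (no unobserved endogeneity here).
a_hat, *_ = np.linalg.lstsq(X, log_skill1, rcond=None)
print(np.round(a_hat, 2))
```

A significantly positive interaction coefficient (a3 here) is one way the complementarity described in the abstract would show up in an estimated specification.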
Categories: Publications

The Incidence of Carbon Taxes in U.S. Manufacturing: Lessons from Energy Cost Pass-Through

April 17, 2017 - 3:24pm
Sharat Ganapati, Joseph S. Shapiro and Reed Walker
This paper studies how changes in energy input costs for U.S. manufacturers affect the relative welfare of manufacturing producers and consumers (i.e. incidence). In doing so, we develop a partial equilibrium methodology to estimate the incidence of input taxes that can simultaneously account for three determinants of incidence that are typically studied in isolation: incomplete pass-through of input costs, differences in industry competitiveness, and factor substitution amongst inputs used for production. We apply this methodology to a set of U.S. manufacturing industries for which we observe plant-level unit prices and input choices. We find that about 70 percent of energy price-driven changes in input costs are passed through to consumers. We combine industry-specific pass-through rates with estimates of industry competitiveness to show that the share of welfare cost borne by consumers is 25-75 percent smaller (and the share borne by producers is correspondingly larger) than models featuring complete pass-through and perfect competition would suggest.
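The direction of the 25-75 percent figure can be illustrated with back-of-the-envelope arithmetic. The sketch below is a deliberate simplification that ignores the competitiveness and factor-substitution channels the paper's model handles; it splits a hypothetical tax burden using only the estimated 70 percent pass-through rate.

```python
# Back-of-the-envelope incidence sketch (illustrative only; the paper's
# structural model also accounts for competitiveness and substitution).
tax_per_unit = 1.0      # hypothetical energy input tax, $ per unit
quantity = 100.0        # hypothetical units sold
pass_through = 0.70     # share of cost increase passed to consumers (paper's estimate)

consumer_burden = pass_through * tax_per_unit * quantity
producer_burden = (1.0 - pass_through) * tax_per_unit * quantity
consumer_share = consumer_burden / (consumer_burden + producer_burden)

# Benchmark with complete pass-through: consumers bear the full burden.
benchmark_share = 1.0
reduction = 1.0 - consumer_share / benchmark_share
print(consumer_share, reduction)
```

Even this crude split puts the consumer share 30 percent below the complete-pass-through benchmark, inside the 25-75 percent range the paper reports once competitiveness is also accounted for.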
Categories: Publications

Defensive Investments and the Demand for Air Quality: Evidence from the NOx Budget Program

April 9, 2017 - 10:12am
Olivier Deschenes, Michael Greenstone and Joseph S. Shapiro
The demand for air quality depends on health impacts and defensive investments, but little research assesses the empirical importance of defenses. A rich quasi-experiment suggests that the Nitrogen Oxides (NOx) Budget Program (NBP), a cap-and-trade market, decreased NOx emissions, ambient ozone concentrations, pharmaceutical expenditures, and mortality rates. The annual reductions in pharmaceutical purchases, a key defensive investment, and mortality are valued at about $800 million and $1.1 billion, respectively, suggesting that defenses are over one-third of willingness-to-pay for reductions in NOx emissions. Further, estimates indicate that the NBP’s benefits easily exceed its costs and that NOx reductions have substantial benefits.
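The one-third claim follows directly from the two valuations quoted in the abstract, if the sum of the two components is treated as a rough proxy for total willingness-to-pay:

```python
# Valuations reported in the abstract ($ millions per year).
pharma_savings = 800.0      # defensive investment (pharmaceutical purchases)
mortality_savings = 1100.0  # mortality reduction

# Treating the sum as the relevant WTP total is a simplification.
total_wtp = pharma_savings + mortality_savings
defensive_share = pharma_savings / total_wtp
print(round(defensive_share, 3))
```

The defensive component is roughly 42 percent of the total, consistent with "over one-third" in the abstract.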
Categories: Publications

Multidimensional Sales Incentives in CRM Settings: Customer Adverse Selection and Moral Hazard

April 5, 2017 - 9:50am
Minkyung Kim, K. Sudhir, Kosuke Uetake and Rodrigo Canales
In many firms, incentivized salespeople with private information about their customers are responsible for customer relationship management (CRM). Private information can help the firm by increasing sales efficiency, but it can also hurt the firm if salespeople use it to maximize their own compensation at the firm's expense. Specifically, we consider two negative outcomes of private information: ex-ante customer adverse selection at the time of acquisition and ex-post customer moral hazard after acquisition. This paper investigates how a salesforce responds, positively and negatively, to two managerial levers: multidimensional incentives for acquisition and retention performance, and job transfers that affect the level of private information. Using unique matched panel data linking individual salesperson performance metrics with customer-level loans and repayments from a microfinance bank, we find that salespeople indeed possess private information that is not available to the firm. They use this private information to engage in adverse selection of customers in response to acquisition incentives. Customer maintenance incentives serve a dual purpose: they not only reduce loan defaults but also moderate adverse selection in customer acquisition. Transfers that eliminate private information reduce the adverse selection effects of acquisition incentives but increase loan defaults, i.e., customer moral hazard.
Despite the potential adverse selection effects of private information, the effort-increasing effect of each of the three sales-management levers we investigate (acquisition incentives, maintenance incentives, and transfers) yields a net positive effect on firm value. Methodologically, the paper introduces an identification strategy that separates customer adverse selection from customer moral hazard (loan repayment) by leveraging the multidimensional incentives of an intermediary, the salesperson, who is responsible for both customer selection and repayment and holds private information about customers.
Categories: Publications

Information Design: A Unified Perspective

March 28, 2017 - 8:50pm
Dirk Bergemann and Stephen Morris
Fixing a game with uncertain payoffs, information design identifies the information structure and equilibrium that maximizes the payoff of an information designer. We show how this perspective unifies existing work, including that on communication in games (Myerson (1991)), Bayesian persuasion (Kamenica and Gentzkow (2011)) and some of our own recent work. Information design has a literal interpretation, under which there is a real information designer who can commit to the choice of the best information structure (from her perspective) for a set of participants in a game. We emphasize a metaphorical interpretation, under which the information design problem is used by the analyst to characterize play in the game under many different information structures.
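For a concrete instance of the persuasion literature the abstract cites, the sketch below reproduces the canonical prosecutor-judge example from Kamenica and Gentzkow (2011), with a hypothetical prior of 0.3.

```python
# Canonical Bayesian-persuasion example (Kamenica and Gentzkow's
# prosecutor-judge setting), as a minimal numeric sketch. The judge
# convicts iff her posterior P(guilty) >= 0.5; the prosecutor (the
# information designer) wants to maximize the conviction probability.
prior = 0.3   # hypothetical P(guilty), below the conviction threshold

# Optimal signal: always report "convict" when guilty, and report
# "convict" with probability q when innocent, chosen so the posterior
# after "convict" is exactly 0.5:
#   prior / (prior + (1 - prior) * q) = 0.5  =>  q = prior / (1 - prior)
q = prior / (1.0 - prior)
posterior_convict = prior / (prior + (1.0 - prior) * q)
conviction_prob = prior + (1.0 - prior) * q
print(posterior_convict, conviction_prob)
```

With this prior, the designer pushes the conviction probability from 0.3 (full disclosure) up to 0.6 by pooling some innocent states with the guilty one, exactly at the judge's indifference posterior.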
Categories: Publications

Evolution of Modeling of the Economics of Global Warming: Changes in the DICE Model, 1992-2017

March 13, 2017 - 7:02pm
William D. Nordhaus
Many areas of the natural and social sciences involve complex systems that link together multiple sectors. Integrated assessment models (IAMs) are approaches that integrate knowledge from two or more domains into a single framework, and these are particularly important for climate change. One of the earliest IAMs for climate change was the DICE/RICE family of models, first published in Nordhaus (1992), with the latest version in Nordhaus (2017, 2017a). A difficulty in assessing IAMs is the inability to use standard statistical tests because of the lack of a probabilistic structure. In the absence of statistical tests, the present study examines the extent of revisions of the DICE model over its quarter-century history. The study finds that the major revisions have come primarily from the economic aspects of the model, whereas the environmental changes have been much smaller. Particularly sharp revisions have occurred for global output, damages, and the social cost of carbon. These results indicate that the economic projections are the least precise parts of IAMs and deserve much greater study than has been the case up to now, especially careful studies of long-run economic growth (to 2100 and beyond).
Categories: Publications

Tribute to T. W. Anderson

March 13, 2017 - 3:43pm
Peter C. B. Phillips
Professor T.W. Anderson passed away on September 17, 2016 at the age of 98 years after an astonishing career that spanned more than seven decades. Standing at the nexus of the statistics and economics professions, Ted Anderson made enormous contributions to both disciplines, playing a significant role in the birth of modern econometrics with his work on structural estimation and testing in the Cowles Commission during the 1940s, and educating successive generations through his brilliant textbook expositions of time series and multivariate analysis. This article is a tribute to his many accomplishments.
Categories: Publications

John Denis Sargan at the London School of Economics

March 13, 2017 - 3:38pm
David F. Hendry and Peter C. B. Phillips
During his period at the LSE from the early 1960s to the mid 1980s, John Denis Sargan rose to international prominence and the LSE emerged as the world’s leading centre for econometrics. Within this context, we examine the life of Denis Sargan, describe his major research accomplishments, recount the work of his many doctoral students, and track this remarkable period that constitutes the Sargan era of econometrics at the LSE.
Categories: Publications

Econometric Measurement of Earth's Transient Climate Sensitivity

March 13, 2017 - 1:04pm
Peter C. B. Phillips, Thomas Leirvik and Trude Storelvmo
How sensitive is Earth’s climate to a given increase in atmospheric greenhouse gas (GHG) concentrations? This long-standing and fundamental question in climate science was recently analyzed by dynamic panel data methods using extensive spatiotemporal data of global surface temperatures, solar radiation, and GHG concentrations over the last half century to 2010 (Storelvmo et al., 2016). These methods revealed that atmospheric aerosol effects masked approximately one-third of the continental warming due to increasing GHG concentrations over this period, thereby implying greater climate sensitivity to GHGs than previously thought. The present study provides asymptotic theory justifying the use of these methods when there are stochastic process trends in both the global forcing variables, such as GHGs, and station-level trend effects from such sources as local aerosol pollutants. These asymptotics validate confidence interval construction for econometric measures of Earth’s transient climate sensitivity. The methods are applied to observational data and to data generated from three leading global climate models (GCMs) that are sampled spatio-temporally in the same way as the empirical observations. The findings indicate that estimates of transient climate sensitivity produced by these GCMs lie within empirically determined confidence limits but that the GCMs uniformly underestimate aerosol-induced dimming effects. The analysis shows the potential of econometric methods to calibrate GCM performance against observational data and to reveal the respective sensitivity parameters (GHG and non-GHG related) governing GCM temperature trends.
Categories: Publications

Research Design Meets Market Design: Using Centralized Assignment for Impact Evaluation

March 7, 2017 - 10:26am
Atila Abdulkadiroglu, Joshua D. Angrist, Yusuke Narita and Parag A. Pathak
A growing number of school districts use centralized assignment mechanisms to allocate school seats in a manner that reflects student preferences and school priorities. Many of these assignment schemes use lotteries to ration seats when schools are oversubscribed. The resulting random assignment opens the door to credible quasi-experimental research designs for the evaluation of school effectiveness. Yet the question of how best to separate the lottery-generated variation integral to such designs from non-random preferences and priorities remains open. This paper develops easily-implemented empirical strategies that fully exploit the random assignment embedded in a wide class of mechanisms, while also revealing why seats are randomized at one school but not another. We use these methods to evaluate charter schools in Denver, one of a growing number of districts that combine charter and traditional public schools in a unified assignment system. The resulting estimates show large achievement gains from charter school attendance. Our approach generates efficiency gains over ad hoc methods, such as those that focus on schools ranked first, while also identifying a more representative average causal effect. We also show how to use centralized assignment mechanisms to identify causal effects in models with multiple school sectors.
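To illustrate how lottery-generated variation yields assignment propensity scores, the sketch below runs a toy random serial dictatorship (simpler than the deferred-acceptance-type mechanisms the paper treats) and estimates each student's probability of being offered a seat by Monte Carlo. The preferences and capacities are hypothetical.

```python
import numpy as np

# Monte Carlo sketch of lottery-based assignment propensity scores.
# Mechanism here: random serial dictatorship, where students are ordered
# by lottery number and each takes a seat at their best remaining school.
rng = np.random.default_rng(0)
prefs = {                       # hypothetical student preference lists
    "s1": ["A", "B"], "s2": ["A", "B"],
    "s3": ["A", "B"], "s4": ["B", "A"],
}
capacity = {"A": 1, "B": 2}
students = list(prefs)

def assign(order):
    seats = dict(capacity)
    out = {}
    for s in order:
        for school in prefs[s]:
            if seats[school] > 0:
                seats[school] -= 1
                out[s] = school
                break
    return out

draws = 20000
counts = {s: {"A": 0, "B": 0} for s in students}
for _ in range(draws):
    order = list(students)
    rng.shuffle(order)          # the lottery
    for s, school in assign(order).items():
        counts[s][school] += 1

# Estimated probability each student is offered a seat at school A:
# the "propensity score" a lottery-based evaluation conditions on.
pscore_A = {s: counts[s]["A"] / draws for s in students}
print(pscore_A)
```

Here the three students who rank the oversubscribed school A first each face roughly a one-third offer probability, while the student who ranks B first is never randomized into A; conditioning on these scores is what makes the remaining variation usable for causal comparisons.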
Categories: Publications

Zone Pricing in Retail Oligopoly

March 2, 2017 - 1:16pm
Brian Adams and Kevin R. Williams
We quantify the welfare effects of zone pricing, or setting common prices across distinct markets, in retail oligopoly. Although monopolists can only increase profits by price discriminating, this need not be true when firms face competition. With novel data covering the retail home improvement industry, we find that Home Depot would benefit from finer pricing but that Lowe’s would prefer coarser pricing. The use of zone pricing softens competition in markets where firms compete, but it shields consumers from higher prices in markets where firms might otherwise exercise market power. Overall, zone pricing produces higher consumer surplus than finer price discrimination does.
Categories: Publications

The Scope of Sequential Screening with Ex-Post Participation Constraints

February 22, 2017 - 12:19pm
Dirk Bergemann, Francisco Castro and Gabriel Y. Weintraub
We study the classic sequential screening problem under ex-post participation constraints. Thus the seller is required to satisfy buyers’ ex-post participation constraints. A leading example is the online display advertising market, in which publishers frequently cannot use up-front fees and instead use transaction-contingent fees. We establish when the optimal selling mechanism is static (buyers are not screened) or dynamic (buyers are screened), and obtain a full characterization of such contracts. We begin by analyzing our model within the leading case of exponential distributions with two types. We provide a necessary and sufficient condition for the optimality of the static contract. If the means of the two types are sufficiently close, then no screening is optimal. If they are sufficiently far apart, then a dynamic contract becomes optimal. Importantly, the latter contract randomizes the allocation of the low-type buyer while giving a deterministic allocation to the high type. It also makes the low type worse off and the high type better off compared to the contract the seller would offer if he knew the buyer’s type. Our main result establishes a necessary and sufficient condition under which the static contract is optimal for general distributions. We show that when this condition fails, a dynamic contract that randomizes the allocation of the low-type buyer is optimal.
Categories: Publications

Global Collateral: How Financial Innovation Drives Capital Flows and Increases Financial Instability

February 22, 2017 - 12:15pm
Ana Fostel, John Geanakoplos and Gregory Phelan
We show that cross-border financial flows arise when countries differ in their abilities to use assets as collateral. Financial integration is a way of sharing scarce collateral. The ability of one country to leverage and tranche assets provides attractive financial contracts to investors in the other country, and general equilibrium effects on prices create opportunities for investors in the financially advanced country to invest abroad. Foreign demand for collateral and for collateral-backed financial promises increases the collateral value of domestic assets, and cheap foreign assets provide attractive returns to investors who do not demand collateral to issue promises. Gross global flows respond dynamically to fundamentals, exporting and amplifying financial volatility.
Categories: Publications

Inefficient Liquidity Provision

February 22, 2017 - 12:14pm
John Geanakoplos and Kieran James Walsh
We prove that in competitive market economies with no insurance for idiosyncratic risks, agents will always overinvest in illiquid long-term assets and underinvest in short-term liquid assets. We take as our setting the seminal model of Diamond and Dybvig (1983), who first posed the question in a tractable model. We reach such a simple conclusion under mild conditions because we stick to the basic competitive market framework, avoiding the banks and intermediaries that Diamond and Dybvig and others introduced.
Categories: Publications

Money and Status: How Best to Incentivize Work

February 22, 2017 - 12:13pm
Pradeep K. Dubey and John Geanakoplos
Status is greatly valued in the real world, yet it has not received much attention from economic theorists. We examine how the owner of a firm can best combine money and status to get her employees to work hard for the least total cost. We find that she should motivate workers of low skill mostly by status and high skill mostly by money. Moreover, she should do so by using a small number of titles and wage levels. This often results in star wages to the elite performers and, more generally, in wage jumps for small increases in productivity. By analogy, the governance of a society should pay special attention to the status concerns of ordinary citizens, which may often be accomplished by reinforcing suitable social norms.
Categories: Publications

Optimal Sup-Norm Rates and Uniform Inference on Nonlinear Functionals of Nonparametric IV Regression

February 15, 2017 - 10:01am
Xiaohong Chen and Timothy Christensen
This paper makes several important contributions to the literature on nonparametric instrumental variables (NPIV) estimation and inference on a structural function h0 and its functionals. First, we derive sup-norm convergence rates for computationally simple sieve NPIV (series 2SLS) estimators of h0 and its derivatives. Second, we derive a lower bound that describes the best possible (minimax) sup-norm rates of estimating h0 and its derivatives, and show that the sieve NPIV estimator can attain the minimax rates when h0 is approximated via a spline or wavelet sieve. Our optimal sup-norm rates surprisingly coincide with the optimal root-mean-squared rates for severely ill-posed problems, and are only a logarithmic factor slower than the optimal root-mean-squared rates for mildly ill-posed problems. Third, we use our sup-norm rates to establish the uniform Gaussian process strong approximations and the score bootstrap uniform confidence bands (UCBs) for collections of nonlinear functionals of h0 under primitive conditions, allowing for mildly and severely ill-posed problems. Fourth, as applications, we obtain the first asymptotic pointwise and uniform inference results for plug-in sieve t-statistics of exact consumer surplus (CS) and deadweight loss (DL) welfare functionals under low-level conditions when demand is estimated via sieve NPIV. Finally, our real-data application of UCBs for exact CS and DL functionals of gasoline demand reveals patterns of interest to empiricists and extends readily to other markets.
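A minimal numerical sketch of the sieve NPIV (series 2SLS) estimator the abstract studies, on a hypothetical simulated design with a polynomial sieve; the data-generating process and basis choices here are illustrative only.

```python
import numpy as np

# Sieve NPIV (series 2SLS) on simulated data with a polynomial sieve.
rng = np.random.default_rng(0)
n = 20000
z = rng.uniform(-1, 1, n)            # instrument
v = rng.normal(0, 0.3, n)            # first-stage error
x = z + v                            # endogenous regressor
eps = 0.8 * v + rng.normal(0, 0.3, n)  # structural error, correlated with x
y = 0.5 * x**2 + eps                 # true structural function h0(x) = 0.5 x^2

# Sieve bases: Psi approximates h0(x); B spans the instrument space
# (B is richer than Psi, as the theory requires).
Psi = np.column_stack([x**k for k in range(3)])   # 1, x, x^2
B = np.column_stack([z**k for k in range(5)])     # 1, z, ..., z^4

# Series 2SLS: project Psi on B, then regress y on the projection:
#   c_hat = (Psi' P_B Psi)^{-1} Psi' P_B y,  P_B = B (B'B)^{-1} B'.
PB = B @ np.linalg.solve(B.T @ B, B.T @ Psi)      # P_B Psi
c_hat = np.linalg.solve(PB.T @ Psi, PB.T @ y)
print(np.round(c_hat, 2))   # roughly [0, 0, 0.5]
```

A naive OLS fit of y on Psi would be biased here because eps is correlated with x; the instrument sieve restores consistency, which is the mechanism behind the estimator whose sup-norm rates the paper derives.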
Categories: Publications

A Note on Optimal Inference in the Linear IV Model

February 11, 2017 - 5:36pm
Donald W. K. Andrews, Vadim Marmer and Zhengfei Yu
This paper considers tests and confidence sets (CS’s) concerning the coefficient on the endogenous variable in the linear IV regression model with homoskedastic normal errors and one right-hand side endogenous variable. The paper derives a finite-sample lower bound function for the probability that a CS constructed using a two-sided invariant similar test has infinite length and shows numerically that the conditional likelihood ratio (CLR) CS of Moreira (2003) is not always "very close" to this lower bound function. This implies that the CLR test is not always very close to the two-sided asymptotically-efficient (AE) power envelope for invariant similar tests of Andrews, Moreira, and Stock (2006) (AMS). On the other hand, the paper establishes the finite-sample optimality of the CLR test when the correlation between the structural and reduced-form errors, or between the two reduced-form errors, goes to 1 or -1 and other parameters are held constant, where optimality means achievement of the two-sided AE power envelope of AMS. These results cover the full range of (non-zero) IV strength. The paper investigates in detail scenarios in which the CLR test is not on the two-sided AE power envelope of AMS. Also, the paper shows via theory and numerical work that the CLR test is close to having greatest average power, where the average is over a grid of concentration parameter values and over pairs of alternative hypothesis values of the parameter of interest, uniformly over pairs of alternative hypothesis values and uniformly over the correlation between the structural and reduced-form errors. The paper concludes that, although the CLR test is not always very close to the two-sided AE power envelope of AMS, CLR tests and CS’s have very good overall properties.
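For reference, the sketch below computes the CLR statistic for one endogenous regressor in terms of the quadratic forms of the standardized sufficient statistics S and T, following the usual expression from Moreira (2003); the numeric inputs are illustrative, and in practice the critical value is conditional on Q_T.

```python
import math

# CLR statistic of Moreira (2003) for one endogenous regressor,
# written in terms of the quadratic forms Q_S = S'S, Q_T = T'T,
# Q_ST = S'T of the standardized sufficient statistics S and T.
def clr_stat(q_s, q_t, q_st):
    disc = (q_s + q_t) ** 2 - 4.0 * (q_s * q_t - q_st**2)
    return 0.5 * (q_s - q_t + math.sqrt(disc))

# When S and T are orthogonal (q_st = 0), the statistic reduces
# to max(q_s - q_t, 0).
print(clr_stat(4.0, 1.0, 0.0))  # → 3.0
```

Conditioning inference on Q_T is what makes the test similar, and it is the finite-sample behavior of exactly this construction that the paper compares against the AMS power envelope.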
Categories: Publications