Publications

Abstract

This paper considers tests and confidence sets (CS’s) concerning the coefficient on the endogenous variable in the linear IV regression model with homoskedastic normal errors and one right-hand side endogenous variable. The paper derives a finite-sample lower bound function for the probability that a CS constructed using a two-sided invariant similar test has infinite length and shows numerically that the conditional likelihood ratio (CLR) CS of Moreira (2003) is not always “very close,” say .005 or less, to this lower bound function. This implies that the CLR test is not always very close to the two-sided asymptotically-efficient (AE) power envelope for invariant similar tests of Andrews, Moreira, and Stock (2006) (AMS).

On the other hand, the paper establishes the finite-sample optimality of the CLR test when the correlation between the structural and reduced-form errors, or between the two reduced-form errors, goes to 1 or -1 and other parameters are held constant, where optimality means achievement of the two-sided AE power envelope of AMS. These results cover the full range of (non-zero) IV strength.

The paper investigates in detail scenarios in which the CLR test is not on the two-sided AE power envelope of AMS. Also, theory and numerical results indicate that the CLR test is close to having greatest average power, where the average is over a grid of concentration parameter values and over pairs of alternative hypothesis values of the parameter of interest, uniformly over pairs of alternative hypothesis values and uniformly over the correlation between the structural and reduced-form errors. Here, “close” means .015 or less for k≤20, where k denotes the number of IV’s, and .025 or less for 0<k≤40. The paper concludes that, although the CLR test is not always very close to the two-sided AE power envelope of AMS, CLR tests and CS’s have very good overall properties.
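As a rough illustration of the object under study, the CLR statistic of Moreira (2003) and its conditional p-value can be sketched as follows. The statistic is the standard closed-form function of the quadratic forms QS = S'S, QT = T'T, and QST = S'T of the usual sufficient statistics (S, T), and under the null its distribution conditional on QT can be simulated by splitting a standard normal k-vector into its component along T and the orthogonal remainder. Function names and simulation sizes below are illustrative, not from the paper.

```python
import numpy as np

def clr_stat(qs, qt, qst):
    """Moreira's (2003) CLR statistic from QS = S'S, QT = T'T, QST = S'T.
    The discriminant equals (QS - QT)^2 + 4*QST^2 >= 0, so LR >= 0."""
    disc = (qs + qt) ** 2 - 4.0 * (qs * qt - qst ** 2)
    return 0.5 * (qs - qt + np.sqrt(disc))

def clr_pvalue(lr_obs, qt, k, n_sim=100_000, seed=0):
    """Simulate the null distribution of LR conditional on QT = qt.
    Under H0, S ~ N(0, I_k): its component along T is a standard
    normal Z, the orthogonal part contributes a chi2(k-1) draw, so
    QS = Z^2 + W and QST = sqrt(qt) * Z."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_sim)
    w = rng.chisquare(k - 1, n_sim) if k > 1 else np.zeros(n_sim)
    qs = z ** 2 + w
    qst = np.sqrt(qt) * z
    lr = clr_stat(qs, qt, qst)
    return np.mean(lr >= lr_obs)
```

When QST^2 = QS·QT (S and T perfectly aligned), the statistic collapses to LR = QS, a convenient sanity check on the formula.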

Abstract

The concept of relative convergence, which requires the ratio of two time series to converge to unity in the long run, explains convergent behavior when series share commonly divergent stochastic or deterministic trend components. Relative convergence of this type does not necessarily hold when series share common time decay patterns measured by evaporating rather than divergent trend behavior. To capture convergent behavior in panel data that do not involve stochastic or divergent deterministic trends, we introduce the notion of weak σ-convergence, whereby cross section variation in the panel decreases over time. The paper formalizes this concept and proposes a simple-to-implement linear trend regression test of the null of no σ-convergence. Asymptotic properties for the test are developed under general regularity conditions and various data generating processes. Simulations show that the test has good size control and discriminatory power. The method is applied to examine whether the idiosyncratic components of 90 disaggregate personal consumption expenditure (PCE) price index items σ-converge over time. We find strong evidence of weak σ-convergence in the period after 1992, which implies that cross sectional dependence has strengthened over the last two decades. In a second application, the method is used to test whether experimental data in ultimatum games converge over successive rounds, again finding evidence in favor of weak σ-convergence. A third application studies convergence and divergence in US state unemployment data over the period 2001-2016.
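A minimal sketch of the idea behind the linear trend regression test: compute the cross-sectional sample variance of the panel at each date, regress it on a linear trend, and look for a significantly negative slope. This is deliberately simplified; the paper's actual statistic and its limit theory involve more careful normalization and variance estimation than the naive OLS t-ratio used here, and the function name is illustrative.

```python
import numpy as np

def sigma_convergence_test(panel):
    """Trend-regression sketch of a weak sigma-convergence test.

    panel: (T, N) array; row t holds the cross section at time t.
    Regress the cross-sectional sample variance K_t on an intercept
    and a linear trend. A significantly negative slope indicates
    shrinking cross-section variation (weak sigma-convergence); the
    null of no sigma-convergence corresponds to a zero slope.
    Returns (slope, naive OLS t-ratio of the slope).
    """
    T, _ = panel.shape
    k = panel.var(axis=1, ddof=1)            # cross-sectional variance at each t
    t = np.arange(1.0, T + 1.0)
    X = np.column_stack([np.ones(T), t])
    beta, *_ = np.linalg.lstsq(X, k, rcond=None)
    resid = k - X @ beta
    s2 = resid @ resid / (T - 2)             # residual variance
    var_beta = s2 * np.linalg.inv(X.T @ X)   # homoskedastic OLS covariance
    return beta[1], beta[1] / np.sqrt(var_beta[1, 1])
```

On a panel whose cross-sectional dispersion shrinks over time, the slope estimate is negative with a large negative t-ratio; on a panel with stable dispersion, the slope hovers around zero.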

Abstract

This paper considers estimation and inference concerning the autoregressive coefficient (ρ) in a panel autoregression for which the degree of persistence in the time dimension is unknown. The main objective is to construct confidence intervals for ρ that are asymptotically valid, having asymptotic coverage probability at least that of the nominal level uniformly over the parameter space. It is shown that a properly normalized statistic based on the Anderson-Hsiao IV procedure, which we call the M statistic, is uniformly convergent and can be inverted to obtain asymptotically valid interval estimates. In the unit root case confidence intervals based on this procedure are unsatisfactorily wide and uninformative. To sharpen the intervals a new procedure is developed using information from unit root pretests to select alternative confidence intervals. Two sequential tests are used to assess how close ρ is to unity and to correspondingly tailor intervals near the unit root region. When ρ is close to unity, the width of these intervals shrinks to zero at a faster rate than that of the confidence interval based on the M statistic. Only when both tests reject the unit root hypothesis does the construction revert to the M statistic intervals, whose width has the optimal N^{-1/2}T^{-1/2} rate of shrinkage when the underlying process is stable. The asymptotic properties of this pretest-based procedure show that it produces confidence intervals with at least the prescribed coverage probability in large samples. Simulations confirm that the proposed interval estimation methods perform well in finite samples and are easy to implement in practice. A supplement to the paper provides an extensive set of new results on the asymptotic behavior of panel IV estimators in weak instrument settings.
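The Anderson-Hsiao IV estimator that the M statistic builds on admits a simple ratio form: first-difference the panel autoregression to remove fixed effects, then instrument the lagged difference with the twice-lagged level. The sketch below shows this standard estimator only; the normalization behind the M statistic and the pretest-based interval construction in the paper are not reproduced here.

```python
import numpy as np

def anderson_hsiao(y):
    """Anderson-Hsiao IV estimate of rho in y_it = a_i + rho*y_{i,t-1} + u_it.

    y: (N, T) array. Differencing removes the fixed effect a_i:
        dy_it = rho * dy_{i,t-1} + du_it,
    and y_{i,t-2} is a valid instrument for dy_{i,t-1}, giving
        rho_hat = sum(z * dy_t) / sum(z * dy_{t-1}),  z = y_{i,t-2}.
    """
    dy = np.diff(y, axis=1)      # (N, T-1); dy[:, s] = y_{s+1} - y_s
    dy_t = dy[:, 1:].ravel()     # dependent variable: dy_it, t = 2..T-1
    dy_lag = dy[:, :-1].ravel()  # endogenous regressor: dy_{i,t-1}
    z = y[:, :-2].ravel()        # instrument: y_{i,t-2}
    return (z @ dy_t) / (z @ dy_lag)
```

In a stable panel (|rho| < 1) with many cross-section units the estimate concentrates near the true rho; near the unit root the instrument weakens, which is the weak-instrument problem the paper's interval construction is designed to handle.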

Abstract

Since the 1972 U.S. Clean Water Act, government and industry have invested over $1 trillion to abate water pollution, or $100 per person-year. Over half of U.S. stream and river miles, however, still violate pollution standards. We use the most comprehensive set of files ever compiled on water pollution and its determinants, including 50 million pollution readings from 170,000 monitoring sites, to study water pollution’s trends, causes, and welfare consequences. We have three main findings. First, water pollution concentrations have fallen substantially since 1972, though they were declining at faster rates before then. Second, the Clean Water Act’s grants to municipal wastewater treatment plants caused some of these declines. Third, the grants’ estimated effects on housing values are generally smaller than the grants’ costs.

Abstract

This address considers the epidemiology of narratives relevant to economic fluctuations. The human brain has always been highly tuned towards narratives, whether factual or not, to justify ongoing actions, even such basic actions as spending and investing. Stories motivate and connect activities to deeply felt values and needs. Narratives “go viral” and spread far, even worldwide, with economic impact. The 1920-21 Depression, the Great Depression of the 1930s, the so-called “Great Recession” of 2007-9 and the contentious political-economic situation of today, are considered as the results of the popular narratives of their respective times. Though these narratives are deeply human phenomena that are difficult to study in a scientific manner, quantitative analysis may help us gain a better understanding of these epidemics in the future.

Abstract

We study a model of collective reputation and use it to analyze the benefit of collective brands. Consumers form beliefs about the quality of an experience good that is produced by one firm that is part of a collective brand. Consumers’ limited ability to distinguish among firms in the collective and to monitor firms’ investment decisions creates incentives to free-ride on other firms’ investment efforts. Nevertheless, we show that collective brands induce stronger incentives to invest in quality than individual brands under two types of circumstances: if the main concern is with quality control and the baseline reputation of the collective is low, or if the main concern is with the acquisition of specialized knowledge and the baseline reputation of the collective is high. We also contrast the socially optimal information structure with the profit maximizing choice of branding if branding is endogenous. Our results can be applied to country-of-origin, agricultural appellation, and other collective brands.

Abstract

This essay is the third of three. The first is nontechnical and in part autobiographical, describing the evolution of my approach to developing a microeconomic theory of money and financial institutions. The second essay was devoted to a more formal sketch of a closed economic exchange system with no externalities beyond money and markets. This essay builds on the existence of monetary exchange but adds context and an active government, with nonsymmetric information and many externalities, indicating that the views of Keynes, Hayek and Schumpeter are all consistent with the next stages of complexity, as the logic requires many different arrays of institutions to provide the necessary economic functions and adjust to the variety of socio-economic contexts.

Abstract

Many centralized school admissions systems use lotteries to ration limited seats at oversubscribed schools. The resulting random assignment is used by empirical researchers to identify the effect of entering a school on outcomes like test scores. I first find that the two most popular empirical research designs may not successfully extract a random assignment of applicants to schools. When do the research designs overcome this problem? I show the following main results for a class of data-generating mechanisms containing those used in practice: One research design extracts a random assignment under a mechanism if and practically only if the mechanism is strategy-proof for schools. In contrast, the other research design does not necessarily extract a random assignment under any mechanism.