We propose a new adaptive hypothesis test for inequality (e.g., monotonicity, convexity) and equality (e.g., parametric, semiparametric) restrictions on a structural function in a nonparametric instrumental variables (NPIV) model. Our test statistic is based on a modified leave-one-out sample analog of a quadratic distance between the restricted and unrestricted sieve two-stage least squares estimators. We provide computationally simple, data-driven choices of sieve tuning parameters and Bonferroni-adjusted chi-squared critical values. Our test adapts to the unknown smoothness of alternative functions in the presence of an unknown degree of endogeneity and unknown strength of the instruments. It attains the adaptive minimax rate of testing in L2: the sum of the supremum of type I error over the composite null and the supremum of type II error over nonparametric alternatives cannot be improved upon by any other test for NPIV models of unknown regularity. Confidence sets in L2 are obtained by inverting the adaptive test. Simulations confirm that, across different strengths of instruments and sample sizes, our adaptive test controls size and its finite-sample power greatly exceeds that of existing non-adaptive tests for monotonicity and parametric restrictions in NPIV models. Empirical applications to testing shape restrictions of differentiated-products demand and of Engel curves are presented.
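To fix ideas, here is a minimal sketch of the quadratic-distance idea (our notation; the paper's leave-one-out modification, normalization, and studentization are omitted). With $\hat{h}_J$ the unrestricted sieve 2SLS estimator on a sieve of dimension $J$ and $\hat{h}_J^{R}$ its restricted counterpart, the test is built on a sample analog of
\[
D_J \;=\; \big\| \hat{h}_J^{R} - \hat{h}_J \big\|_{L^2}^2 \;=\; \int \big( \hat{h}_J^{R}(x) - \hat{h}_J(x) \big)^2 \, dx,
\]
rejecting when the statistic exceeds a Bonferroni-adjusted chi-squared critical value, with $J$ chosen in a data-driven way over a grid of sieve dimensions.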
A signal is privacy-preserving with respect to a collection of privacy sets if the posterior probability assigned to every privacy set remains unchanged conditional on any signal realization. We characterize the privacy-preserving signals for an arbitrary state space and arbitrary privacy sets. A signal is privacy-preserving if and only if it is a garbling of a reordered quantile signal. Furthermore, the distributions of posterior means induced by privacy-preserving signals are exactly the mean-preserving contractions of the distribution induced by the quantile signal. We discuss the economic implications of our characterization for statistical discrimination, the revelation of sensitive information in auctions, and price discrimination.
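One construction that preserves privacy in this sense is the within-set quantile (our notation, assuming the privacy sets partition the state space and the prior is atomless; the paper's exact definition of the quantile signal may differ):
\[
s(\theta) \;=\; F_{P(\theta)}(\theta),
\]
where $P(\theta)$ is the privacy set containing $\theta$ and $F_P$ is the prior CDF conditional on $P$. Since $s$ is uniformly distributed on $[0,1]$ conditional on every privacy set, no signal realization moves the posterior probability of any privacy set away from its prior.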
Two information structures are said to be close if, with high probability, there is approximate common knowledge that interim beliefs are close under the two information structures. We define an “almost common knowledge topology” reflecting this notion of closeness. We show that it is the coarsest topology generating continuity of equilibrium outcomes. An information structure is said to be simple if each player has a finite set of types and each type has a distinct first-order belief about payoff states. We show that simple information structures are dense in the almost common knowledge topology and thus it is without loss to restrict attention to simple information structures in information design problems.
We propose a theory of gradualism in the implementation of good policies, suitable for environments featuring time-inconsistency problems. We downplay the role of the initial period by allowing agents both to wait for future agents to start equilibrium play and to restart the equilibrium by ignoring past history. The allocation gradually transitions toward one that weighs both short- and long-term concerns, stopping short of the Ramsey outcome but greatly improving upon Markovian equilibria. We use the theory to account for the slow emergence of climate policies and the gradual reduction of global tariff rates.
This paper develops a welfare accounting decomposition that identifies and quantifies the origins of welfare gains and losses in general economies with heterogeneous individuals and disaggregated production. The decomposition, based exclusively on preferences and technologies, first separates efficiency from redistribution considerations. Efficiency comprises i) exchange efficiency, which captures allocative efficiency gains from reallocating consumption and factor supply across individuals, and ii) production efficiency, which captures allocative efficiency gains from adjusting intermediate inputs and factors, as well as technical efficiency gains from primitive changes in technologies and in endowments of goods and factors. Leveraging the decomposition, we provide a new characterization of efficiency conditions in disaggregated production economies with heterogeneous individuals that carefully accounts for non-interior solutions, extending classic efficiency results in Lange (1942) and Mas-Colell et al. (1995). In competitive economies, prices (and wedges) are directly informative about the welfare-relevant statistics that shape the welfare accounting decomposition, which allows us to characterize a generalized Hulten's theorem and a new converse Hulten's theorem. We present several minimal examples and four applications to workhorse models in macroeconomics: i) the Armington model, ii) the Diamond-Mortensen-Pissarides model, iii) the Hsieh-Klenow model, and iv) the New Keynesian model.
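For context, the classic benchmark that the generalized and converse results extend is Hulten's theorem: in an efficient competitive economy, the first-order aggregate effect of a productivity shock to producer $i$ equals its Domar weight,
\[
\frac{d \ln \mathrm{TFP}}{d \ln A_i} \;=\; \lambda_i \;\equiv\; \frac{p_i y_i}{\mathrm{GDP}},
\]
so that aggregate TFP growth is, to first order, the Domar-weighted sum of producer-level productivity changes.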
This note shows that the mixed normal asymptotic limit of the trend IV estimator with a fixed number of deterministic instruments (fTIV) holds in both singular (multicointegrated) and nonsingular cointegration systems, thereby relaxing the exogeneity condition in Phillips and Kheifets (2024, Theorem 1(ii)). The mixed normality of the limiting distribution of fTIV allows for asymptotically pivotal F tests about the cointegration parameters and for simple efficiency comparisons of the estimators across different numbers K of instruments, as well as comparisons with the trend IV estimator when K → ∞ with the sample size.
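As a rough illustration of the estimator in question (our notation for the just-identified case; the note's exact construction may differ): in a cointegrating regression $y_t = \beta' x_t + u_t$, the trend IV estimator instruments $x_t$ with $K$ deterministic trend functions $z_t = (\varphi_1(t/n), \ldots, \varphi_K(t/n))'$, giving
\[
\hat{\beta}_{\mathrm{fTIV}} \;=\; \Big( \sum_{t=1}^{n} z_t x_t' \Big)^{-1} \sum_{t=1}^{n} z_t y_t,
\]
whose mixed normal limit is what delivers asymptotically pivotal F tests.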
Two mechanisms have been proposed to explain sex selection in India: son preference, in which parents desire a male heir, and daughter aversion, in which dowry payments make parents worse off with girls. Our model incorporates both mechanisms, providing micro-foundations, based on the organization of the marriage institution, for daughter aversion. Marital matching, sex selection, and dowries are jointly determined in the model, whose implications are tested on a representative sample of rural households. Simulations of the model indicate that existing policies targeting daughter aversion might exacerbate the problem, while also identifying other policies that could be effective.
In GMM estimation, it is well known that if the moment dimension grows with the sample size, the asymptotics of GMM differ from the standard finite-dimensional case. The present work examines the asymptotic properties of infinite-dimensional GMM estimation when the weight matrix is formed by inverting Brownian motion or Brownian bridge covariance kernels. These kernels arise in econometric work such as minimum Cramér-von Mises distance estimation when testing distributional specification. The properties of GMM estimation are studied in different environments where the moment conditions converge to a smooth Gaussian or a non-differentiable Gaussian process. Conditions are also developed for testing the validity of the moment conditions by means of a suitably constructed J-statistic. When these conditions fail, we propose an alternative test, the U-test. As an empirical application of these infinite-dimensional GMM procedures, the evolution of cohort labor income inequality indices is studied using the Continuous Work History Sample database. The findings show that labor income inequality indices are maximized in early career years, implying that economic policies to reduce income inequality should be more effective when designed for workers at an early stage of their career cycles.
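A minimal sketch of the estimation problem (our notation; the operator inversion typically requires regularization, which we omit): with a continuum of moment conditions $g_n(r; \theta)$ indexed by $r \in [0,1]$, the GMM criterion weights by the inverse of a covariance kernel $K$,
\[
\hat{\theta} \;=\; \arg\min_{\theta} \int_0^1 \!\! \int_0^1 g_n(r;\theta) \, K^{-1}(r,s) \, g_n(s;\theta) \, dr \, ds,
\]
where $K(r,s) = \min(r,s)$ for Brownian motion and $K(r,s) = \min(r,s) - rs$ for the Brownian bridge, and $K^{-1}$ denotes inversion of the associated integral operator.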
There are many economic environments in which an object is offered sequentially to prospective buyers. It is often observed that once the object for sale is turned down by one or more agents, those that follow do the same. One explanation for this phenomenon is that agents making choices further down the line rationally ignore their own assessment of the object and herd behind their predecessors. Our research extends the canonical herding model by allowing agents to differ in their ability to assess the quality of the offered object. We develop novel tests of herding based on this ability heterogeneity and examine its efficiency consequences, applying the analysis to organ transplantation in the U.K. We find that herding is common but that the information lost due to herding does not substantially increase false discards of good organs or false acceptances of bad organs. Our counterfactual analysis indicates that this is due (in part) to the high degree of heterogeneity in ability across transplant centers. In other settings, such as the U.S., where organ transplantation is organized very differently and the ability distribution will not be the same, the inefficiencies due to herding might well be substantial.
This paper considers nonparametric estimation and inference in first-order autoregressive (AR(1)) models with deterministically time-varying parameters. A key feature of the proposed approach is to allow for time-varying stationarity in some time periods, time-varying nonstationarity (i.e., unit root or local-to-unit-root behavior) in other periods, and smooth transitions between the two. The estimation of the AR parameter at any time point is based on a local least squares regression method, where the relevant initial condition is endogenous. We obtain limit distributions for the AR parameter estimator and t-statistic at a given point τ in time when the parameter exhibits unit root, local-to-unity, or stationary/stationary-like behavior at time τ. These results are used to construct confidence intervals and median-unbiased interval estimators for the AR parameter at any specified point in time. The confidence intervals have correct asymptotic coverage probabilities, with coverage holding uniformly over stationary and nonstationary behavior of the observations.
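As a concrete, much-simplified illustration of the local least squares idea, here is a minimal sketch in Python (the function and variable names are ours; the paper's exact weighting, centering, and treatment of the initial condition may differ):

```python
import numpy as np

def local_ar1_estimate(y, tau, h):
    """Kernel-weighted local least squares estimate of the AR(1)
    coefficient at fractional time tau in (0, 1), with bandwidth h.

    A stylized sketch of the local regression idea only; it omits the
    paper's refinements (e.g., handling of the endogenous initial
    condition and any bias corrections)."""
    n = len(y)
    t = np.arange(1, n) / n                                # fractional time of each observation
    u = (t - tau) / h                                      # scaled distance from tau
    w = np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)   # Epanechnikov kernel weights
    y_lag, y_cur = y[:-1], y[1:]
    return np.sum(w * y_lag * y_cur) / np.sum(w * y_lag**2)

# Example: a series whose AR coefficient drifts from 0.5 (stationary)
# to 1.0 (unit root), mimicking a smooth transition to nonstationarity.
rng = np.random.default_rng(0)
n = 2000
rho_path = np.linspace(0.5, 1.0, n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = rho_path[t] * y[t - 1] + rng.standard_normal()

print(local_ar1_estimate(y, tau=0.2, h=0.1))  # close to 0.6
print(local_ar1_estimate(y, tau=0.9, h=0.1))  # close to 0.95
```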