Publications

Discussion Paper
Abstract

We compare the revenue of the optimal third-degree price discrimination policy against a uniform pricing policy. A uniform pricing policy offers the same price to all segments of the market. Our main result establishes that for a broad class of third-degree price discrimination problems with concave revenue functions and common support, a uniform price is guaranteed to achieve one-half of the optimal monopoly profits. This revenue bound holds for any number of segments and for whatever prices the seller would use were he to engage in third-degree price discrimination. We further establish that these conditions are tight: weakening either common support or concavity leads to arbitrarily poor revenue comparisons.
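To make the bound concrete, here is a minimal numerical sketch (the two-segment market, demand curves, and weights below are invented for illustration, not taken from the paper). It compares the optimal discriminatory revenue against the best uniform-price revenue over a price grid, for two segments with concave revenue on the common support (0, 1]:

```python
import numpy as np

# Two illustrative market segments with values on a common support (0, 1].
# Demand in each segment is the survival function of the value distribution;
# both choices below give concave revenue curves p * D(p).
prices = np.linspace(0.01, 1.0, 1000)
demands = [
    1.0 - prices,            # values ~ Uniform(0, 1)
    (1.0 - prices) ** 0.5,   # values ~ Beta-type distribution on (0, 1)
]
weights = [0.5, 0.5]         # segment sizes

# Optimal third-degree price discrimination: best price in each segment.
discrim = sum(w * np.max(prices * d) for w, d in zip(weights, demands))

# Best uniform price: one price maximizing total revenue across segments.
total_rev = sum(w * prices * d for w, d in zip(weights, demands))
uniform = np.max(total_rev)

print(f"discriminatory revenue: {discrim:.4f}")
print(f"uniform-price revenue:  {uniform:.4f}")
print(f"ratio: {uniform / discrim:.4f}  (the paper's bound guarantees >= 0.5)")
```

For this particular pair of segments the ratio is far above one-half; the paper's result is a worst-case guarantee over the whole class of concave-revenue, common-support problems.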

Discussion Paper
Abstract

Beliefs are intuitive if they rely on associative memory, which can be described as a network of associations between events. A belief-theoretic characterization of the model is provided, its uniqueness properties are established, and the intersection with the Bayesian model is characterized. The formation of intuitive beliefs is modelled after machine learning, whereby the network is shaped by past experience via minimization of the difference from an objective probability distribution. The model is shown to accommodate correlation misperception, the conjunction fallacy, and base-rate neglect/conservatism, among other biases.
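As a loose illustration of the experience-driven formation idea only (this is not the paper's model; the events, learning rule, and network parameterization below are all invented for the sketch), one can let association weights track an objective joint distribution through stochastic updates and then read off conditional beliefs by normalizing associations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Objective joint distribution over four events (arbitrary illustrative numbers).
P = np.array([[0.20, 0.05, 0.05, 0.02],
              [0.05, 0.15, 0.03, 0.02],
              [0.05, 0.03, 0.10, 0.05],
              [0.02, 0.02, 0.05, 0.11]])

# Associative network: W[i, j] is the association strength between events i
# and j, shaped by past experience. Here "experience" is a stream of event
# pairs drawn from P, and learning is stochastic gradient descent on the
# squared difference from the observed co-occurrences, so W drifts toward P.
W = np.full((4, 4), 0.0625)
lr = 0.01
flat = P.ravel()
for _ in range(20000):
    k = rng.choice(16, p=flat)            # one experienced co-occurrence
    target = np.zeros(16)
    target[k] = 1.0
    W += lr * (target.reshape(4, 4) - W)  # SGD step: W -> E[target] = P

# Intuitive conditional belief in j given i: associations out of i, normalized.
# Each row is a coherent conditional, but with a noisily learned W the rows
# need not be consistent with any single Bayesian prior.
belief = W / W.sum(axis=1, keepdims=True)
print(np.round(belief, 3))
```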

Discussion Paper
Abstract

This paper discusses some macro links that are missing from trade models. A multicountry macroeconometric model is used to analyze the effects on the United States of increased import competition from China, an experiment that is common in the recent trade literature. In the macro story, a fall in Chinese export prices is stimulative: domestic prices fall, which increases real wage rates and real wealth, which in turn increases household expenditures. In addition, the Fed may lower the interest rate in response to the lower prices, which is also stimulative. Trade models do not have these channels, so they likely overestimate the negative effects or underestimate the positive effects on total output and employment from increased Chinese import competition. The aggregate demand channels they lack are unlikely to be second order.

Discussion Paper
Abstract

The Hodrick-Prescott (HP) filter is one of the most widely used econometric methods in applied macroeconomic research. The technique is nonparametric and seeks to decompose a time series into a trend and a cyclical component unaided by economic theory or prior trend specification. Like all nonparametric methods, the HP filter depends critically on a tuning parameter that controls the degree of smoothing. Yet in contrast to modern nonparametric methods and applied work with these procedures, empirical practice with the HP filter almost universally relies on standard settings for the tuning parameter that have been suggested largely by experimentation with macroeconomic data and heuristic reasoning about the form of economic cycles and trends. As recent research (Phillips and Jin, 2015) has shown, standard settings may not be adequate in removing trends, particularly stochastic trends, in economic data. This paper proposes an easy-to-implement practical procedure of iterating the HP smoother that is intended to make the filter a smarter smoothing device for trend estimation and trend elimination. We call this iterated HP technique the boosted HP filter in view of its connection to L2-boosting in machine learning. The paper develops limit theory to show that the boosted HP (bHP) filter asymptotically recovers trend mechanisms that involve unit root processes, deterministic polynomial drifts, and polynomial drifts with structural breaks, thereby covering the most common trends that appear in macroeconomic data and current modeling methodology. In doing so, the boosted filter provides a new mechanism for consistently estimating multiple structural breaks even without knowledge of the number of such breaks. A stopping criterion is used to automate the iterative HP algorithm, making it a data-determined method that is ready for modern data-rich environments in economic research. The methodology is illustrated using three real data examples that highlight the differences between simple HP filtering, the data-determined boosted filter, and an alternative autoregressive approach. These examples show that the bHP filter is helpful in analyzing a large collection of heterogeneous macroeconomic time series that manifest various degrees of persistence, trend behavior, and volatility.
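A minimal sketch of the iteration itself, assuming the standard quarterly setting lambda = 1600 and a BIC-type stopping rule written in the spirit of the paper's data-determined criterion (the exact criterion and tuning details are in the paper). The key identity is that the cycle after m passes of the HP smoother S is (I - S)^m y:

```python
import numpy as np

def hp_smoother(n, lam=1600.0):
    """HP smoother matrix S with trend = S @ y, where S = (I + lam*D'D)^{-1}
    and D is the second-difference operator."""
    D = np.zeros((n - 2, n))
    for t in range(n - 2):
        D[t, t:t + 3] = [1.0, -2.0, 1.0]
    return np.linalg.inv(np.eye(n) + lam * D.T @ D)

def boosted_hp(y, lam=1600.0, max_iter=50):
    """Boosted HP filter: re-apply the HP smoother to the residual cycle,
    so the cycle after m passes is (I - S)^m y. Iteration stops via a
    BIC-type rule (a sketch of the paper's data-determined criterion)."""
    n = len(y)
    S = hp_smoother(n, lam)
    M = np.eye(n) - S                   # one boosting step on the cycle
    Mm, best = np.eye(n), None
    for m in range(1, max_iter + 1):
        Mm = M @ Mm                     # Mm = (I - S)^m
        c = Mm @ y                      # cycle estimate at iteration m
        dof = np.trace(np.eye(n) - Mm)  # effective d.o.f. of the smoother
        bic = np.log(c @ c / n) + dof * np.log(n) / n
        if best is not None and bic > best[0]:
            break                       # criterion deteriorated: stop
        best = (bic, m, c)
    bic, m, c = best
    return y - c, c, m                  # trend, cycle, iterations used

# Toy usage: a random walk plus noise; the bHP trend tracks the unit root.
rng = np.random.default_rng(1)
y = np.cumsum(rng.normal(size=200)) + rng.normal(scale=0.5, size=200)
trend, cycle, m = boosted_hp(y)
print(f"stopped after {m} iteration(s)")
```

Setting max_iter = 1 recovers the simple HP filter, which makes the comparison between plain and boosted filtering in the paper's empirical examples easy to reproduce in this toy setup.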

Discussion Paper
Abstract

This paper develops an asymptotic theory for nonlinear cointegrating power function regression. The framework extends earlier work on the deterministic trend case and allows for both endogeneity and heteroskedasticity, which makes the models and inferential methods relevant to many empirical economic and financial applications, including predictive regression. Accompanying the asymptotic theory of nonlinear regression, the paper establishes some new results on weak convergence to stochastic integrals that go beyond the usual semi-martingale structure and considerably extend existing limit theory, complementing other recent findings on stochastic integral asymptotics. The paper also provides a general framework for extremum estimation limit theory that encompasses stochastically nonstationary time series and should be of wide applicability. 
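For a feel for the model class, here is a toy simulation (all parameter values invented; this is plain nonlinear least squares as a stand-in, not the paper's inferential methods) with an I(1) regressor and errors correlated with its increments, which builds in the endogeneity the theory allows for:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(2)
n = 2000

# Nonlinear cointegrating power regression: y_t = beta * |x_t|^kappa + u_t,
# with x_t a unit-root (I(1)) regressor and u_t correlated with the
# increments of x_t (endogeneity).
eps = rng.normal(size=n)
x = np.cumsum(eps)                            # random-walk regressor
u = 0.5 * eps + rng.normal(size=n)            # errors correlated with dx
beta, kappa = 2.0, 0.75
y = beta * np.abs(x) ** kappa + u

def power_fn(x, b, k):
    """Power regression function b * |x|^k."""
    return b * np.abs(x) ** k

# Nonlinear least squares point estimates of (beta, kappa); illustration only.
(b_hat, k_hat), _ = curve_fit(power_fn, x, y, p0=[1.0, 0.5])
print(f"beta_hat = {b_hat:.3f} (true 2.0), kappa_hat = {k_hat:.3f} (true 0.75)")
```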

Discussion Paper
Abstract

Multicointegration is traditionally defined as a particular long run relationship among variables in a parametric vector autoregressive model that introduces links between these variables and partial sums of the equilibrium errors. This paper departs from the parametric model, using a semiparametric formulation that reveals the explicit role that singularity of the long run conditional covariance matrix plays in determining multicointegration. The semiparametric framework has the advantage that short run dynamics do not need to be modeled and estimation by standard techniques such as fully modified least squares (FM-OLS) on the original I(1) system is straightforward. The paper derives FM-OLS limit theory in the multicointegrated setting, showing how faster rates of convergence are achieved in the direction of singularity and that the limit distribution depends on the distribution of the conditional one-sided long run covariance estimator used in FM-OLS estimation. Wald tests of restrictions on the regression coefficients have nonstandard limit theory, which depends on nuisance parameters in general. The usual tests are shown to be conservative when the restrictions are isolated to the directions of singularity and, under certain conditions, are invariant to singularity otherwise. Simulations show that approximations derived in the paper work well in finite samples. We illustrate our findings by analyzing the fiscal sustainability of the US government over the post-war period.
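A small simulation sketch may help fix the definition (the data-generating process and parameter values are invented for illustration): the equilibrium errors are stationary, yet their partial sums are I(1) and cointegrate with the regressor, which is the source of the long run covariance singularity the paper exploits:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1000

# Toy multicointegrated system: x_t is I(1), y_t = beta*x_t + u_t with u_t
# I(0). Choosing u_t = delta*(x_t - x_{t-1}) + (v_t - v_{t-1}) with v_t
# stationary makes the partial sums of u_t cointegrate with x_t, since
# sum_{s<=t} u_s = delta*x_t + v_t.
beta, delta = 1.0, 0.3
dx = rng.normal(size=n)
x = np.cumsum(dx)
v = rng.normal(scale=0.5, size=n)
u = delta * dx + np.diff(v, prepend=0.0)   # stationary equilibrium errors
y = beta * x + u

U = np.cumsum(y - beta * x)                # cumulated equilibrium errors
# Multicointegration check: U_t - delta*x_t is stationary (it equals v_t here),
# even though U_t itself is I(1).
resid = U - delta * x
print(f"std of U - delta*x: {resid.std():.3f} vs std of U: {U.std():.3f}")
```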

Discussion Paper
Abstract

We propose three new methods of inference for the threshold point in endogenous threshold regression and two specification tests designed to assess the presence of endogeneity and threshold effects without necessarily relying on instrumentation of the covariates. The first inferential method is a parametric two-stage least squares method and is suitable when instruments are available. The second and third methods are based on smoothing the objective function of the integrated difference kernel estimator in different ways; these methods do not require instrumentation. All three methods are applicable irrespective of endogeneity of the threshold variable. The two specification tests are constructed using a score-type principle. The threshold effect test extends conventional parametric structural change tests to the nonparametric case. A wild bootstrap procedure is suggested to deliver finite sample critical values for both tests. Simulations show good finite sample performance of these procedures, and the methods provide flexibility in testing and inference for practitioners working with threshold models.
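A minimal sketch of the flavor of these procedures (not the paper's estimators or tests: the one-regressor design, grid, and statistics below are simplified illustrations) showing least-squares threshold estimation by grid search and a wild-bootstrap sup-F test for a threshold effect:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 400

# Toy threshold regression: the slope on x shifts when q crosses gamma.
q = rng.normal(size=n)                      # threshold variable
x = 0.5 * q + rng.normal(size=n)            # covariate, correlated with q
gamma0 = 0.0
y = 1.0 * x * (q <= gamma0) + 2.0 * x * (q > gamma0) + rng.normal(size=n)

def ssr_split(y, x, q, g):
    """Sum of squared residuals from separate OLS fits on each regime."""
    ssr = 0.0
    for mask in (q <= g, q > g):
        b = (x[mask] @ y[mask]) / (x[mask] @ x[mask])
        ssr += np.sum((y[mask] - b * x[mask]) ** 2)
    return ssr

# Least-squares threshold estimation: grid search over interior quantiles of q.
grid = np.quantile(q, np.linspace(0.15, 0.85, 200))
gamma_hat = grid[np.argmin([ssr_split(y, x, q, g) for g in grid])]

def sup_f(y, x, q, grid):
    """sup-F statistic comparing the linear null with the best threshold split."""
    b0 = (x @ y) / (x @ x)
    ssr0 = np.sum((y - b0 * x) ** 2)
    ssr1 = min(ssr_split(y, x, q, g) for g in grid)
    return (ssr0 - ssr1) / (ssr1 / (n - 2))

# Wild bootstrap: impose the linear null, perturb its residuals with
# Rademacher draws, and compare the observed statistic with the
# bootstrap distribution.
stat = sup_f(y, x, q, grid)
b0 = (x @ y) / (x @ x)
resid = y - b0 * x
boot = [sup_f(b0 * x + resid * rng.choice([-1.0, 1.0], size=n), x, q, grid)
        for _ in range(199)]
pval = np.mean([b >= stat for b in boot])
print(f"gamma_hat = {gamma_hat:.3f} (true 0.0), sup-F = {stat:.1f}, p = {pval:.3f}")
```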