
Publications


Discussion Paper
Abstract

We consider a nonlinear pricing environment with private information. We provide profit guarantees (and associated mechanisms) that the seller can achieve across all possible distributions of the buyers' willingness to pay. With a constant elasticity cost function, constant markup pricing provides the optimal revenue guarantee across all possible distributions of willingness to pay, and the lower bound is attained under a Pareto distribution. We characterize how profits and consumer surplus vary with the distribution of values and show that Pareto distributions are extremal. We also provide a revenue guarantee for general cost functions. We establish equivalent results for optimal procurement policies that support maximal surplus guarantees for the buyer given all possible cost distributions of the sellers.
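
One way to make the objects in this abstract concrete (notation ours, not necessarily the paper's) is a constant-elasticity cost with a proportional markup, together with a Pareto distribution of values as the extremal case:

    C(q) = \frac{\theta}{1+\gamma}\, q^{1+\gamma}, \qquad P(q) = (1+m)\, C(q),
    \qquad F(v) = 1 - (v/\underline{v})^{-\alpha}, \quad v \ge \underline{v},

where m is the constant markup and \theta, \gamma, \alpha, \underline{v} are illustrative parameters.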

Econometrica
Abstract

We quantify the distortionary effects of nexus tax laws on Amazon’s distribution network investments between 1999 and 2018. We highlight the role of two features of the expansion of Amazon’s network: densification of the network of distribution facilities and vertical integration into package sortation. Densification results in a reduction in the cost of shipping orders, but comes at the expense of higher facility operating costs in more expensive areas and lower scale economies of processing shipments. Nexus laws furthermore generate additional sales tax liabilities as the network grows. Combining data on household spending across online and offline retailers with detailed data on Amazon’s distribution network, we quantify these trade-offs through a static model of demand and a dynamic model of investment. Our results suggest that Amazon’s expansion led to significant shipping cost savings and facilitated the realization of aggregate economies of scale. We find that abolishing nexus tax laws in favor of a non-discriminatory tax policy would induce the company to decentralize its network, lowering its shipping costs. Non-discriminatory taxation would also entail lower revenue, however, as tax-inclusive prices would rise, resulting in a fall in profit overall. This drop and the decline in consumer welfare from higher taxes together fall short of the increases in tax revenue and rival profit, suggesting that the abolishment of nexus laws would lead to an increase in total welfare.

Journal of Econometrics
Abstract

Multicointegration is traditionally defined as a particular long run relationship among variables in a parametric vector autoregressive model that introduces additional cointegrating links between these variables and partial sums of the equilibrium errors. This paper departs from the parametric model, using a semiparametric formulation that reveals the explicit role that singularity of the long run conditional covariance matrix plays in determining multicointegration. The semiparametric framework has the advantage that short run dynamics do not need to be modeled and estimation by standard techniques such as fully modified least squares (FM-OLS) on the original I(1) system is straightforward. The paper derives FM-OLS limit theory in the multicointegrated setting, showing how faster rates of convergence are achieved in the direction of singularity and that the limit distribution depends on the distribution of the conditional one-sided long run covariance estimator used in FM-OLS estimation. Wald tests of restrictions on the regression coefficients have nonstandard limit theory which depends on nuisance parameters in general. The usual tests are shown to be conservative when the restrictions are isolated to the directions of singularity and, under certain conditions, are invariant to singularity otherwise. Simulations show that approximations derived in the paper work well in finite samples. The findings are illustrated empirically in an analysis of fiscal sustainability of the US government over the post-war period.
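
For context, the FM-OLS estimator referenced above takes its standard Phillips-Hansen form. In the cointegrating regression y_t = \beta' x_t + u_t with \Delta x_t = \varepsilon_t, OLS is corrected for endogeneity and serial correlation via

    y_t^{+} = y_t - \hat{\Omega}_{u\varepsilon} \hat{\Omega}_{\varepsilon\varepsilon}^{-1} \Delta x_t, \qquad
    \hat{\beta}_{FM} = \Big( \sum_{t=1}^{n} x_t x_t' \Big)^{-1} \Big( \sum_{t=1}^{n} x_t y_t^{+} - n \hat{\Delta}_{\varepsilon u}^{+} \Big),

where \hat{\Delta}_{\varepsilon u}^{+} = \hat{\Delta}_{\varepsilon u} - \hat{\Delta}_{\varepsilon\varepsilon} \hat{\Omega}_{\varepsilon\varepsilon}^{-1} \hat{\Omega}_{\varepsilon u} is the conditional one-sided long run covariance estimator whose sampling behavior drives the limit theory described in the abstract.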

Discussion Paper
Abstract

We characterize the extreme points of first-order stochastic dominance (FOSD) intervals and show how these intervals are at the heart of many topics in economics. Using knowledge of these extreme points, we characterize the distributions of posterior quantiles under a given prior, leading to an analogue of a classical result regarding the distribution of posterior means. We apply this analogue to various economic subjects, including the psychology of judgement, political economy, and Bayesian persuasion. In addition, FOSD intervals provide a common structure to security design. We use the extreme points to unify and generalize seminal results in that literature when either adverse selection or moral hazard pertains.
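
In the standard notation of this literature (ours, for illustration), a distribution F first-order stochastically dominates G when F(x) \le G(x) for all x, and an FOSD interval collects every distribution lying between two such bounds:

    F \succeq_{FOSD} G \iff F(x) \le G(x) \ \text{for all } x, \qquad
    [G, H] = \{\, F : H(x) \le F(x) \le G(x) \ \text{for all } x \,\}, \quad G \preceq H.

The paper's characterization concerns the extreme points of such sets [G, H].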

Discussion Paper
Abstract

We model an agent who stubbornly underestimates how much his behavior is driven by undesirable motives, and, attributing his behavior to other considerations, updates his views about those considerations. We study general properties of the model, and then apply the framework to identify novel implications of partially naive present bias. In many stable situations, the agent appears realistic in that he eventually predicts his behavior well. His unrealistic self-view does, however, manifest itself in several other ways. First, in basic settings he always comes to act in a more present-biased manner than a sophisticated agent. Second, he systematically mispredicts how he will react when circumstances change, such as when incentives for forward-looking behavior increase or he is placed in a new, ex-ante identical environment. Third, even for physically non-addictive products, he follows empirically realistic addiction-like consumption dynamics that he does not anticipate. Fourth, he holds beliefs that — when compared to those of other agents — display puzzling correlations between logically unrelated issues. Our model implies that existing empirical tests of sophistication in intertemporal choice can reach incorrect conclusions. Indeed, we argue that some previous findings are more consistent with our model than with a model of correctly specified learning.
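
For background, the canonical quasi-hyperbolic beta-delta specification that partially naive present-bias models build on (O'Donoghue and Rabin, 2001) is

    U_t = u(c_t) + \beta \sum_{k=1}^{\infty} \delta^{k} u(c_{t+k}), \qquad 0 < \beta < 1,

where the agent believes his future selves act with \hat{\beta} satisfying \beta < \hat{\beta} < 1 (partial naivety; \hat{\beta} = \beta is sophistication and \hat{\beta} = 1 full naivety). The paper's misattribution mechanism is layered on top of a present-bias component of this kind.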

Discussion Paper
Abstract

We compare influencer marketing to targeted advertising from information aggregation and product awareness perspectives. Influencer marketing leverages network effects by allowing consumers to socially learn from each other about their experienced content utility, but consumers may not know whether to attribute promotional post popularity to high content or high product quality. If the quality of a product is uncertain (e.g., it belongs to an unknown brand), then a mega influencer with consistent content quality fosters more information aggregation than a targeted ad and thereby yields higher profits. When we compare influencer marketing to untargeted ad campaigns or if the product has low quality uncertainty (e.g., belongs to an established brand), then many micro influencers with inconsistent content quality create more consumer awareness and yield higher profits. For products with low quality uncertainty, the firm wants to avoid information aggregation as it disperses posterior beliefs of consumers and leads to fewer purchases at the optimal price. Our model can also explain why influencer campaigns either "go viral" or "go bust," and how for niche products, micro influencers with consistent content quality can be a valuable marketing tool.

American Economic Review
Abstract

We analyze the consequences of noisy information aggregation for investment. Market imperfections create endogenous rents that cause overinvestment in upside risks and underinvestment in downside risks. In partial equilibrium, these inefficiencies are particularly severe if upside risks are coupled with easy scalability of investment. In general equilibrium, the shareholders' collective attempts to boost the value of individual firms lead to a novel externality operating through price that amplifies investment distortions with downside risks but offsets distortions with upside risks.

Discussion Paper
Abstract

Firms facing complex objectives often decompose the problems they face, delegating different parts of the decision to distinct subordinates. Using comprehensive data and internal models from a large U.S. airline, we establish that airline pricing is inconsistent with canonical dynamic pricing models. However, we show that observed prices can be rationalized as an equilibrium of a game played by departments that each have decision rights over different inputs supplied to the observed pricing heuristic. Incorrectly assuming that the firm solves a standard profit-maximization problem as a single entity understates the overall welfare actually achieved but affects business and leisure consumers differently. Likewise, we show that assuming prices are set through standard profit maximization leads to incorrect inferences about consumer demand elasticities and thus welfare.

Discussion Paper
Abstract

We examine identification of differentiated products demand when one has “micro data” linking the characteristics and choices of individual consumers. Our model nests standard specifications featuring rich observed and unobserved consumer heterogeneity as well as product/market-level unobservables that introduce the problem of econometric endogeneity. Previous work establishes identification of such models using market-level data and instruments for all prices and quantities. Micro data provides a panel structure that facilitates richer demand specifications and reduces requirements on both the number and types of instrumental variables. We address identification of demand in the standard case in which non-price product characteristics are assumed exogenous, but also cover identification of demand elasticities and other key features when these product characteristics are endogenous and not instrumented. We discuss implications of these results for applied work.

Discussion Paper
Abstract

Considerable evidence in past research shows size distortion in standard tests for zero autocorrelation or cross-correlation when time series are not independent, identically distributed random variables, pointing to the need for more robust procedures. Recent tests for serial correlation and cross-correlation in Dalla, Giraitis, and Phillips (2022) provide a more robust approach, allowing for heteroskedasticity and dependence in uncorrelated data under restrictions that require a smooth, slowly evolving deterministic heteroskedasticity process. The present work removes those restrictions and validates the robust testing methodology for a wider class of heteroskedastic time series models and innovations. The updated analysis given here enables more extensive use of the methodology in practical applications. Monte Carlo experiments confirm excellent finite sample performance of the robust test procedures even for extremely complex white noise processes. The empirical examples show that use of robust testing methods can materially reduce spurious evidence of correlations found by standard testing procedures.
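
As a concrete illustration of the self-normalized statistics this literature uses, the lag-k test in Dalla, Giraitis, and Phillips (2022) is built from sample products e_t e_{t-k} studentized by their own magnitude. The following minimal Python sketch (our simplification, omitting the authors' centering corrections and cumulative variants) conveys the idea:

    import numpy as np

    def robust_autocorr_stat(x, k):
        """Self-normalized test of zero autocorrelation at lag k.

        Statistics of this form are asymptotically N(0, 1) under the null
        that the (possibly heteroskedastic) series is uncorrelated at lag k.
        Illustrative sketch only, not the authors' full procedure.
        """
        e = np.asarray(x, dtype=float)
        e = e - e.mean()                  # demean the series
        prod = e[k:] * e[:-k]             # products e_t * e_{t-k}
        return prod.sum() / np.sqrt((prod ** 2).sum())

    # Heteroskedastic white noise: the statistic should look standard normal.
    rng = np.random.default_rng(0)
    z = rng.standard_normal(1000)
    sigma = np.exp(0.5 * np.sin(np.linspace(0.0, 10.0, 1000)))  # smooth volatility
    print(robust_autocorr_stat(sigma * z, k=1))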