How should a seller offer quantity- or quality-differentiated products if they have no information about the distribution of demand? We consider a seller who cares about the "profit guarantee" of a pricing rule, that is, the minimum ratio of expected profits to expected social surplus for any distribution of demand. We show that the profit guarantee is maximized by setting the price markup over cost equal to the elasticity of the cost function. We provide profit guarantees (and associated mechanisms) that the seller can achieve across all possible demand distributions. With a constant-elasticity cost function, constant markup pricing provides the optimal profit guarantee across all possible demand distributions, and the lower bound is attained under a Pareto distribution. We characterize how profits and consumer surplus vary with the distribution of values and show that Pareto distributions are extremal. We also provide a profit guarantee for general cost functions. We establish equivalent results for optimal procurement policies that support maximal surplus guarantees for the buyer given all possible cost distributions of the sellers.
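The following is a minimal numerical sketch of the profit-guarantee criterion defined above, under simplifying assumptions that are not taken from the paper: a single posted price, a constant marginal cost, and a small illustrative family of candidate demand distributions. It searches for the price whose worst-case ratio of expected profit to expected (first-best) social surplus is highest.

```python
import numpy as np
from scipy import stats

# Illustrative setup (not the paper's model): constant marginal cost c,
# a single posted price p, and consumer values v drawn from a candidate
# distribution F.
#   profit(p, F) = (p - c) * P(v >= p)
#   surplus(F)   = E[(v - c)^+]          (first-best social surplus)
#   guarantee(p) = min over F of profit / surplus
c = 1.0
prices = np.linspace(1.01, 5.0, 100)

# A small illustrative family of value distributions with support above c.
candidates = [
    stats.pareto(b=2.0, scale=1.0),
    stats.lognorm(s=0.8, scale=2.0),
    stats.uniform(loc=1.0, scale=4.0),
]

# First-best surplus E[(v - c)^+] for each candidate distribution.
surpluses = [d.expect(lambda v: max(v - c, 0.0)) for d in candidates]

def worst_case_share(p):
    """Smallest profit-to-surplus ratio over the candidate family at price p."""
    return min((p - c) * d.sf(p) / s for d, s in zip(candidates, surpluses))

guarantees = [worst_case_share(p) for p in prices]
best = int(np.argmax(guarantees))
print(f"best price {prices[best]:.2f}, guaranteed profit share {guarantees[best]:.3f}")
```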
Structural transformation in most currently developing countries takes the form of a rapid rise in services but limited industrialization. In this paper, we propose a new methodology to structurally estimate productivity growth in service industries that circumvents the notorious difficulties in measuring quality improvements. In our theory, the expansion of the service sector is both a consequence—due to income effects—and a cause—due to productivity growth—of the development process. We estimate the model using Indian household data. We find that productivity growth in nontradable consumer services such as retail, restaurants, or residential real estate was an important driver of structural transformation and rising living standards between 1987 and 2011. However, the welfare gains were heavily skewed toward high-income urban dwellers.
We ask how the advertising mechanisms of digital platforms impact product prices. We present a model that integrates three fundamental features of digital advertising markets: (i) advertisers can reach customers on- and off-platform, (ii) additional data enhances the value of matching advertisers and consumers, and (iii) bidding follows auction-like mechanisms. We compare data-augmented auctions, which leverage the platform’s data advantage to improve match quality, with managed campaign mechanisms, where advertisers’ budgets are transformed into personalized matches and prices through auto-bidding algorithms. In data-augmented second-price auctions, advertisers increase off-platform product prices to boost their competitiveness on-platform. This leads to socially efficient allocations on-platform, but inefficient allocations off-platform due to high product prices. The platform-optimal mechanism is a sophisticated managed campaign that conditions on-platform prices for sponsored products on the off-platform prices set by all advertisers. Relative to auctions, the optimal managed campaign raises off-platform product prices and further reduces consumer surplus.
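Below is a minimal sketch of the scoring and payment logic in a quality-weighted second-price auction, one simple way to operationalize "data-augmented" bidding. It illustrates only the on-platform allocation step; the bids, match-quality signals, and scoring rule are illustrative assumptions and not the mechanism characterized in the paper, and the interaction with off-platform product prices is not modeled.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch: the platform multiplies each advertiser's bid by an
# estimated match quality for the arriving consumer and allocates the
# impression to the highest score (a second-price rule on scores).
n_advertisers = 4
bids = rng.uniform(0.5, 2.0, size=n_advertisers)            # per-impression bids
match_quality = rng.uniform(0.2, 1.0, size=n_advertisers)   # platform's data signal

scores = bids * match_quality
winner = int(np.argmax(scores))

# Second-price logic on scores: the winner pays the lowest bid that would
# still have won, i.e. the runner-up score divided by the winner's own
# match quality.
runner_up = np.partition(scores, -2)[-2]
payment = runner_up / match_quality[winner]

print(f"winner {winner}, bid {bids[winner]:.2f}, pays {payment:.2f}")
```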
This paper studies the optimal determination of deposit insurance when bank runs are possible. We show that the welfare impact of changes in the level of deposit insurance coverage can generally be expressed in terms of a small number of sufficient statistics, which include the level of losses in specific scenarios and the probability of bank failure. We characterize the wedges that determine the optimal ex ante regulation, which map into asset-side and liability-side regulation. We demonstrate how to employ our framework in an application to the most recent change in coverage in the United States, which took place in 2008.
We present a new class of methods for identification and inference in dynamic models with serially correlated unobservables, which typically imply that state variables are econometrically endogenous. In the context of Industrial Organization, these state variables often reflect econometrically endogenous market structure. We propose the use of Generalized Instrumental Variables methods to identify those dynamic policy functions that are consistent with instrumental variable (IV) restrictions. Extending popular “two-step” methods, these policy functions then identify a set of structural parameters that are consistent with the dynamic model, the IV restrictions, and the data. We provide computed illustrations for both single-agent and oligopoly examples. We also present a simple empirical analysis that, among other things, supports the counterfactual study of an environmental policy entailing an increase in sunk costs.
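The sketch below conveys the set-identification flavor of IV restrictions on policy functions in a deliberately stripped-down single-agent setting: for each candidate policy slope, the implied residual is kept if it is (approximately) uncorrelated with an instrument. The data-generating process, moment test, and grid are illustrative assumptions; the paper's Generalized Instrumental Variables machinery and the two-step structural stage are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sketch (not the paper's estimator): a linear policy
# a_t = theta * s_t + e_t, where the state s_t is correlated with the
# serially correlated unobservable e_t, but the instrument z_t is not.
# We keep the candidate thetas whose implied residuals are approximately
# uncorrelated with z_t -- an identified set in the spirit of IV
# restrictions on policy functions.
n = 5000
z = rng.normal(size=n)                       # instrument
e = rng.normal(size=n)                       # unobservable driving endogeneity
s = 0.8 * e + 0.5 * z + rng.normal(size=n)   # endogenous state
a = 1.5 * s + e                              # actions generated with true theta = 1.5

thetas = np.linspace(0.0, 3.0, 301)
identified = []
for theta in thetas:
    resid = a - theta * s
    moment = np.mean(resid * z)
    se = np.std(resid * z) / np.sqrt(n)
    if abs(moment) < 1.96 * se:              # cannot reject E[resid * z] = 0
        identified.append(theta)

print(f"identified set approx [{min(identified):.2f}, {max(identified):.2f}]")
```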
This article studies the optimal design of corporate taxes when firms have private information about future investment opportunities and face financial constraints. A government whose goal is to efficiently raise a given amount of revenue from its corporate sector should attempt to tax unconstrained firms, which value resources inside the firm less than financially constrained firms do. We show that a corporate payout tax (a tax on dividends and share repurchases) can both separate constrained from unconstrained firms and raise revenue, and is therefore optimal. Our quantitative analysis implies that a revenue-neutral switch from profit taxation to payout taxation would increase the overall value of existing firms and new entrants by 7%. This switch could be implemented in the current US tax system by making retained earnings fully deductible.
This paper considers a linear panel model with interactive fixed effects in which unobserved individual and time heterogeneities are captured by latent group structures and an unknown structural break, respectively. To enhance realism, the model may have different numbers of groups and/or different group memberships before and after the break. Using a preliminary nuclear-norm-regularized estimation followed by row- and column-wise linear regressions, we estimate the break point based on the idea of binary segmentation and, simultaneously, the latent group structures together with the numbers of groups before and after the break via a sequential-testing K-means algorithm. It is shown that the break point, the numbers of groups, and the group memberships can each be estimated correctly with probability approaching one. Asymptotic distributions of the estimators of the slope coefficients are established. Monte Carlo simulations demonstrate excellent finite-sample performance of the proposed estimation algorithm. An empirical application to real house price data across 377 Metropolitan Statistical Areas in the US from 1975 to 2014 suggests the presence of both structural breaks and changes in group membership.
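A simplified sketch of two of the ingredients named above follows: scanning candidate dates for a single break point (in the spirit of binary segmentation with a single break) and K-means clustering of unit-wise slope estimates to recover latent groups. The interactive fixed effects, the nuclear-norm step, and the sequential tests that determine the number of groups are omitted, and all simulation settings are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)

# Simulate a grouped-slope panel with one break: two groups before the
# break, three groups after, no interactive fixed effects.
N, T, true_break = 60, 100, 40
groups_pre = rng.integers(0, 2, size=N)
groups_post = rng.integers(0, 3, size=N)
beta_pre = np.array([1.0, -1.0])
beta_post = np.array([0.5, 1.5, -0.5])

x = rng.normal(size=(N, T))
beta_it = np.where(np.arange(T) < true_break,
                   beta_pre[groups_pre][:, None],
                   beta_post[groups_post][:, None])
y = beta_it * x + 0.5 * rng.normal(size=(N, T))

def ssr(b):
    """Total SSR when each unit has its own slope before and after candidate break b."""
    total = 0.0
    for cols in (slice(0, b), slice(b, T)):
        xs, ys = x[:, cols], y[:, cols]
        betas = np.sum(xs * ys, axis=1) / np.sum(xs * xs, axis=1)  # unit-wise OLS slopes
        total += np.sum((ys - betas[:, None] * xs) ** 2)
    return total

break_hat = min(range(10, T - 10), key=ssr)

# Unit-by-unit post-break slopes, clustered by K-means into three groups.
xs, ys = x[:, break_hat:], y[:, break_hat:]
slopes_post = np.sum(xs * ys, axis=1) / np.sum(xs * xs, axis=1)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(slopes_post.reshape(-1, 1))

print(f"estimated break at t = {break_hat} (true break at {true_break})")
print("post-break group sizes:", np.bincount(labels))
```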
A general asymptotic theory is established for sample cross moments of nonstationary time series, allowing for long-range dependence and local unit roots. The theory provides a substantial extension of earlier results on nonparametric regression, covering near-cointegrated nonparametric regression as well as spurious nonparametric regression. Many new models are covered by the limit theory, among them functional-coefficient regressions in which both the regressors and the functional covariate are nonstationary. Simulations show finite-sample performance that matches the asymptotic theory well and has broad relevance to applications, while revealing how dual nonstationarity in regressors and covariates heightens sensitivity to bandwidth choice and to the impact of dimensionality in nonparametric regression. An empirical example is provided involving climate data regression to assess Earth’s climate sensitivity to CO2, a setting in which nonstationarity is a prominent feature of both the regressors and the covariates in the model. This application is the first rigorous empirical analysis to assess nonlinear impacts of CO2 on Earth’s climate.
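The sketch below illustrates the kind of model the limit theory covers and the bandwidth sensitivity mentioned above: a functional-coefficient regression y_t = beta(z_t) x_t + u_t in which both the regressor x_t and the covariate z_t are generated as unit-root processes, estimated by kernel-weighted least squares at each evaluation point. The data-generating process, kernel, and bandwidth grid are illustrative assumptions, not the paper's design.

```python
import numpy as np

rng = np.random.default_rng(3)

# Functional-coefficient regression with dual nonstationarity:
# y_t = beta(z_t) * x_t + u_t, with x_t and z_t both random walks.
T = 2000
z = np.cumsum(rng.normal(size=T))           # nonstationary functional covariate
x = np.cumsum(rng.normal(size=T))           # nonstationary regressor
beta = np.sin(z / np.std(z))                # smooth functional coefficient
y = beta * x + rng.normal(size=T)

def local_beta(z0, h):
    """Kernel-weighted least-squares estimate of beta(z0) with bandwidth h."""
    w = np.exp(-0.5 * ((z - z0) / h) ** 2)  # Gaussian kernel weights
    return np.sum(w * x * y) / np.sum(w * x * x)

grid = np.linspace(np.quantile(z, 0.1), np.quantile(z, 0.9), 50)
for h in (0.5, 2.0, 8.0):
    est = np.array([local_beta(z0, h) for z0 in grid])
    truth = np.sin(grid / np.std(z))
    rmse = np.sqrt(np.mean((est - truth) ** 2))
    print(f"bandwidth {h:>4}: RMSE of estimated beta(.) = {rmse:.3f}")
```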
Firms facing complex objectives often decompose the problems they face, delegating different parts of the decision to distinct sub-units. Using comprehensive data and internal models from a large U.S. airline, we establish that airline pricing is not well approximated by a model of the firm as a unitary decision-maker. We show, however, that observed prices can be rationalized by accounting for organizational structure and for the decisions of departments tasked with supplying inputs to the observed pricing heuristic. Simulating the prices the firm would charge if it were a rational unitary decision-maker results in lower welfare than we estimate under observed practices. Finally, we discuss why counterfactual estimates of welfare and market power may be biased if prices are set through decomposition but are assumed, in the analysis, to be set by unitary decision-makers.
We propose a demand estimation method that allows for a large number of zero-sale observations, rich unobserved heterogeneity, and endogenous prices. We do so by modeling small market sizes through Poisson arrivals; each arriving consumer solves a standard discrete choice problem. We present a Bayesian IV estimation approach that addresses sampling error in product shares and scales well to rich data environments. The data requirements are traditional market-level data together with a measure of market sizes or consumer arrivals. After presenting simulation studies, we demonstrate the method in an empirical application to air travel demand.
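A minimal sketch of the arrival-and-choice structure described above follows: consumers arrive at a Poisson rate, each makes a multinomial logit choice among products and an outside good, so thinning the arrivals by the choice probabilities yields product-level sales that are themselves Poisson and naturally include zeros in small markets. The utilities, prices, and arrival rate are illustrative assumptions; the paper's Bayesian IV estimation is not reproduced.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Poisson arrivals + logit choice: sales_j ~ Poisson(lam * probs_j), because
# thinning a Poisson arrival process by independent choice probabilities
# yields independent Poisson counts per product.
J, lam = 3, 5.0                              # products, mean arrivals per market
prices = np.array([1.0, 1.5, 2.0])
delta = 0.5 - 1.0 * prices                   # mean utilities (outside good = 0)
probs = np.exp(delta) / (1.0 + np.exp(delta).sum())

# Simulate many small markets; zeros arise naturally at low arrival rates.
markets = 1000
sales = rng.poisson(lam * probs, size=(markets, J))
print("share of zero-sale observations:", np.mean(sales == 0).round(3))

# Log-likelihood of the observed sales at candidate parameter values (the
# object a Bayesian or ML routine would evaluate); here checked at the truth.
loglik = stats.poisson.logpmf(sales, lam * probs).sum()
print("log-likelihood at true parameters:", round(loglik, 1))
```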