We study the socially optimal level of illiquidity in an economy populated by households with taste shocks and present bias with naive beliefs. The government chooses mandatory contributions to accounts, each with a different pre-retirement withdrawal penalty. Collected penalties are rebated lump sum. When households have homogeneous present bias, β, the social optimum is well approximated by a single account with an early-withdrawal penalty of 1−β. When households have heterogeneous present bias, the social optimum is well approximated by a two-account system: (i) an account that is completely liquid and (ii) an account that is completely illiquid until retirement.
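A minimal sketch of why a penalty of 1−β is the natural benchmark, written in the standard quasi-hyperbolic (β-δ) notation; the two-period withdrawal comparison and the symbols below are assumptions of this illustration rather than the paper's exact model. A household with present bias evaluates consumption streams at date t as

\[
U_t \;=\; u(c_t) \;+\; \beta \sum_{s=t+1}^{T} \delta^{\,s-t}\, u(c_s), \qquad 0 < \beta \le 1 .
\]

With a proportional early-withdrawal penalty \pi and gross return R, the current self finds a marginal withdrawal attractive only when

\[
(1-\pi)\, u'(c_t) \;>\; \beta\, \delta R\, u'(c_{t+1}),
\]

and substituting \pi = 1-\beta turns this into u'(c_t) > \delta R\, u'(c_{t+1}), the long-run self's own criterion. The penalty thus offsets the present-bias wedge while leaving room for withdrawals driven by genuine taste shocks.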
We study the intergenerational effect of education policy on crime. We use Swedish administrative data that link outcomes across generations with crime records, and we show that the comprehensive school reform, gradually implemented between 1949 and 1962, reduced conviction rates both for the generation directly affected by the reform and for their sons. The reduction in conviction rates occurred in many types of crime. The key mediators of this reduction in the child generation are increases in education and household income and a decrease in crime among their fathers.
This paper provides a framework in which a multiproduct ecosystem competes with many single-product firms in both price and innovation. The ecosystem can use data collected on one product to improve the quality of its other products. We study the impact of data regulation that either restricts the ecosystem's cross-product data usage or requires it to share data with small firms. Each policy induces small firms to innovate more and set higher prices; it also dampens data spillovers within the ecosystem, reduces the ecosystem's incentive to collect data and innovate, and potentially increases its prices. As a result, data regulation has an ambiguous impact on consumers and is more likely to benefit consumers when small firms are relatively more efficient in innovation. A data cooperative among small firms, which helps them share data with each other, does not necessarily benefit small firms and can even harm consumers.
We develop an economic framework to analyze the optimal pricing and product design of Large Language Models (LLMs). Our framework captures several key features of LLMs: variable operational costs of processing input and output tokens; the ability to customize models through fine-tuning; and high-dimensional user heterogeneity in terms of task requirements and error sensitivity. In our model, a monopolistic seller offers multiple versions of LLMs through a menu of products. The optimal pricing structure depends on whether token allocation across tasks is contractible and whether users face scale constraints. Users with similar aggregate value-scale characteristics choose similar levels of fine-tuning and token consumption. The optimal mechanism can be implemented through menus of two-part tariffs, with higher markups for more intensive users. Our results rationalize observed industry practices such as tiered pricing based on model customization and usage levels.
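For concreteness, a menu of two-part tariffs takes the generic form below; the symbols (F_k, p_k) are illustrative notation introduced here, not the paper's own, and the expression is the textbook definition rather than the paper's derived mechanism.

\[
T_k(q) \;=\; F_k \;+\; p_k\, q, \qquad k = 1, \dots, K,
\]

where a user picks one option k, pays the fixed fee F_k, and is then charged the per-unit price p_k on token consumption q; the markup in question is the gap between p_k and the seller's marginal cost of processing tokens.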
We consider a seller who offers services to a buyer with multi-unit demand. Prior to the realization of demand, the buyer receives a noisy signal of their future demand, and the seller can design contracts based on the reported value of this signal. Thus, the buyer can contract with the service provider for an unknown level of future consumption, as in the market for cloud computing resources or software services. We characterize the optimal dynamic contract, extending the classic sequential screening framework to a nonlinear and multi-unit setting. The optimal mechanism gives discounts to buyers who report higher signals, but in exchange these buyers must make larger fixed payments. We then describe how the optimal mechanism can be implemented by two common forms of contracts observed in practice, the two-part tariff and the committed spend contract. Finally, we use extensions of our base model to shed light on policy-focused questions, such as how the optimal contract changes when the buyer faces commitment costs or when there are liquid spot markets.
We study the robust sequential screening problem of a monopolist seller of multiple cloud computing services facing a buyer who has private information about his demand distribution for these services. At the time of contracting, the buyer knows the distribution of his demand for the various services, while the seller knows only the mean of the buyer’s total demand. We show that a simple “committed spend mechanism” is robustly optimal: it provides the seller with the highest profit guarantee against all demand distributions with the known mean of total demand. This mechanism requires the buyer to commit to a minimum total usage and a corresponding base payment; the buyer can choose the individual quantities of each service and is free to consume additional units (over the committed total usage) at a fixed marginal price. This result provides theoretical support for prevalent cloud computing pricing practices while highlighting the robustness of simple pricing schemes in environments with complex uncertainty.
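In symbols (the notation F, Q, p, and q_i is introduced here purely for illustration and may differ from the paper's), the committed spend mechanism described above charges

\[
T(q_1,\dots,q_n) \;=\; F \;+\; p \,\max\!\Big\{0,\; \sum_{i=1}^{n} q_i - Q \Big\},
\]

where Q is the committed minimum total usage across the n services, F is the corresponding base payment, each q_i is the quantity of service i that the buyer chooses freely, and p is the fixed marginal price on total usage beyond the commitment.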
In this paper we develop a novel approach to measuring individual welfare within households, recognizing that individuals may have both different preferences (particularly regarding public consumption) and differential access to resources. We construct a money metric measure of welfare that accounts for public goods (by using personalized prices) and the allocation of time. We then use our conceptual framework to analyse intrahousehold inequality in Japan, allowing for the presence of two public goods: expenditures on children and other public goods including housing. We show empirically that women have much stronger preferences for both public goods, and that this has critical implications for the distribution of welfare in the household.
Welfare depends on the quantity, quality, and range of goods consumed. We use trade data, which report the quantities and prices of the individual goods that countries exchange, to learn how the gains from trade and growth break down into these different margins. Our general equilibrium model, in which both quality and quantity contribute to consumption and to production, captures (i) how prices increase with importer and exporter per capita income, (ii) how the range of goods traded rises with importer and exporter size, and (iii) how products traveling longer distances have higher prices. Our framework can deliver a standard gravity formulation for total trade flows and for the gains from trade. We find that growth in the extensive margin accounts for about half of the overall gains. Because of selection, quality plays a larger role in the welfare gains from international trade than in those from economic growth.
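As a reference point for what a standard gravity formulation looks like, the canonical structural form is

\[
X_{ij} \;=\; \frac{Y_i\, Y_j}{Y_W} \left( \frac{\tau_{ij}}{\Pi_i\, P_j} \right)^{1-\sigma},
\]

where X_{ij} is trade from exporter i to importer j, Y_i and Y_j are exporter output and importer expenditure, Y_W is world output, \tau_{ij} is the bilateral trade cost (increasing in distance), and \Pi_i and P_j are outward and inward multilateral resistance terms, with \sigma the elasticity of substitution. This is the textbook expression, reproduced only as an illustration; the exact gravity equation delivered by the framework may differ.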
We fully solve a sorting problem with heterogeneous firms and multiple heterogeneous workers whose skills are imperfect substitutes. We show that optimal sorting, which we call mixed and countermonotonic, comprises two regions. In the first region, mediocre firms sort with mediocre workers and coworkers such that the output losses are equal across all these teams (mixing). In the second region, a high-skill worker sorts with low-skill coworkers and a high-productivity firm (countermonotonicity). We characterize the equilibrium wages and firm values. Quantitatively, our model can generate the dispersion of earnings within and across US firms.
This paper examines the history of U.S. infrastructure since 1929 and in the process reports an interesting fact about the U.S. economy. Infrastructure stock as a percent of GDP began a steady decline around 1970, and the government budget deficit became positive and large at roughly the same time. The infrastructure pattern in other countries does not mirror that in the United States, so the United States appears to be a special case. The overall results suggest that the United States became less future-oriented beginning around 1970, which corresponds to an increase in the social discount rate. This change has persisted. This is the interesting fact. The paper contains speculation on possible causes.