Peter C. B. Phillips Publications

Discussion Paper
Abstract

This paper studies estimation and inference in panel threshold regression with unobserved individual-specific threshold effects, a feature that is important in practice and distinguishes the model from traditional linear panel data models. It is shown that within-regime differencing in the static model, or within-regime first-differencing in the dynamic model, cannot generate consistent estimators of the threshold, so correlated random effects models are suggested to handle the endogeneity in such general panel threshold models. We provide a unified estimation and inference framework that is valid for both static and dynamic models, regardless of whether the unobserved individual-specific threshold effects exist. In particular, we propose alternative inference methods for the model parameters that have better theoretical properties than existing methods. Simulation studies and an empirical application illustrate the usefulness of the new estimation and inference methodology in practice.
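
For concreteness, a minimal sketch of the model class the abstract describes, in hypothetical notation (the paper's own notation and assumptions may differ): a two-regime panel threshold model in which both the slopes and the individual effects can shift with the regime.

```latex
% Hypothetical notation, a minimal two-regime panel threshold model.
% y_{it}: outcome, x_{it}: regressors, q_{it}: threshold variable,
% gamma: threshold, mu_i: standard individual effect, delta_i: the
% unobserved individual-specific threshold effect active in one regime.
y_{it} = \beta_1' x_{it}\,\mathbf{1}\{q_{it} \le \gamma\}
       + \beta_2' x_{it}\,\mathbf{1}\{q_{it} > \gamma\}
       + \mu_i + \delta_i\,\mathbf{1}\{q_{it} \le \gamma\} + \varepsilon_{it}.
```

It is the presence of \(\delta_i\) alongside \(\mu_i\) that defeats the usual within-regime (first-)differencing transformations and motivates the correlated random effects approach.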

Discussion Paper
Abstract

Price bubbles in multiple assets are sometimes nearly coincident in occurrence. Such near-coincidence is strongly suggestive of co-movement in the associated asset prices and is likely driven by certain factors that are latent in the financial or economic system with common effects across several markets. Can we detect the presence of such common factors at the early stages of their emergence? To answer this question, we build a factor model that includes I(1), mildly explosive, and stationary factors to capture normal, exuberant, and collapsing phases in such phenomena. The I(1) factor models the primary driving force of market fundamentals. The explosive and stationary factors model latent forces that underlie the formation and destruction of asset price bubbles, which typically exist only for subperiods of the sample. The paper provides an algorithm for testing the presence of and date-stamping the origination and termination of price bubbles determined by latent factors in a large-dimensional system embodying many markets. Asymptotics of the bubble test statistic are given under the null of no common bubbles and the alternative of a common bubble across these markets. We prove consistency of a factor bubble detection process for the origination and termination dates of the common bubble. Simulations show good finite sample performance of the testing algorithm in terms of its successful detection rates. Our methods are applied to real estate markets covering 89 major cities in China over the period January 2003 to March 2013. Results suggest the presence of three common bubble episodes in what are known as China's Tier 1 and Tier 2 cities over the sample period. There appears to be little evidence of a common bubble in Tier 3 cities.
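
A schematic version of the factor structure described above, in hypothetical notation drawn from the mildly explosive autoregression literature (the paper's exact specification may differ):

```latex
% Hypothetical notation. X_{it}: price in market i; F_t: latent factors.
X_{it} = \lambda_i' F_t + e_{it}, \qquad
F_{1t} = F_{1,t-1} + u_{1t} \quad \text{(I(1) fundamental factor)},
F_{2t} = \Big(1 + \frac{c}{n^{\alpha}}\Big) F_{2,t-1} + u_{2t},
\quad c > 0,\ \alpha \in (0,1) \quad \text{(mildly explosive bubble factor)}.
```

A collapsing phase can analogously be represented by a mildly integrated root with \(c < 0\), giving the factor model its normal, exuberant, and collapsing regimes.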

Discussion Paper
Abstract

Multicointegration is traditionally defined as a particular long run relationship among variables in a parametric vector autoregressive model that introduces links between these variables and partial sums of the equilibrium errors. This paper departs from the parametric model, using a semiparametric formulation that reveals the explicit role that singularity of the long run conditional covariance matrix plays in determining multicointegration. The semiparametric framework has the advantage that short run dynamics do not need to be modeled and estimation by standard techniques such as fully modified least squares (FM-OLS) on the original I(1) system is straightforward. The paper derives FM-OLS limit theory in the multicointegrated setting, showing how faster rates of convergence are achieved in the direction of singularity and that the limit distribution depends on the distribution of the conditional one-sided long run covariance estimator used in FM-OLS estimation. Wald tests of restrictions on the regression coefficients have nonstandard limit theory which depends on nuisance parameters in general. The usual tests are shown to be conservative when the restrictions are isolated to the directions of singularity and, under certain conditions, are invariant to singularity otherwise. Simulations show that approximations derived in the paper work well in finite samples. We illustrate our findings by analyzing fiscal sustainability of the US government over the post-war period.
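
As a reminder of the estimator involved, a sketch of the standard FM-OLS construction in Phillips–Hansen-style notation (the paper's multicointegrated setup adds structure beyond this): for the triangular system \(y_t = \beta' x_t + u_{0t}\), \(\Delta x_t = u_{xt}\),

```latex
% Standard FM-OLS sketch. \hat\Omega and \hat\Delta denote consistent
% estimates of the long run and one-sided long run covariance matrices
% of u_t = (u_{0t}, u_{xt}')'.
y_t^{+} = y_t - \hat\Omega_{0x}\hat\Omega_{xx}^{-1}\Delta x_t, \qquad
\hat\Delta_{x0}^{+} = \hat\Delta_{x0} - \hat\Delta_{xx}\hat\Omega_{xx}^{-1}\hat\Omega_{x0},
\qquad
\hat\beta_{FM} = \Big(\sum_{t=1}^{T} x_t x_t'\Big)^{-1}
\Big(\sum_{t=1}^{T} x_t y_t^{+} - T\,\hat\Delta_{x0}^{+}\Big).
```

The one-sided correction term \(\hat\Delta_{x0}^{+}\) is the "conditional one-sided long run covariance estimator" whose distribution governs the limit theory in the multicointegrated (singular) setting.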

Discussion Paper
Abstract

Spatial units typically differ in many of their characteristics, introducing potential unobserved heterogeneity that invalidates commonly used homoskedasticity conditions. In the presence of unobserved heteroskedasticity, standard methods based on the (quasi-)likelihood function generally produce inconsistent estimates of both the spatial parameter and the coefficients of the exogenous regressors. A robust generalized method of moments estimator as well as a modified likelihood method have been proposed in the literature to address this issue. The present paper constructs an alternative indirect inference approach which relies on a simple ordinary least squares procedure as its starting point. Heteroskedasticity is accommodated by utilizing a new version of continuous updating that is applied within the indirect inference procedure to take account of the parametrization of the variance-covariance matrix of the disturbances. Finite sample performance of the new estimator is assessed in a Monte Carlo study and found to offer advantages over existing methods. The approach is implemented in an empirical application to house price data in the Boston area, where it is found that spatial effects in house price determination are much more significant under robustification to heterogeneity in the equation errors.
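
A stripped-down illustration of the indirect inference idea with an OLS auxiliary regression for a pure first-order spatial autoregression (all names and simplifications here are ours; the paper's continuously updated procedure additionally reweights to account for the parametrized disturbance covariance):

```python
# Minimal indirect inference sketch for a pure SAR(1) model, y = rho*W*y + e.
# The auxiliary statistic is the (inconsistent) OLS coefficient from
# regressing y on Wy; indirect inference picks the rho whose simulated
# auxiliary estimates best match the one computed from the data.
import numpy as np

def simulate_sar(rho, W, rng):
    n = W.shape[0]
    e = rng.standard_normal(n)           # homoskedastic here for simplicity
    return np.linalg.solve(np.eye(n) - rho * W, e)

def aux_ols(y, W):
    Wy = W @ y
    return (Wy @ y) / (Wy @ Wy)          # OLS slope of y on Wy, no intercept

def indirect_inference(y, W, grid, n_sim=200, seed=0):
    rng = np.random.default_rng(seed)
    target = aux_ols(y, W)
    best, best_gap = None, np.inf
    for rho in grid:                     # grid search over the binding function
        sims = [aux_ols(simulate_sar(rho, W, rng), W) for _ in range(n_sim)]
        gap = abs(np.mean(sims) - target)
        if gap < best_gap:
            best, best_gap = rho, gap
    return best

# Example: a circulant "two nearest neighbours" spatial weight matrix.
n = 100
W = np.zeros((n, n))
for i in range(n):
    W[i, (i - 1) % n] = W[i, (i + 1) % n] = 0.5
rng = np.random.default_rng(42)
y = simulate_sar(0.4, W, rng)
print(indirect_inference(y, W, grid=np.linspace(-0.9, 0.9, 37)))
```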

Discussion Paper
Abstract

The Hodrick-Prescott (HP) filter is one of the most widely used econometric methods in applied macroeconomic research. The technique is nonparametric and seeks to decompose a time series into a trend and a cyclical component unaided by economic theory or prior trend specification. Like all nonparametric methods, the HP filter depends critically on a tuning parameter that controls the degree of smoothing. Yet in contrast to modern nonparametric methods and applied work with these procedures, empirical practice with the HP filter almost universally relies on standard settings for the tuning parameter that have been suggested largely by experimentation with macroeconomic data and heuristic reasoning about the form of economic cycles and trends. As recent research (Phillips and Jin, 2015) has shown, standard settings may not be adequate in removing trends, particularly stochastic trends, in economic data. This paper proposes an easy-to-implement practical procedure of iterating the HP smoother that is intended to make the filter a smarter smoothing device for trend estimation and trend elimination. We call this iterated HP technique the boosted HP filter in view of its connection to L2-boosting in machine learning. The paper develops limit theory to show that the boosted HP (bHP) filter asymptotically recovers trend mechanisms that involve unit root processes, deterministic polynomial drifts, and polynomial drifts with structural breaks, thereby covering the most common trends that appear in macroeconomic data and current modeling methodology. In doing so, the boosted filter provides a new mechanism for consistently estimating multiple structural breaks even without knowledge of the number of such breaks. A stopping criterion is used to automate the iterative HP algorithm, making it a data-determined method that is ready for modern data-rich environments in economic research. The methodology is illustrated using three real data examples that highlight the differences between simple HP filtering, the data-determined boosted filter, and an alternative autoregressive approach. These examples show that the bHP filter is helpful in analyzing a large collection of heterogeneous macroeconomic time series that manifest various degrees of persistence, trend behavior, and volatility.
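
A compact sketch of the iteration in Python, using the HP filter from statsmodels (the fixed iteration count below is our stand-in for the paper's data-driven stopping criterion):

```python
# Boosted HP filter sketch: writing S for the HP smoothing operator, each
# pass re-smooths the current cycle, so after m passes the cycle is
# (I - S)^m x and the trend is x - (I - S)^m x.
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

def boosted_hp(x, lamb=1600, iterations=5):
    """Iterate the HP filter; a fixed iteration count stands in for the
    paper's stopping rule."""
    x = np.asarray(x, dtype=float)
    cycle = x.copy()
    for _ in range(iterations):
        cycle, _trend_step = hpfilter(cycle, lamb=lamb)  # keep re-smoothing
    return x - cycle, cycle      # (trend, cycle)

# Example with a stochastic trend, which a single HP pass under-removes
rng = np.random.default_rng(0)
x = np.cumsum(rng.standard_normal(240))
trend, cycle = boosted_hp(x, lamb=1600, iterations=5)
```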

Discussion Paper
Abstract

The Hodrick-Prescott (HP) filter is one of the most widely used econometric methods in applied macroeconomic research. The technique is nonparametric and seeks to decompose a time series into a trend and a cyclical component unaided by economic theory or prior trend specification. Like all nonparametric methods, the HP filter depends critically on a tuning parameter that controls the degree of smoothing. Yet in contrast to modern nonparametric methods and applied work with these procedures, empirical practice with the HP filter almost universally relies on standard settings for the tuning parameter that have been suggested largely by experimentation with macroeconomic data and heuristic reasoning about the form of economic cycles and trends. As recent research has shown, standard settings may not be adequate in removing trends, particularly stochastic trends, in economic data. This paper proposes an easy-to-implement practical procedure of iterating the HP smoother that is intended to make the filter a smarter smoothing device for trend estimation and trend elimination. We call this iterated HP technique the boosted HP filter in view of its connection to L2-boosting in machine learning. The paper develops limit theory to show that the boosted HP filter asymptotically recovers trend mechanisms that involve unit root processes, deterministic polynomial drifts, and polynomial drifts with structural breaks – the most common trends that appear in macroeconomic data and current modeling methodology. In doing so, the boosted filter provides a new mechanism for consistently estimating multiple structural breaks. A stopping criterion is used to automate the iterative HP algorithm, making it a data-determined method that is ready for modern data-rich environments in economic research. The methodology is illustrated using three real data examples that highlight the differences between simple HP filtering, the data-determined boosted filter, and an alternative autoregressive approach. These examples show that the boosted HP filter is helpful in analyzing a large collection of heterogeneous macroeconomic time series that manifest various degrees of persistence, trend behavior, and volatility.

Discussion Paper
Abstract

While each financial crisis has its own characteristics, there is now widespread recognition that crises arising from sources such as financial speculation and excessive credit creation do inflict harm on the real economy. Detecting speculative market conditions and ballooning credit risk in real time is therefore of prime importance in the complex exercises of market surveillance, risk management, and policy action. This chapter provides an R implementation of the popular real-time monitoring strategy proposed by Phillips, Shi and Yu in the International Economic Review (2015), along with a new bootstrap procedure designed to mitigate the potential impact of heteroskedasticity and to effect family-wise size control in recursive testing algorithms. This methodology has been shown effective for bubble and crisis detection and is now widely used by academic researchers, central bank economists, and fiscal regulators. We illustrate the effectiveness of this procedure with applications to the S&P financial market and the European sovereign debt sector using the psymonitor R package developed in conjunction with this chapter.
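
psymonitor itself is an R package; as a language-neutral illustration, here is a minimal sketch (ours, not the package's code) of the backward sup ADF (BSADF) statistic at a single observation, the core quantity the monitoring strategy tracks in real time:

```python
# Backward sup ADF (BSADF) sketch: at the latest observation, take the sup
# of right-tailed ADF statistics over all backward-expanding windows whose
# length is at least the minimum window size swindow0.
import numpy as np
from statsmodels.tsa.stattools import adfuller

def bsadf(y, swindow0, adflag=1):
    y = np.asarray(y, dtype=float)
    t = len(y)
    stats = []
    for start in range(0, t - swindow0 + 1):
        window = y[start:]               # window ends at the last observation
        stat = adfuller(window, maxlag=adflag, regression="c", autolag=None)[0]
        stats.append(stat)
    return max(stats)    # compared against a right-tail critical value

# Real-time monitoring recomputes bsadf as each new observation arrives.
rng = np.random.default_rng(1)
y = np.cumsum(rng.standard_normal(120))  # a bubble-free random walk
print(bsadf(y, swindow0=24))
```

The bootstrap step described in the abstract replaces asymptotic critical values with resampled ones to control family-wise size across the recursion; that layer is omitted here for brevity.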

Abstract

Causal relationships in econometrics are typically based on the concept of predictability and are established in terms of tests for Granger causality. These causal relationships are susceptible to change, especially during times of financial turbulence, making the real-time detection of instability an important practical issue. This paper develops a test for detecting changes in causal relationships based on a recursive rolling window, which is analogous to the procedure used in recent work on financial bubble detection. The limiting distribution of the test takes a simple form under the null hypothesis and is easy to implement in conditions of homoskedasticity, conditional heteroskedasticity and unconditional heteroskedasticity. Simulation experiments compare the efficacy of the proposed test with two other commonly used tests, the forward recursive and the rolling window tests. The results indicate that both the rolling and the recursive rolling approaches offer good finite sample performance in situations where there are one or two changes in the causal relationship over the sample period, although the performance of the rolling window algorithm seems to be the best. The testing strategies are illustrated in an empirical application that explores the causal impact of the slope of the yield curve on real economic activity in the United States over the period 1985–2013.
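
To make the mechanics concrete, a minimal rolling-window Granger causality scan in Python (our simplification: statsmodels' F test inside a fixed-length window, rather than the paper's recursive rolling statistic and its specialized critical values):

```python
# Rolling-window Granger causality sketch: slide a fixed window through the
# sample and record the F statistic for "x2 Granger-causes x1".
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

def rolling_granger(x1, x2, window=60, maxlag=4):
    data = np.column_stack([x1, x2])  # tests second column causing the first
    stats = []
    for end in range(window, len(x1) + 1):
        res = grangercausalitytests(data[end - window:end], maxlag=maxlag)
        f_stat = res[maxlag][0]["ssr_ftest"][0]   # F statistic at chosen lag
        stats.append(f_stat)
    return np.array(stats)  # spikes flag episodes of predictive causality

rng = np.random.default_rng(2)
x2 = rng.standard_normal(300)
x1 = 0.5 * np.roll(x2, 1) + rng.standard_normal(300)   # x2 leads x1
print(rolling_granger(x1, x2, window=80, maxlag=2).round(2)[:5])
```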

Abstract

Expansion and collapse are two key features of a financial asset bubble. Bubble expansion may be modeled using a mildly explosive process. Bubble implosion may take several different forms depending on the nature of the collapse and therefore requires some flexibility in modeling. This paper develops analytics and studies the performance characteristics of the real time bubble monitoring strategy proposed in Phillips, Shi and Yu (2014b,c, PSY) under alternative forms of bubble implosion that can be represented in terms of mildly integrated processes which capture various return paths to market normalcy. We propose a new reverse sample use of the PSY procedure for detecting crises and estimating the date of market recovery. Consistency of the dating estimators is established and the limit theory addresses new complications arising from the alternative forms of bubble implosion and the endogeneity effects present in the reverse regression. Simulations explore the finite sample performance of the strategy for dating market recovery and an illustration using Nasdaq stock market data is provided. A real-time version of the strategy is provided that is suited for practical implementation.
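
The reverse-sample idea is simple to state in code. A schematic sketch, with a purely illustrative stand-in detector (PSY use the first crossing of a sup ADF statistic, not the naive threshold rule below):

```python
# Reverse-sample dating sketch: run a detector on the time-reversed series so
# that the market recovery date shows up as an "origination" date, then map
# the detected index back to calendar time.
import numpy as np

def first_alarm(y):
    """Illustrative stand-in detector: first date the series exceeds a naive
    benchmark built from its initial observations."""
    bench = np.mean(y[:20]) + 2 * np.std(y[:20])
    hits = np.flatnonzero(y > bench)
    return hits[0] if hits.size else None

def recovery_date(y):
    y_rev = y[::-1]                 # time-reversed sample
    k = first_alarm(y_rev)          # alarm index in reversed time
    return None if k is None else len(y) - 1 - k   # back to calendar time

y = np.concatenate([np.linspace(0, 1, 50), np.linspace(1, 3, 30),
                    np.linspace(3, 1, 20)])   # rise, bubble, collapse
print(recovery_date(y))
```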

Abstract

This paper provides a novel mechanism for identifying and estimating latent group structures in panel data using penalized regression techniques. We focus on linear models where the slope parameters are heterogeneous across groups but homogeneous within a group and the group membership is unknown. Two approaches are considered: penalized least squares (PLS) for models without endogenous regressors, and penalized GMM (PGMM) for models with endogeneity. In both cases we develop a new variant of Lasso called classifier-Lasso (C-Lasso) that serves to shrink individual coefficients to the unknown group-specific coefficients. C-Lasso achieves simultaneous classification and consistent estimation in a single step and the classification exhibits the desirable property of uniform consistency. For PLS estimation C-Lasso also achieves the oracle property so that group-specific parameter estimators are asymptotically equivalent to infeasible estimators that use individual group identity information. For PGMM estimation the oracle property of C-Lasso is preserved in some special cases. Simulations demonstrate good finite-sample performance of the approach both in classification and estimation. An empirical application investigating the determinants of cross-country savings rates finds two latent groups among 56 countries, providing empirical confirmation that higher savings rates go hand in hand with higher income growth.
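
The PLS variant's penalty has a distinctive product form, sketched here in generic notation (K is the number of groups and \(\lambda\) a tuning parameter):

```latex
% C-Lasso PLS criterion (sketch, generic notation): the multiplicative
% penalty shrinks each individual coefficient beta_i toward one of the K
% group-level coefficients alpha_k without knowing group membership.
Q_{NT}(\beta,\alpha) = \frac{1}{NT}\sum_{i=1}^{N}\sum_{t=1}^{T}
  \big(y_{it} - \beta_i' x_{it}\big)^2
  + \frac{\lambda}{N}\sum_{i=1}^{N}\prod_{k=1}^{K}\lVert \beta_i - \alpha_k\rVert .
```

Because the penalty vanishes once \(\beta_i\) coincides with any one \(\alpha_k\), minimization classifies units and estimates the group coefficients simultaneously; the PGMM variant replaces the least squares fit with a GMM criterion.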

Abstract

This paper provides the limit theory of real time dating algorithms for bubble detection that were suggested in Phillips, Wu and Yu (2011, PWY) and Phillips, Shi and Yu (2013b, PSY). Bubbles are modeled using mildly explosive bubble episodes that are embedded within longer periods where the data evolves as a stochastic trend, thereby capturing normal market behavior as well as exuberance and collapse. Both the PWY and PSY estimates rely on recursive right tailed unit root tests (each with a different recursive algorithm) that may be used in real time to locate the origination and collapse dates of bubbles. Under certain explicit conditions, the moving window detector of PSY is shown to be a consistent dating algorithm even in the presence of multiple bubbles. The other algorithms are consistent detectors for bubbles early in the sample and, under stronger conditions, for subsequent bubbles in some cases. These asymptotic results and accompanying simulations guide the practical implementation of the procedures. They indicate that the PSY moving window detector is more reliable than the PWY strategy, sequential application of the PWY procedure and the CUSUM procedure.
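
In schematic form, the date-stamping rules compare a recursive statistic with a critical value sequence and take first crossing times (generic notation; each algorithm uses its own recursion and critical values):

```latex
% Generic first-crossing dating rule (sketch). BSADF_r denotes the
% detector's statistic at sample fraction r and cv_r its critical value.
\hat r_e = \inf\{ r : BSADF_r > cv_r \}, \qquad
\hat r_f = \inf\{ r > \hat r_e : BSADF_r < cv_r \},
```

where \(\hat r_e\) and \(\hat r_f\) estimate the origination and termination fractions of the bubble; consistency of these estimators is the subject of the limit theory above.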

Abstract

Identifying and dating explosive bubbles when there is periodically collapsing behavior over time has been a major concern in the economics literature and is of great importance for practitioners. The complexity of the nonlinear structure inherent in multiple bubble phenomena within the same sample period makes econometric analysis particularly difficult. The present paper develops new recursive procedures for practical implementation and surveillance strategies that may be employed by central banks and fiscal regulators. We show how the testing procedure and dating algorithm of Phillips, Wu and Yu (2011, PWY) are affected by multiple bubbles and may fail to be consistent. The present paper proposes a generalized version of the sup ADF test of PWY to address this difficulty, derives its asymptotic distribution, introduces a new date-stamping strategy for the origination and termination of multiple bubbles, and proves consistency of this dating procedure. Simulations show that the test significantly improves discriminatory power and leads to distinct power gains when multiple bubbles occur. Empirical applications are conducted to S&P 500 stock market data over a long historical period from January 1871 to December 2010. The new approach identifies many key historical episodes of exuberance and collapse over this period, whereas the strategy of PWY and the CUSUM procedure locate far fewer episodes in the same sample range.
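
The generalized test varies both endpoints of the subsample over which the right-tailed ADF statistic is computed (standard notation; \(r_0\) is the minimum window width as a fraction of the sample):

```latex
% Generalized sup ADF statistic: both the start r1 and end r2 of the
% subsample vary, subject to the minimum window width r0.
GSADF(r_0) = \sup_{\substack{r_2 \in [r_0,\, 1] \\ r_1 \in [0,\, r_2 - r_0]}}
             ADF_{r_1}^{r_2}.
```

Letting the window start \(r_1\) move is what restores discriminatory power against multiple, periodically collapsing bubbles, where the fixed-start recursion of PWY can fail.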

Abstract

Right-tailed unit root tests have proved promising for detecting exuberance in economic and financial activities. Like left-tailed tests, the limit theory and test performance are sensitive to the null hypothesis and the model specification used in parameter estimation. This paper aims to provide some empirical guidelines for the practical implementation of right-tailed unit root tests, focusing on the sup ADF test of Phillips, Wu and Yu (2011), which implements a right-tailed ADF test repeatedly on a sequence of forward sample recursions. We analyze and compare the limit theory of the sup ADF test under different hypotheses and model specifications. The size and power properties of the test under various scenarios are examined in simulations and some recommendations for empirical practice are given. Empirical applications to the Nasdaq and to Australian and New Zealand housing data illustrate these specification issues and reveal their practical importance in testing.
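
The building block is the right-tailed ADF regression and its forward-recursive sup, sketched in standard notation (how the intercept or drift term is specified under the null is precisely the issue the paper examines):

```latex
% Right-tailed ADF regression on the forward subsample [0, r2] and the
% sup ADF (SADF) test built from the recursion of such statistics.
\Delta y_t = \mu + \delta y_{t-1} + \sum_{j=1}^{p} \phi_j \Delta y_{t-j}
             + \varepsilon_t,
\qquad H_0 : \delta = 0 \ \ \text{vs} \ \ H_1 : \delta > 0,
\qquad
SADF(r_0) = \sup_{r_2 \in [r_0,\, 1]} ADF_{0}^{r_2}.
```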