
Pedro Gozalo Publications

Abstract

We develop kernel-based consistent tests of the hypothesis of additivity in nonparametric regression, extending recent work on testing parametric null hypotheses against nonparametric alternatives. The additivity hypothesis is of interest because it delivers interpretability and reasonably fast convergence rates for standard estimators. The asymptotic distributions of the tests under a sequence of local alternatives are found and compared; in particular, we rank the tests by local asymptotic power. The practical performance is investigated via simulations and an application to the German migration data of Linton and Härdle (1996).

Keywords: Additive regression models, Dimensionality reduction, Kernel estimation, Nonparametric regression, Testing

JEL Classification: C12, C13
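
The idea behind such tests can be illustrated with a short sketch. The Python snippet below is an illustration under simplifying assumptions, not the statistic developed in the paper: it contrasts an unrestricted bivariate Nadaraya-Watson fit with an additive fit obtained by marginal integration and uses a mean squared difference as the test statistic. The Gaussian kernel, the bandwidth, and the simulated data are hypothetical choices, and calibrating the null distribution (e.g., by bootstrap) is omitted.

```python
# Hedged sketch of a kernel-based additivity test: compare an unrestricted
# bivariate Nadaraya-Watson fit with an additive fit from marginal integration.
# Not the authors' statistic; bandwidth, kernel, and data are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def gauss(u):
    return np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)

def nw_2d(x1, x2, y, g1, g2, h):
    """Bivariate Nadaraya-Watson estimate at evaluation points (g1, g2)."""
    w = gauss((g1[:, None] - x1[None, :]) / h) * gauss((g2[:, None] - x2[None, :]) / h)
    return (w @ y) / w.sum(axis=1)

def additive_fit(x1, x2, y, h):
    """Additive fit by marginal integration of the unrestricted surface."""
    # Average the unrestricted fit over the other coordinate to recover each
    # additive component, then recentre so the overall mean enters once.
    m1 = np.array([nw_2d(x1, x2, y, np.full_like(x2, a), x2, h).mean() for a in x1])
    m2 = np.array([nw_2d(x1, x2, y, x1, np.full_like(x1, b), h).mean() for b in x2])
    return m1 + m2 - y.mean()

# Simulated data with an interaction term, so additivity should be violated.
n = 200
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
y = x1 + x2**2 + 0.5 * x1 * x2 + 0.2 * rng.standard_normal(n)

h = 0.3  # illustrative bandwidth
m_unrestricted = nw_2d(x1, x2, y, x1, x2, h)
m_additive = additive_fit(x1, x2, y, h)
stat = np.mean((m_unrestricted - m_additive) ** 2)
print(f"additivity test statistic: {stat:.4f}")
```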

Abstract

We introduce a new kernel smoother for nonparametric regression that uses prior information on regression shape in the form of a parametric model. In effect, we nonparametrically encompass the parametric model. We derive pointwise and uniform consistency and the asymptotic distribution of our procedure. It has superior performance to the usual kernel estimators at or near the parametric model. It is particularly well motivated for binary data using the probit or logit parametric model as a base. We include an application to the Horowitz (1993) transport choice dataset.
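
One common way to combine a parametric pilot with kernel smoothing for binary data, sketched below, is to fit a logit model first and then kernel-smooth the residuals as an additive correction; near the parametric model the correction is small, so the combined fit inherits the parametric shape. This is an illustration of the general idea, not necessarily the estimator of the paper, and the simulated data, Newton-Raphson logit fit, and bandwidth are assumptions made for the example.

```python
# Hedged sketch of a parametrically guided kernel smoother for binary data:
# logit pilot fit plus a Nadaraya-Watson correction on the residuals.
# Illustrative only; not the paper's estimator.
import numpy as np

rng = np.random.default_rng(1)

def logit_fit(x, y, iters=25):
    """Fit P(y=1|x) = 1/(1+exp(-(a+b*x))) by Newton-Raphson."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        grad = X.T @ (y - p)                       # score
        hess = X.T @ (X * (p * (1 - p))[:, None])  # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

def guided_smoother(x, y, grid, h):
    """Parametric logit fit plus a kernel-smoothed additive correction."""
    beta = logit_fit(x, y)
    fitted = lambda t: 1.0 / (1.0 + np.exp(-(beta[0] + beta[1] * t)))
    resid = y - fitted(x)
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    correction = (w @ resid) / w.sum(axis=1)       # Nadaraya-Watson on residuals
    return fitted(grid) + correction

# Binary responses whose true probability is close to, but not exactly, a
# logit curve; the closer the truth is to the pilot, the smaller the correction.
n = 400
x = rng.uniform(-2, 2, n)
p_true = np.clip(1.0 / (1.0 + np.exp(-1.5 * x)) + 0.05 * np.sin(3 * x), 0.01, 0.99)
y = (rng.uniform(size=n) < p_true).astype(float)

grid = np.linspace(-2, 2, 9)
print(np.round(guided_smoother(x, y, grid, h=0.4), 3))
```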