Nonparametric Regression of Possibly Similar Curves
In many situations one needs to estimate a set of curves that are believed to be similar in structure. For such cases, Ker (2000) suggests using external information from the other curves to reduce the bias of the standard nonparametric estimator of an individual regression function. In the density case, Ker showed that including external data in the estimation of a given density yields sizeable efficiency gains when the underlying densities are similar. While Ker focuses on bias reduction, Racine and Li (2000) and Altman and Casella (1995) devised estimators that reduce the variance of standard nonparametric methods by smoothing across possibly similar curves. All of these techniques share the same objective: improving on the standard nonparametric estimators. This thesis undertakes Monte Carlo simulations and two empirical applications to evaluate the potential gains from nonparametric techniques that incorporate external information. The simulations show that when the curves are similar in shape, the gains can be substantial: some of these methods significantly outperform the standard nonparametric estimator, reducing its mean integrated squared error by as much as 55%. The simulations also show that when the curves are dissimilar, some of the methods incorporating external data remain competitive with the standard nonparametric estimator.
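To make the idea concrete, the following is a minimal sketch, not any of the cited authors' actual estimators: a Nadaraya-Watson kernel regression in which the fit for one curve is shrunk toward a fit on the pooled data from all curves. The Gaussian kernel, the bandwidth `h`, and the shrinkage weight `lam` are illustrative assumptions; `lam = 1` recovers the standard single-curve estimator, while `lam < 1` borrows strength from the other, possibly similar, curves.

```python
import numpy as np

def nw_estimate(x0, x, y, h):
    """Standard Nadaraya-Watson estimate at x0 with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

def combined_estimate(x0, samples, i, h, lam):
    """Convex combination of the individual and pooled kernel fits.

    samples : list of (x, y) array pairs, one per curve
    i       : index of the curve of interest
    lam     : weight on the individual fit (lam = 1 ignores external data)

    This shrinkage scheme is only an illustration of smoothing across
    possibly similar curves, not the estimator of Ker (2000) or of
    Racine and Li (2000).
    """
    xi, yi = samples[i]
    own = nw_estimate(x0, xi, yi, h)                      # individual fit
    x_all = np.concatenate([x for x, _ in samples])       # pool all curves
    y_all = np.concatenate([y for _, y in samples])
    pooled = nw_estimate(x0, x_all, y_all, h)             # pooled fit
    return lam * own + (1.0 - lam) * pooled

# Example: two noisy draws from the same underlying curve sin(x).
rng = np.random.default_rng(0)
x1 = np.linspace(0, np.pi, 50)
x2 = np.linspace(0, np.pi, 50)
samples = [(x1, np.sin(x1) + 0.1 * rng.standard_normal(50)),
           (x2, np.sin(x2) + 0.1 * rng.standard_normal(50))]
print(combined_estimate(np.pi / 2, samples, i=0, h=0.3, lam=0.5))
```

When the underlying curves coincide, the pooled component effectively doubles the sample size and lowers the variance of the estimate; when they differ, it introduces bias, which is the trade-off the simulations in this thesis quantify.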
