Partially Dimension-Reduced Regressions with Potentially Infinite-Dimensional Processes
Regression models sometimes contain a linear parametric part and a part obtained by reducing the dimension of a larger set of data. This paper considers properties of estimates of the interpretable parameters of the model, in a general setting in which a potentially unbounded set of other variables may be relevant, and where the number of included factors or components representing these variables can also grow without bound as sample size increases. We show that consistent (and, under further restrictions, asymptotically normal) estimation of a parameter of interest is possible in this setting. We examine the selection of particular orthogonal directions using a criterion that takes into account both the magnitude of the eigenvalue and the correlation of the eigenvector with the variable of interest. Simulation experiments show that an implementation of this method may have good finite-sample performance.
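The following is a minimal illustrative sketch, not the paper's estimator: it augments an OLS regression of y on a variable of interest d with principal components of a high-dimensional block X, ranking directions by a hypothetical score that multiplies each eigenvalue by the squared correlation of its component scores with d. The function name partially_reduced_ols, the product form of the criterion, and the fixed number of retained components k are all assumptions introduced here for illustration.

```python
import numpy as np

def partially_reduced_ols(y, d, X, k):
    """Sketch: estimate the coefficient on the variable of interest d
    in a regression of y on (intercept, d, selected components of X).

    Directions are ranked by a hypothetical criterion combining the
    eigenvalue with the squared correlation between the component
    scores and d; the paper's exact criterion may differ.
    """
    n = len(y)
    Xc = X - X.mean(axis=0)                 # center the high-dimensional block
    dc = d - d.mean()

    # Eigendecomposition of the sample covariance of X
    cov = Xc.T @ Xc / n
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    scores = Xc @ eigvecs                   # component scores, one column per direction

    # Correlation of each component with the variable of interest
    corr = np.array([np.corrcoef(scores[:, j], dc)[0, 1]
                     for j in range(scores.shape[1])])

    # Hypothetical selection criterion: eigenvalue times squared correlation
    crit = eigvals * corr ** 2
    keep = np.argsort(crit)[::-1][:k]       # indices of the k highest-scoring directions

    # OLS of y on an intercept, d, and the selected components
    Z = np.column_stack([np.ones(n), d, scores[:, keep]])
    beta = np.linalg.lstsq(Z, y, rcond=None)[0]
    return beta[1]                          # coefficient on the variable of interest
```

In this sketch the number of retained components k is fixed by the user; in the setting of the paper the number of included components may grow with the sample size, so k would be chosen as a function of n rather than held constant.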