
Neural Computation

February 2018, Vol. 30, No. 2, Pages 477-504
(doi: 10.1162/neco_a_01035)
© 2018 Massachusetts Institute of Technology
Sufficient Dimension Reduction via Direct Estimation of the Gradients of Logarithmic Conditional Densities
Sufficient dimension reduction (SDR) is aimed at obtaining the low-rank projection matrix in the input space such that information about output data is maximally preserved. Among various approaches to SDR, a promising method is based on the eigendecomposition of the outer product of the gradient of the conditional density of output given input. In this letter, we propose a novel estimator of the gradient of the logarithmic conditional density that directly fits a linear-in-parameter model to the true gradient under the squared loss. Thanks to this simple least-squares formulation, its solution can be computed efficiently in a closed form. Then we develop a new SDR method based on the proposed gradient estimator. We theoretically prove that the proposed gradient estimator, as well as the SDR solution obtained from it, achieves the optimal parametric convergence rate. Finally, we experimentally demonstrate that our SDR method compares favorably with existing approaches in both accuracy and computational efficiency on a variety of artificial and benchmark data sets.
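The eigendecomposition step described in the abstract can be illustrated with a small sketch. The snippet below is not the authors' estimator: instead of fitting a linear-in-parameter model by least squares, it uses the analytically known gradient of the logarithmic conditional density for a simple Gaussian model (y depends on x only through a one-dimensional projection) as a stand-in, and then recovers the projection direction from the top eigenvector of the averaged gradient outer product. All variable names and the data-generating model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y depends on x only through the 1-D projection w^T x.
n, d = 2000, 5
w = np.array([1.0, 2.0, 0.0, 0.0, -1.0])
w /= np.linalg.norm(w)
sigma = 0.5
X = rng.standard_normal((n, d))
y = X @ w + sigma * rng.standard_normal(n)

# For p(y|x) = N(w^T x, sigma^2), the gradient of log p(y|x) w.r.t. x is
# ((y - w^T x) / sigma^2) * w. It is computed analytically here purely as
# a stand-in for the paper's least-squares gradient estimator.
G = ((y - X @ w) / sigma**2)[:, None] * w[None, :]  # (n, d) gradient matrix

# SDR step from the abstract: eigendecompose the average outer product of
# the gradients and keep the leading eigenvectors as projection directions.
M = G.T @ G / n
eigvals, eigvecs = np.linalg.eigh(M)
B = eigvecs[:, -1:]  # top eigenvector (target subspace dimension = 1)

# The recovered direction should span the same subspace as w.
alignment = abs(B[:, 0] @ w)
print(f"alignment with true direction: {alignment:.3f}")
```

Because the analytic gradients here are exact multiples of w, the averaged outer product is rank one and the alignment is essentially perfect; with an estimated gradient (as in the letter) the matrix would only be approximately low rank, and the gap between the leading and trailing eigenvalues would indicate how well the subspace is identified.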