Cannings, T. I., & Samworth, R. J. (2017). Random‐projection ensemble classification. Journal of the Royal Statistical Society, Series B: Statistical Methodology, 79(4), 959–1035.
Cook, R. D. (1996). Graphics for regressions with a binary response. Journal of the American Statistical Association, 91(435), 983–992.
Cook, R. D. (1998). Principal hessian directions revisited. Journal of the American Statistical Association, 93(441), 84–94.
Cook, R. D., & Li, B. (2002). Dimension reduction for conditional mean in regression. The Annals of Statistics, 30(2), 455–474.
Cook, R. D., & Ni, L. (2005). Sufficient dimension reduction via inverse regression: A minimum discrepancy approach. Journal of the American Statistical Association, 100(470), 410–428.
Cook, R. D., & Weisberg, S. (1991). Sliced inverse regression for dimension reduction: Comment. Journal of the American Statistical Association, 86(414), 328–332.
Fan, J., & Gijbels, I. (1996). Local polynomial modelling and its applications: Monographs on statistics and applied probability (Vol. 66). London: Chapman and Hall/CRC.
Friedman, J. H., & Stuetzle, W. (1981). Projection pursuit regression. Journal of the American Statistical Association, 76(376), 817–823.
Friedman, J. H., & Tukey, J. W. (1974). A projection pursuit algorithm for exploratory data analysis. IEEE Transactions on Computers, C-23(9), 881–890.
Fukumizu, K., Bach, F. R., & Jordan, M. I. (2004). Dimensionality reduction for supervised learning with reproducing kernel Hilbert spaces. Journal of Machine Learning Research, 5, 73–99.
Fukumizu, K., Bach, F. R., & Jordan, M. I. (2009). Kernel dimension reduction in regression. The Annals of Statistics, 37(4), 1871–1905.
Horowitz, J. L., & Härdle, W. (1996). Direct semiparametric estimation of single‐index models with discrete covariates. Journal of the American Statistical Association, 91(436), 1632–1640.
Hsing, T., & Carroll, R. J. (1992). An asymptotic theory for sliced inverse regression. The Annals of Statistics, 20, 1040–1061.
Kim, K., Li, B., Yu, Z., & Li, L. (to appear). On post dimension reduction statistical inference. The Annals of Statistics.
Kong, E., & Xia, Y. (2012). A single‐index quantile regression model and its estimation. Econometric Theory, 28(4), 730–768.
Kong, E., & Xia, Y. (2014). An adaptive composite quantile approach to dimension reduction. The Annals of Statistics, 42(4), 1657–1688.
Li, K. C. (1991). Sliced inverse regression for dimension reduction. Journal of the American Statistical Association, 86(414), 316–327.
Li, K. C. (1992). On principal hessian directions for data visualization and dimension reduction: Another application of Stein's lemma. Journal of the American Statistical Association, 87(420), 1025–1039.
Li, L. (2007). Sparse sufficient dimension reduction. Biometrika, 94(3), 603–613.
Luo, W., Li, B., & Yin, X. (2014). On efficient dimension reduction with respect to a statistical functional of interest. The Annals of Statistics, 42(1), 382–412.
Ma, Y., & Zhang, X. (2015). A validated information criterion to determine the structural dimension in dimension reduction models. Biometrika, 102(2), 409–420.
Ma, Y., & Zhu, L. (2012). A semiparametric approach to dimension reduction. Journal of the American Statistical Association, 107(497), 168–179.
Ma, Y., & Zhu, L. (2013a). Efficient estimation in sufficient dimension reduction. The Annals of Statistics, 41(1), 250–268.
Ma, Y., & Zhu, L. (2013b). A review on dimension reduction. International Statistical Review, 81(1), 134–150.
Ma, Y., & Zhu, L. (2014). On estimation efficiency of the central mean subspace. Journal of the Royal Statistical Society, Series B: Statistical Methodology, 76(5), 885–901.
Mooij, J. M., Peters, J., Janzing, D., Zscheischler, J., & Schölkopf, B. (2016). Distinguishing cause from effect using observational data: Methods and benchmarks. Journal of Machine Learning Research, 17(1), 1103–1204.
Schott, J. R. (1994). Determining the dimensionality in sliced inverse regression. Journal of the American Statistical Association, 89(425), 141–148.
Sheng, W., & Yin, X. (2016). Sufficient dimension reduction via distance covariance. Journal of Computational and Graphical Statistics, 25(1), 91–104.
Suzuki, T., & Sugiyama, M. (2010). Sufficient dimension reduction via squared‐loss mutual information estimation. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics (pp. 804–811). Sardinia, Italy.
Wang, H., & Xia, Y. (2008). Sliced regression for dimension reduction. Journal of the American Statistical Association, 103(482), 811–821.
Wang, Q., Yin, X., & Critchley, F. (2015). Dimension reduction based on the Hellinger integral. Biometrika, 102(1), 95–106.
Wang, T., & Xia, Y. (2014). A piecewise single‐index model for dimension reduction. Technometrics, 56(3), 312–324.
Weisberg, S. (2002). Dimension reduction regression in R. Journal of Statistical Software, 7(1), 1–22.
Xia, Y. (2007). A constructive approach to the estimation of dimension reduction directions. The Annals of Statistics, 35(6), 2654–2690.
Xia, Y. (2008). A multiple‐index model and dimension reduction. Journal of the American Statistical Association, 103(484), 1631–1640.
Xia, Y., Tong, H., Li, W. K., & Zhu, L. X. (2002). An adaptive estimation of dimension reduction space. Journal of the Royal Statistical Society, Series B: Statistical Methodology, 64(3), 363–410.
Yin, X. (2011). Sufficient dimension reduction in regression. In T. Cai, & X. Shen (Eds.), Frontiers of Statistics, Vol. 2. High‐dimensional data analysis (pp. 257–273). Singapore: World Scientific.
Yin, X., & Cook, R. D. (2002). Dimension reduction for the conditional kth moment in regression. Journal of the Royal Statistical Society, Series B: Statistical Methodology, 64(2), 159–175.
Yin, X., & Hilafu, H. (2015). Sequential sufficient dimension reduction for large p, small n problems. Journal of the Royal Statistical Society, Series B: Statistical Methodology, 77(4), 879–892.
Yin, X., & Li, B. (2011). Sufficient dimension reduction based on an ensemble of minimum average variance estimators. The Annals of Statistics, 39(6), 3392–3416.
Yin, X., Li, B., & Cook, R. D. (2008). Successive direction extraction for estimating the central subspace in a multiple‐index regression. Journal of Multivariate Analysis, 99(8), 1733–1757.
Zeng, P., He, T., & Zhu, Y. (2012). A lasso‐type approach for estimation and variable selection in single index models. Journal of Computational and Graphical Statistics, 21(1), 92–109.