
Multilinear and nonlinear generalizations of partial least squares: an overview of recent advances


Partial least squares (PLS) is an efficient multivariate statistical regression technique that has proven particularly useful for the analysis of highly collinear data. To predict response variables Y from independent variables X, PLS seeks a set of common orthogonal latent variables by projecting both X and Y onto a new subspace. With the growing interest in multi-way analysis, the model has also been extended to multilinear regression, with the aim of analyzing data given as two multidimensional tensors. In this article, we give an overview of PLS-related methods, including linear, multilinear, and nonlinear variants, and discuss the strengths of the algorithms. Since canonical correlation analysis (CCA) is a closely related technique that aims to extract the most correlated latent components between two datasets, we also briefly discuss its extension to tensor space. Finally, several examples are given that compare these methods on regression and classification tasks.
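
As a concrete illustration of the linear case, here is a minimal sketch of PLS regression using scikit-learn's PLSRegression; the synthetic data, variable names, and number of components are illustrative assumptions, not taken from the article.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Synthetic, highly collinear data: 100 samples driven by 2 hidden factors
    rng = np.random.default_rng(0)
    latent = rng.standard_normal((100, 2))
    X = latent @ rng.standard_normal((2, 10)) + 0.01 * rng.standard_normal((100, 10))
    Y = latent @ rng.standard_normal((2, 3)) + 0.01 * rng.standard_normal((100, 3))

    # PLS projects both X and Y onto a shared low-dimensional latent subspace
    pls = PLSRegression(n_components=2)
    pls.fit(X, Y)

    T = pls.transform(X)     # latent variables (scores) for X
    Y_pred = pls.predict(X)  # prediction routed through the latent space

Despite the collinearity in X, the two-component model captures the predictive structure, which is exactly the regime where PLS is preferred over ordinary least squares.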

Conflict of interest: The authors have declared no conflicts of interest for this article.

The PLS model: data decomposition as a sum of rank‐one matrices.
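
Written out (standard PLS notation, assumed rather than quoted from the article), the decomposition in the figure is

    $\mathbf{X} = \mathbf{T}\mathbf{P}^{\top} + \mathbf{E} = \sum_{r=1}^{R} \mathbf{t}_r \mathbf{p}_r^{\top} + \mathbf{E}, \qquad \mathbf{Y} = \mathbf{U}\mathbf{Q}^{\top} + \mathbf{F},$

where each outer product $\mathbf{t}_r \mathbf{p}_r^{\top}$ is a rank-one matrix, and the score matrices $\mathbf{T}$ and $\mathbf{U}$ are linked through an inner regression model.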
Visualization of the test dataset in the two-dimensional kernel-based tensor canonical correlation analysis (KTCCA) latent space. Note that the first two components obtained from KTCCA are discriminative for action classification.
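
Readers who want to experiment with the linear counterpart of this idea can use scikit-learn's CCA to extract maximally correlated components from two paired views; the sketch below uses synthetic matrices in place of tensor-valued video features, since kernel and tensor CCA variants are not available in scikit-learn.

    import numpy as np
    from sklearn.cross_decomposition import CCA

    # Two paired synthetic views sharing a common 2D latent signal
    rng = np.random.default_rng(0)
    shared = rng.standard_normal((60, 2))
    X_view = shared @ rng.standard_normal((2, 8)) + 0.1 * rng.standard_normal((60, 8))
    Y_view = shared @ rng.standard_normal((2, 6)) + 0.1 * rng.standard_normal((60, 6))

    # Extract the two most correlated latent components (cf. the 2D KTCCA
    # latent space above); plotting Xc would visualize the shared structure
    cca = CCA(n_components=2)
    cca.fit(X_view, Y_view)
    Xc, Yc = cca.transform(X_view, Y_view)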
Three examples of video sequences in tensor form for H‐W, H‐C, and walking actions.
The prediction performance for three-dimensional (3D) movement trajectories recorded from Elbow, Wrist, and Hand, using four regression models: linear partial least squares (LP), higher-order partial least squares (HP), kernel-based tensor partial least squares (KTPLS) with a chordal-distance-based kernel (KT-1), and KTPLS with a Kullback-Leibler (KL) divergence-based kernel (KT-2). The correlation coefficients $r^2$ between predictions and real data, shown in (a), indicate that the best performance is obtained by KT-1, whereas the criterion $Q^2 = 1 - \|\hat{\mathbf{y}} - \mathbf{y}\|^2 / \|\mathbf{y}\|^2$, shown in (b), indicates that KT-2 outperforms the other methods.
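
The $Q^2$ criterion in panel (b) can be computed directly from predictions; a minimal sketch (function and array names are illustrative):

    import numpy as np

    def q_squared(y_true, y_pred):
        """Q^2 = 1 - ||y_pred - y_true||^2 / ||y_true||^2; higher is better."""
        y_true = np.asarray(y_true).ravel()
        y_pred = np.asarray(y_pred).ravel()
        return 1.0 - np.sum((y_pred - y_true) ** 2) / np.sum(y_true ** 2)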
Visualization of the higher-order partial least squares (HOPLS) model for the decomposition of $\underline{\mathbf{X}}$. (a) Spatial loadings $\mathbf{P}_r^{(1)}$ corresponding to the first five latent vectors; each row shows five significant loading vectors. (b) Likewise, time-frequency loadings $\mathbf{P}_r^{(2)}$, with the β- and γ-bands exhibiting significant contributions.
The scheme for decoding three-dimensional (3D) hand movement trajectories from electrocorticography (ECoG) signals.
Schematic diagram of the higher-order partial least squares (HOPLS) model: approximating $\underline{\mathbf{X}}$ as a sum of rank-$(1, L_2, L_3)$ tensors. The approximation of $\underline{\mathbf{Y}}$ follows the same principle, with the latent components $\mathbf{T}$ shared between the two decompositions.
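
In equations (following common HOPLS notation for a third-order tensor; a sketch, as symbols may differ slightly from the figure), the model reads

    $\underline{\mathbf{X}} \approx \sum_{r=1}^{R} \underline{\mathbf{G}}_r \times_1 \mathbf{t}_r \times_2 \mathbf{P}_r^{(1)} \times_3 \mathbf{P}_r^{(2)},$

where each block term has multilinear rank $(1, L_2, L_3)$, $\mathbf{t}_r$ is the $r$-th latent vector shared with the decomposition of $\underline{\mathbf{Y}}$, and $\mathbf{P}_r^{(1)}$, $\mathbf{P}_r^{(2)}$ are loading matrices with $L_2$ and $L_3$ columns, respectively.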
The N‐way partial least squares (N‐PLS) model: data decomposition as a sum of rank‐one tensors and a sum of rank‐one matrices.
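
For a third-order tensor $\underline{\mathbf{X}}$ and a response matrix $\mathbf{Y}$, this decomposition takes the form (standard N-PLS notation, assumed rather than quoted from the article)

    $\underline{\mathbf{X}} = \sum_{r=1}^{R} \mathbf{t}_r \circ \mathbf{w}_r^{J} \circ \mathbf{w}_r^{K} + \underline{\mathbf{E}}, \qquad \mathbf{Y} = \sum_{r=1}^{R} \mathbf{u}_r \mathbf{q}_r^{\top} + \mathbf{F},$

where $\circ$ denotes the vector outer product, so each term of the first sum is a rank-one tensor and each term of the second sum is a rank-one matrix.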
