WIREs Comp Stat

Nonparametric covariance estimation with shrinkage toward stationary models

Abstract

Estimation of an unstructured covariance matrix is difficult because of the challenges posed by parameter space dimensionality and the positive‐definiteness constraint that estimates should satisfy. We consider a general nonparametric covariance estimation framework for longitudinal data using the Cholesky decomposition of a positive‐definite matrix. The covariance matrix of time‐ordered measurements is diagonalized by a lower triangular matrix with unconstrained entries that are statistically interpretable as parameters for a varying coefficient autoregressive model. Using this dual interpretation of the Cholesky decomposition and allowing for irregular sampling time points, we treat covariance estimation as bivariate smoothing and cast it in a regularization framework for desired forms of simplicity in covariance models. Viewing stationarity as a form of simplicity or parsimony in covariance, we model the varying coefficient function with components depending on time lag and its orthogonal direction separately and penalize the components that capture the nonstationarity in the fitted function. We demonstrate construction of a covariance estimator using the smoothing spline framework. Simulation studies establish the advantage of our approach over alternative estimators proposed in the longitudinal data setting. We analyze a longitudinal dataset to illustrate application of the methodology and compare our estimates to those resulting from alternative models.

This article is categorized under:

Data: Types and Structure > Time Series, Stochastic Processes, and Functional Data
Statistical and Graphical Methods of Data Analysis > Nonparametric Methods
Algorithms and Computational Methods > Maximum Likelihood Methods
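To make the Cholesky parameterization concrete, the following minimal NumPy sketch (an illustration under assumed notation, not the authors' implementation) computes the modified Cholesky decomposition: a unit lower triangular matrix T and a diagonal matrix D of innovation variances with T Σ Tᵀ = D, where the below-diagonal entries of −T are the autoregressive coefficients that the varying coefficient model smooths. The AR(1)-style toy covariance is purely for demonstration.

import numpy as np

def modified_cholesky(Sigma):
    # Decompose Sigma so that T @ Sigma @ T.T == D, with T unit lower triangular.
    L = np.linalg.cholesky(Sigma)      # Sigma = L @ L.T
    d = np.diag(L)                     # square roots of the innovation variances
    C = L / d                          # rescale columns so C has a unit diagonal
    T = np.linalg.inv(C)               # below-diagonal entries of T are -phi_{tj}
    D = np.diag(d ** 2)                # diagonal matrix of innovation variances
    return T, D

# Toy covariance on five equally spaced time points (illustrative only).
rho = 0.6
idx = np.arange(5)
Sigma = rho ** np.abs(np.subtract.outer(idx, idx))

T, D = modified_cholesky(Sigma)
print(np.allclose(T @ Sigma @ T.T, D))   # True: Sigma is diagonalized by T
phi = -np.tril(T, k=-1)                  # varying autoregressive coefficients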
Covariance Models I–V used for simulation and corresponding estimates with various methods. True covariance structures are shown in the first row followed by their estimates from the oracle estimator, smoothing spline analysis of variance (ANOVA) estimator, parametric polynomial estimator, the sample covariance matrix, the tapered sample covariance matrix, and the soft thresholding estimator
Heatmaps of the true covariance matrices corresponding to Models I–V and ϕ defining the corresponding Cholesky factor T. The smallest elements of each matrix correspond to dark green pixels; the light pink (white) pixels correspond to the large (largest) elements of the matrix
Coordinate transformation of a pair of time points (t, s) for t > s in the left panel to the lag and average in the additive direction, (l, m), in the right panel
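The transformation pictured here is a simple linear map of the ordered time pair into lag and average coordinates. As a quick worked illustration (hypothetical helper, not taken from the article): l = t − s and m = (t + s)/2, so (t, s) = (8, 2) maps to (l, m) = (6, 5).

def lag_average(t, s):
    # lag direction l = t - s; additive (average) direction m = (t + s) / 2
    return t - s, (t + s) / 2.0

print(lag_average(8.0, 2.0))   # -> (6.0, 5.0)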
The estimated main effect of additive direction m = (t + s)/2 superimposed over the sample regressogram for the cattle data from treatment Group A, and the estimated interaction between lag (l) and additive direction (m) evaluated on the grid defined by the observed time points
Sample regressogram and log innovation variances for the cattle data from treatment Group A. The dashed line in (a) is the polynomial fit (degree 5), and the solid line is the estimated main effect of l = t − s of the cubic smoothing spline fit. The dashed line superimposed over the log innovation variances in (b) is the polynomial fit (degree 3), and the solid line is the cubic smoothing spline fit
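The two fits compared in this caption can be sketched with standard tools. The snippet below uses simulated lag/coefficient pairs (not the cattle data) and fits a degree-5 polynomial alongside SciPy's cubic smoothing spline, which stands in here for the smoothing spline fit described in the article.

import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(0)
lag = np.sort(rng.uniform(1.0, 10.0, 80))                         # simulated lags l = t - s
phi = 0.8 * np.exp(-0.3 * lag) + rng.normal(0.0, 0.05, lag.size)  # noisy regressogram-style values

poly = np.polynomial.Polynomial.fit(lag, phi, deg=5)   # degree-5 polynomial fit
spline = UnivariateSpline(lag, phi, k=3)               # cubic smoothing spline

grid = np.linspace(1.0, 10.0, 200)
poly_curve, spline_curve = poly(grid), spline(grid)    # fitted curves on a fine grid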
Subject‐specific weight curves over time for treatment Groups A and B in panels (a) and (b), respectively
