Open Access: Adversarially robust subspace learning in the spiked covariance model

Each week, we select a recently published Open Access article to feature. This week’s article comes from Statistical Analysis and Data Mining and studies robust subspace learning. 

The article’s abstract is given below, with the full article available to read here.

Sha, F., Zhang, R. (2022). Adversarially robust subspace learning in the spiked covariance model. Stat. Anal. Data Min.: ASA Data Sci. J., 1–10.
We study the problem of robust subspace learning when an adversary can attack the data to increase the projection error. By deriving the adversarial projection risk when the data follow a multivariate Gaussian distribution with a spiked covariance (the so-called spiked covariance model), we propose to use the empirical risk minimization method to obtain the optimal robust subspace. We then establish a non-asymptotic upper bound on the adversarial excess risk, which implies that the empirical risk minimization estimator is close to the optimal robust adversarial subspace. For the rank-one spiked covariance model, the optimization problem can be solved easily by the projected gradient descent algorithm. In general, however, solving the empirical risk minimization problem is computationally intractable, so we propose to minimize an upper bound of the empirical risk to find the robust subspace for the general spiked covariance model. Finally, we conduct numerical experiments to demonstrate the robustness of the proposed algorithms.
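To give a flavor of the rank-one case mentioned in the abstract, here is a minimal, hypothetical sketch of projected gradient descent for learning a one-dimensional subspace under a spiked covariance. It is not the article's algorithm: the data model, the ℓ2-bounded adversary, the closed-form worst-case loss (‖(I − vvᵀ)x‖ + ε)², and all parameter values are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only (not the article's method). Assumptions:
# data x ~ N(0, theta * u u^T + I) (rank-one spiked covariance),
# the adversary perturbs each point within an L2 ball of radius eps,
# giving worst-case projection loss (||(I - vv^T)x|| + eps)^2.

rng = np.random.default_rng(0)
d, n, theta, eps = 20, 500, 5.0, 0.5
u = np.zeros(d); u[0] = 1.0                      # true spike direction
X = rng.normal(size=(n, d)) + np.sqrt(theta) * rng.normal(size=(n, 1)) * u

v = rng.normal(size=d); v /= np.linalg.norm(v)   # random init on the unit sphere
lr = 0.01
for _ in range(200):
    proj = X @ v                                  # v^T x_i
    resid = np.sqrt(np.maximum(np.sum(X**2, axis=1) - proj**2, 1e-12))
    # gradient of (1/n) * sum_i (resid_i + eps)^2 with respect to v
    grad = -(2.0 / n) * ((resid + eps) * proj / resid) @ X
    v -= lr * grad
    v /= np.linalg.norm(v)                        # project back to the sphere

alignment = abs(v @ u)                            # close to 1 if the spike is recovered
```

The projection step (renormalizing `v` after each gradient update) is what makes this "projected" gradient descent: the iterate is kept on the unit sphere, the feasible set for a one-dimensional subspace.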