
Neural Computation

January 2017, Vol. 29, No. 1, Pages 247-262
(doi: 10.1162/NECO_a_00901)
© 2016 Massachusetts Institute of Technology
LSV-Based Tail Inequalities for Sums of Random Matrices

Random matrix techniques play an important role in many machine learning models. In this letter, we present a new method for studying tail inequalities for sums of random matrices. Unlike previous work (Ahlswede & Winter, 2002; Tropp, 2012; Hsu, Kakade, & Zhang, 2012), our tail bounds are based on the largest singular value (LSV) and are independent of the matrix dimension. Since the LSV operation and the expectation do not commute, we introduce a diagonalization method that converts the LSV operation into the trace of an infinite-dimensional diagonal matrix. In this way, we obtain a new version of the Laplace-transform bound and then derive the LSV-based tail inequalities for sums of random matrices.
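The phenomenon the letter studies can be observed numerically. The sketch below is not the authors' method; it is a minimal illustration, under assumed parameters (dimension `d`, number of summands `n`, and a symmetric ±1 entry distribution chosen for convenience), of how the largest singular value of a normalized sum of i.i.d. random matrices concentrates around its mean, which is what a dimension-free tail inequality quantifies.

```python
import numpy as np

# Illustrative parameters (assumptions, not from the letter):
# d = matrix dimension, n = number of summands, trials = Monte Carlo repeats.
rng = np.random.default_rng(0)
d, n, trials = 20, 50, 200

lsvs = []
for _ in range(trials):
    total = np.zeros((d, d))
    for _ in range(n):
        a = rng.choice([-1.0, 1.0], size=(d, d))
        x = (a + a.T) / 2.0          # symmetric random matrix, entries in [-1, 1]
        total += x / np.sqrt(n)      # normalize so the sum has O(1) entry variance
    # For a symmetric matrix the spectral norm equals the largest singular value.
    lsvs.append(np.linalg.norm(total, 2))

lsvs = np.array(lsvs)
# The spread of the LSV across trials is small relative to its mean,
# i.e., the LSV of the sum concentrates -- the behavior that
# Laplace-transform-style tail bounds make precise.
print(f"mean LSV: {lsvs.mean():.2f}, std: {lsvs.std():.2f}")
```

Running this shows a tight cluster of LSV values across trials; a tail inequality of the kind derived in the letter bounds the probability that the LSV deviates from this typical value by more than a threshold t.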