Randomized Rank-Revealing QLP for Low-Rank Matrix Decomposition

The pivoted QLP decomposition, computed through two consecutive pivoted QR decompositions, approximates the computationally prohibitive singular value decomposition (SVD). This work is concerned with a partial QLP decomposition of matrices obtained through random sampling. The presented method, tailored to low-rank matrices, is called Randomized Unpivoted QLP (RU-QLP). Like pivoted QLP, RU-QLP is rank-revealing, yet it relies on randomized column sampling and unpivoted QR decompositions. These modifications make RU-QLP highly scalable and parallelizable on advanced computational platforms. We provide an analysis of RU-QLP that gives insight into its characteristics and performance behavior. In particular, we derive bounds, in both the spectral and Frobenius norms, on: i) the rank-revealing property; ii) the principal angles between approximate subspaces and exact singular subspaces and vectors; and iii) the errors of low-rank approximations. The effectiveness of the bounds is illustrated through numerical tests. We further use a modern multicore machine equipped with a GPU to demonstrate the efficiency of RU-QLP. Our results show that, compared with randomized SVD, RU-QLP achieves speedups of up to 7.1 and 8.5 times on the CPU and up to 2.3 and 5.8 times on the GPU for the decomposition of dense and sparse matrices, respectively.
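
To make the construction concrete, below is a minimal NumPy sketch of the general randomized, unpivoted QLP idea the abstract describes: sketch the range of A by randomized column sampling, orthonormalize with an unpivoted QR, project, and take a second unpivoted QR to obtain a lower-triangular middle factor. The function name and the oversampling and power-iteration parameters are illustrative assumptions; this is not the authors' exact RU-QLP algorithm.

```python
import numpy as np

def randomized_qlp(A, k, oversample=10, n_power_iter=1, seed=None):
    """Illustrative randomized, unpivoted QLP-style factorization.

    Returns orthonormal Q (m x l), lower-triangular L (l x l), and
    orthonormal P (n x l) with A ~= Q @ L @ P.T, where l = k + oversample.
    A hedged sketch of the general technique, not the paper's RU-QLP.
    """
    rng = np.random.default_rng(seed)
    m, n = A.shape
    l = min(k + oversample, m, n)

    # Randomized column sampling: sketch the range of A with a Gaussian test matrix.
    Omega = rng.standard_normal((n, l))
    Y = A @ Omega
    Q, _ = np.linalg.qr(Y)            # unpivoted (economy) QR

    # Optional power iterations to sharpen the captured subspace.
    for _ in range(n_power_iter):
        Z, _ = np.linalg.qr(A.T @ Q)
        Q, _ = np.linalg.qr(A @ Z)

    # Project A onto the sampled subspace, then a second unpivoted QR of the
    # transpose gives the lower-triangular middle factor of the QLP form.
    B = Q.T @ A                        # l x n
    P, R = np.linalg.qr(B.T)           # B.T = P @ R, so B = R.T @ P.T
    L = R.T                            # lower triangular

    return Q, L, P

# Example use on a synthetic low-rank matrix (illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((1000, 30)) @ rng.standard_normal((30, 800))
Q, L, P = randomized_qlp(A, k=30)
rel_err = np.linalg.norm(A - Q @ L @ P.T) / np.linalg.norm(A)
print(rel_err)  # small for a rank-30 matrix
```

In this sketch the diagonal of L plays the role of the rank-revealing estimates of the leading singular values, and only unpivoted QR and matrix multiplication are used, which is what makes the approach amenable to multicore and GPU execution.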