Streaming low-rank approximation of tree tensor networks
Alberto Bucci, Gianfranco Verzella
Abstract
Low-rank tensor approximation has emerged as a powerful tool in scientific computing, enabling the efficient handling of large-scale linear and multilinear algebra problems that would otherwise be computationally infeasible with classical methods. By exploiting the inherent low-dimensional structure within high-dimensional data, these techniques reduce storage costs and computational complexity, making it possible to approximate solutions to problems in fields as diverse as quantum physics, machine learning, and computational biology.
Recent advances in randomized techniques for low-rank matrix approximation, including methods like the randomized SVD [1] and the generalized Nyström method [2, 3], have paved the way for substantial progress in tensor approximation as well. A range of specialized randomized methods has emerged for tensor decompositions. For instance, the randomized higher-order SVD and its sequentially truncated version [4] provide efficient tools for approximating tensors in Tucker format. Likewise, randomized adaptations of TT-SVD [5] extend matrix-based techniques to the tensor train format, enabling the approximation of high-dimensional data while mitigating the curse of dimensionality.
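As a point of reference, here is a minimal NumPy sketch of the randomized SVD of [1]; the function name and the oversampling parameter p are illustrative choices, not part of the original presentation.

```python
import numpy as np

def randomized_svd(A, r, p=10):
    """Rank-r randomized SVD of A with oversampling p, following [1]."""
    Omega = np.random.randn(A.shape[1], r + p)   # Gaussian test matrix
    Q, _ = np.linalg.qr(A @ Omega)               # orthonormal basis for range(A @ Omega)
    B = Q.T @ A                                  # small (r+p) x n problem
    Ub, s, Vt = np.linalg.svd(B, full_matrices=False)
    return (Q @ Ub)[:, :r], s[:r], Vt[:r, :]     # truncate to rank r
```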
The multilinear Nyström method [6] and its sequential counterpart [7], together with the streaming tensor train approximation [8], further advance this field, allowing for the streaming low-rank approximation of a given tensor \(\mathcal {A}\) in the Tucker format and the tensor train format, respectively.
These methods all build on the generalized Nyström approach, accessing the tensor \(\mathcal {A}\) exclusively through two-sided random sketches of the original data; this makes them single-pass and facilitates parallel implementation.
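In the matrix case, the generalized Nyström approximation can be sketched in a few lines of NumPy; oversampling the left sketch follows the stabilization suggested in [3], while the variable names and the least-squares solve are illustrative choices.

```python
import numpy as np

def generalized_nystrom(A, r, p=10):
    """Rank-r generalized Nystrom approximation of A, in the spirit of [2, 3].

    A is touched only through the two sketches A @ Omega and Psi.T @ A,
    which is what makes the approach single-pass and streamable."""
    m, n = A.shape
    Omega = np.random.randn(n, r)        # right Gaussian sketching matrix
    Psi = np.random.randn(m, r + p)      # oversampled left sketching matrix
    AOm = A @ Omega                      # right sketch, m x r
    PsA = Psi.T @ A                      # left sketch, (r+p) x n
    core = Psi.T @ AOm                   # small core, (r+p) x r
    # A_hat = AOm @ pinv(core) @ PsA, computed via least squares for stability
    return AOm @ np.linalg.lstsq(core, PsA, rcond=None)[0]
```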
Tucker and tensor train decompositions are specific instances of the more general tree tensor network (TTN) decomposition, in which the underlying tree takes the form of a star or a chain, respectively.
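Concretely, for a \(d\)-dimensional tensor these two special cases read
\[
\mathcal{A} = \mathcal{G} \times_1 U_1 \times_2 U_2 \cdots \times_d U_d \qquad \text{(Tucker, star)},
\]
\[
\mathcal{A}(i_1,\dots,i_d) = G_1(i_1)\, G_2(i_2) \cdots G_d(i_d) \qquad \text{(tensor train, chain)},
\]
where \(\times_k\) denotes the mode-\(k\) product with the factor matrix \(U_k\), and \(G_k(i_k)\) is the \(i_k\)-th matrix slice of the \(k\)-th tensor train core; the symbols \(\mathcal{G}\), \(U_k\), and \(G_k\) are standard notation introduced here for illustration.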
By combining the multilinear Nyström method [6] with the streaming tensor train approximation [8], we introduce the tree tensor network Nyström (TTNN) algorithm [9]: a novel approach for the streaming low-rank approximation of tensors in the tree tensor network format. We further introduce a sequential variant of the algorithm that operates on increasingly compressed versions of the tensor while, remarkably, preserving streamability, and we provide a detailed analysis of the accuracy of both methods.
However, in practical applications, tensors are often provided in a low-rank TTN format, as working with the full tensor would be computationally prohibitive. In these cases, the challenge lies in achieving further compression or rounding of these representations.
We demonstrate that our algorithm can be readily adapted to this specific setting by leveraging structured embeddings.
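To illustrate the kind of structure that can be exploited, the following sketch applies rank-one (Kronecker-form) Gaussian measurement vectors to a tensor given in tensor train format, contracting one core at a time so the full tensor is never formed. This is only a minimal example of a structured sketch under the stated shape conventions; it is not the specific embedding used by TTNN.

```python
import numpy as np

def sketch_tt(cores, num_samples):
    """Sketch a tensor given by tensor train cores with rank-one
    (Kronecker-form) Gaussian measurement vectors, contracting one
    core at a time; the full tensor is never formed.

    cores[k] is assumed to have shape (r_{k-1}, n_k, r_k) with
    r_0 = r_d = 1. Illustrative only, not the embedding used by TTNN."""
    samples = np.empty(num_samples)
    for s in range(num_samples):
        v = np.ones((1, 1))
        for G in cores:
            omega = np.random.randn(G.shape[1])       # random vector for this mode
            v = v @ np.einsum('inj,n->ij', G, omega)  # contract mode, keep rank indices
        samples[s] = v.item()                         # r_d = 1 yields a scalar
    return samples
```

The cost of each sample grows linearly in the order of the tensor, in contrast with the exponential cost of forming the tensor explicitly.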
Our results indicate that TTNN achieves a nearly optimal approximation error when the sketch sizes are appropriately selected. A series of numerical experiments further illustrates the performance of TTNN in comparison with existing deterministic and randomized methods.
References
[1] Halko, N., Martinsson, P.G., Tropp, J.A. (2011). Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions. SIAM Review, 53(2), pp. 217-288.
[2] Clarkson, K.L., Woodruff, D.P. (2009). Numerical linear algebra in the streaming model. Proceedings of the Forty-First Annual ACM Symposium on Theory of Computing, pp. 205-214.
[3] Nakatsukasa, Y. (2020). Fast and stable randomized low-rank matrix approximation. arXiv preprint arXiv:2009.11392.
[4] Ahmadi-Asl, S., Abukhovich, S., Asante-Mensah, M.G., Cichocki, A., Phan, A.H., Tanaka, T., Oseledets, I. (2021). Randomized algorithms for computation of Tucker decomposition and higher order SVD (HOSVD). IEEE Access, 9, pp. 28684-28706.
[5] Al Daas, H., Ballard, G., Cazeaux, P., Hallman, E., Miedlar, A., Pasha, M., Reid, T.W., Saibaba, A.K. (2023). Randomized algorithms for rounding in the tensor-train format. SIAM Journal on Scientific Computing, 45(1), pp. A74-A95.
[6] Bucci, A., Robol, L. (2024). A multilinear Nyström algorithm for low-rank approximation of tensors in Tucker format. SIAM Journal on Matrix Analysis and Applications, 45(4), pp. 1929-1953.
[7] Bucci, A., Hashemi, B. (2024). A sequential multilinear Nyström algorithm for streaming low-rank approximation of tensors in Tucker format. Applied Mathematics Letters, 159, p. 109271.
[8] Kressner, D., Vandereycken, B., Voorhaar, R. (2023). Streaming tensor train approximation. SIAM Journal on Scientific Computing, 45(5), pp. A2610-A2631.
[9] Bucci, A., Verzella, G. (2025). Streaming low-rank approximation of tree tensor networks. In preparation.