Tensor decomposition is an essential technique in high-dimensional data analysis and prediction, serving as a fundamental tool for uncovering the multi-faceted structures inherent in tensor data. Traditional methods such as CANDECOMP/PARAFAC (CP) and Tucker decomposition pioneered this area. However, these methods struggle with the sparsity and noise common in tensor data, and they lack mechanisms to handle the dynamic nature of real-world data, including streaming updates, temporal variations, and functional tensors with continuous indices.
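To make the CP form concrete, the sketch below reconstructs a 3-way tensor from rank-R CP factors, i.e. as a sum of R outer products with one factor vector per mode. The factor values here are purely illustrative, not taken from any dataset in this dissertation.

```python
# Minimal sketch of the CP (CANDECOMP/PARAFAC) form: a 3-way tensor is
# approximated as T[i][j][k] = sum_r a_r[i] * b_r[j] * c_r[k].
# All factor values below are made-up illustrative numbers.

def cp_reconstruct(factors_a, factors_b, factors_c):
    """Rebuild a 3-way tensor from rank-R CP factors (lists of vectors)."""
    I, J, K = len(factors_a[0]), len(factors_b[0]), len(factors_c[0])
    T = [[[0.0] * K for _ in range(J)] for _ in range(I)]
    # Accumulate one rank-1 outer-product term per component r.
    for a, b, c in zip(factors_a, factors_b, factors_c):
        for i in range(I):
            for j in range(J):
                for k in range(K):
                    T[i][j][k] += a[i] * b[j] * c[k]
    return T

# Rank-2 example: two components, tensor of shape 2 x 3 x 2.
A = [[1.0, 2.0], [0.5, 1.0]]            # mode-1 factor vectors a_1, a_2
B = [[1.0, 0.0, 1.0], [2.0, 1.0, 0.0]]  # mode-2 factor vectors b_1, b_2
C = [[1.0, 3.0], [1.0, 1.0]]            # mode-3 factor vectors c_1, c_2
T = cp_reconstruct(A, B, C)
# Each entry is the sum over components of the matching factor products,
# e.g. T[0][0][0] = 1*1*1 + 0.5*2*1 = 2.0
```

Tucker decomposition generalizes this form by coupling the modes through a dense core tensor instead of the implicit diagonal core used by CP.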
This dissertation presents a comprehensive suite of advances in Bayesian tensor learning that address these challenges across three forms of dynamic tensor data: streaming tensors, temporal tensors, and functional tensors. For streaming tensor data, we introduce the Bayesian Streaming Sparse Tucker Decomposition (BASS) and the Streaming Bayesian Deep Tensor Factorization (SBDT), both of which provide efficient and scalable solutions for streaming tensor analysis with sparsity mechanisms. For temporal tensors, we present a novel temporal Tucker model, Bayesian Continuous-Time Tucker Decomposition (BCTT), and Streaming Factor Trajectory Learning for Temporal Tensor Decomposition (SFTL), an efficient temporal tensor learning method that dynamically captures evolving temporal factors. Further extending the scope to functional tensors, the Functional Bayesian Tucker Decomposition for Continuous-indexed Tensors (FunBAT) adapts tensor decomposition to continuous domains, enabling seamless application to data with continuous indices.
Throughout this dissertation, we show how each contribution underpins a robust and scalable Bayesian framework, with significant practical implications for handling the complexities of dynamic data across diverse applications.