Multimodal Fusion of Brain Networks with Longitudinal Couplings

Wen Zhang, Kai Shu, Suhang Wang, Huan Liu, Yalin Wang


Abstract

In recent years, brain network analysis has attracted considerable interest in the field of neuroimaging. It plays a vital role in understanding the fundamental biological mechanisms of the human brain. With the upward trend toward multi-source neuroimaging data collection, effective learning from different types of data sources, e.g., multimodal and longitudinal data, is much in demand. In this paper, we propose a general coupling framework, multimodal neuroimaging network fusion with longitudinal couplings (MMLC), to learn the latent representations of brain networks. Specifically, we jointly factorize multimodal networks, assuming a linear relationship that couples network variance across time. Experimental results on two large datasets demonstrate the effectiveness of the proposed framework. The new approach integrates information from longitudinal, multimodal neuroimaging data and boosts statistical power to predict psychometric evaluation measures.
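To make the abstract's core idea concrete, the following is a minimal sketch (not the authors' MMLC implementation) of joint factorization with a longitudinal coupling: two modality networks per time point share one latent basis H_t, and a linear map W couples the latents across time points. All symbols here (H_t, W, lam, the synthetic data, and the plain gradient-descent solver) are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Illustrative sketch of multimodal fusion with a longitudinal coupling.
# Objective (an assumption for illustration, not the paper's exact model):
#   sum_{m,t} ||A[m,t] - H[t] H[t]^T||_F^2  +  lam * ||H[1] - H[0] W||_F^2
# where A[m,t] is the network of modality m at time t, H[t] is the shared
# latent representation at time t, and W is a linear coupling across time.

rng = np.random.default_rng(0)
n, k = 20, 4        # number of brain regions (nodes), latent dimension
lam = 1.0           # longitudinal coupling strength

def sym(a):
    """Symmetrize a matrix, mimicking an undirected brain network."""
    return (a + a.T) / 2

# Synthetic symmetric "networks": two modalities (m) x two time points (t)
A = {(m, t): sym(rng.random((n, n))) for m in (0, 1) for t in (0, 1)}

H = {t: rng.random((n, k)) for t in (0, 1)}  # latent factors per time point
W = np.eye(k)                                # linear longitudinal coupling

def loss():
    fit = sum(np.linalg.norm(A[m, t] - H[t] @ H[t].T) ** 2
              for m in (0, 1) for t in (0, 1))
    couple = lam * np.linalg.norm(H[1] - H[0] @ W) ** 2
    return fit + couple

init = loss()
lr = 5e-5
for _ in range(300):
    # Gradients of the objective w.r.t. H[0], H[1], and W
    r = H[1] - H[0] @ W
    g0 = sum(-4 * (A[m, 0] - H[0] @ H[0].T) @ H[0] for m in (0, 1)) \
         - 2 * lam * r @ W.T
    g1 = sum(-4 * (A[m, 1] - H[1] @ H[1].T) @ H[1] for m in (0, 1)) \
         + 2 * lam * r
    gW = -2 * lam * H[0].T @ r
    H[0] -= lr * g0
    H[1] -= lr * g1
    W -= lr * gW

final = loss()
```

After a few hundred steps the joint objective decreases, and the rows of H[t] can serve as fused node-level representations for downstream prediction, which is the role the latent representations play in the paper.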

