While deep learning methods have been successfully applied to a wide variety of prediction problems, their use has largely been limited to data structured in a grid-like fashion. The study of the human brain "connectome", however, represents the brain as a graph of interacting nodes. In this paper, we extend the Graph Attention Network (GAT), a novel neural network architecture operating on the node features of a binary graph, to handle a set of graphs endowed with both node features and non-binary edge weights. We demonstrate the effectiveness of our architecture by training it on multimodal data from a large, homogeneous fMRI dataset (n=1003 individuals with multiple fMRI sessions per subject) made publicly available by the Human Connectome Project (HCP), showing good predictive performance and seamless integration of multimodal neuroimaging data. Our adaptation provides a powerful and flexible deep learning tool for integrating multimodal neuroimaging connectomics data in a predictive context.
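To make the extension concrete, the sketch below illustrates one straightforward way non-binary edge weights could modulate GAT-style attention, namely as a bias on the attention logits so that stronger connectome edges receive proportionally more attention. This is a minimal illustration under our own assumptions (class and argument names such as `WeightedGATLayer` and `edge_weights` are hypothetical); the paper's exact formulation may differ.

```python
# Minimal sketch (not necessarily the paper's exact formulation): a single
# GAT-style attention layer in which a non-binary edge-weight matrix
# (e.g., a thresholded functional-connectivity matrix) biases the attention
# scores that govern message passing between nodes.
import torch
import torch.nn as nn
import torch.nn.functional as F


class WeightedGATLayer(nn.Module):
    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        self.W = nn.Linear(in_features, out_features, bias=False)  # node feature projection
        self.a_src = nn.Linear(out_features, 1, bias=False)        # attention term for source node
        self.a_dst = nn.Linear(out_features, 1, bias=False)        # attention term for target node

    def forward(self, x: torch.Tensor, edge_weights: torch.Tensor) -> torch.Tensor:
        # x: (N, in_features) node features; edge_weights: (N, N) non-negative
        # weights, with 0 meaning "no edge".
        # Ensure every node attends at least to itself (self-loop with weight 1).
        edge_weights = edge_weights + torch.eye(edge_weights.size(0), device=edge_weights.device)
        h = self.W(x)                                               # (N, out_features)
        scores = self.a_src(h) + self.a_dst(h).T                    # (N, N) pairwise attention logits
        scores = F.leaky_relu(scores, negative_slope=0.2)
        # Bias logits by the (log) edge weight so stronger connections receive
        # proportionally more attention, then mask out non-edges entirely.
        mask = edge_weights > 0
        scores = scores + torch.log(edge_weights.clamp(min=1e-12))
        scores = scores.masked_fill(~mask, float("-inf"))
        alpha = torch.softmax(scores, dim=1)                        # attention coefficients per target node
        return F.elu(alpha @ h)                                     # aggregated, transformed node features


if __name__ == "__main__":
    N, F_in, F_out = 6, 4, 8
    x = torch.randn(N, F_in)                                        # random node features
    w = torch.rand(N, N) * (torch.rand(N, N) > 0.5)                 # random sparse weighted graph
    out = WeightedGATLayer(F_in, F_out)(x, w)
    print(out.shape)                                                # torch.Size([6, 8])
```

In this sketch the weights enter only through the softmax bias, which reduces to the standard binary GAT when all edge weights equal one; other choices, such as multiplying the attention coefficients directly, are equally plausible readings of the abstract.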