Pre-trained language models (LMs) have been widely used in sentiment analysis, and recent works that inject sentiment knowledge from sentiment lexicons or structured commonsense knowledge from knowledge graphs (KGs) into pre-trained LMs have achieved remarkable success. However, these works typically obtain knowledge from a single source, either a sentiment lexicon or a KG, and perform only shallow fusion of LM representations with external knowledge representations. How to extract knowledge from multiple external sources and fully integrate it with LM representations therefore remains an open problem. In this paper, we propose a novel knowledge-enhanced model for sentiment analysis (KSA), which simultaneously incorporates commonsense and sentiment knowledge as external knowledge by constructing a heterogeneous Commonsense-Senti Knowledge Graph. In addition, a global token and a global node are added to the text sequence and to the constructed knowledge graph, respectively, and a fusion unit enables global information interaction between the two modalities, allowing each to perceive the other's information and thereby improving sentiment analysis. Experiments on standard datasets show that the proposed KSA significantly outperforms strong pre-trained baselines and achieves new state-of-the-art results on most of the test datasets.
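To make the fusion-unit idea concrete, below is a minimal PyTorch-style sketch of exchanging information between a text-side global token and a graph-side global node. The module name `FusionUnit`, the hidden size, and the simple concatenate-and-project update are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class FusionUnit(nn.Module):
    """Lets the text-side global token and the graph-side global node
    perceive each other's information (illustrative sketch only)."""
    def __init__(self, hidden_size: int = 768):
        super().__init__()
        # Project the concatenated global vectors back to hidden_size.
        self.text_update = nn.Linear(2 * hidden_size, hidden_size)
        self.graph_update = nn.Linear(2 * hidden_size, hidden_size)
        self.act = nn.Tanh()

    def forward(self, global_token: torch.Tensor, global_node: torch.Tensor):
        # global_token: (batch, hidden) summary of the text sequence (LM side)
        # global_node:  (batch, hidden) summary of the Commonsense-Senti KG side
        joint = torch.cat([global_token, global_node], dim=-1)
        new_token = self.act(self.text_update(joint))   # text side now sees graph info
        new_node = self.act(self.graph_update(joint))    # graph side now sees text info
        return new_token, new_node

# Usage sketch: fuse the two global summaries, then classify sentiment.
fusion = FusionUnit(hidden_size=768)
g_tok = torch.randn(4, 768)   # e.g., the LM's added global token representation
g_node = torch.randn(4, 768)  # e.g., the added global node's representation
fused_tok, fused_node = fusion(g_tok, g_node)
logits = nn.Linear(2 * 768, 3)(torch.cat([fused_tok, fused_node], dim=-1))
```

In this sketch the fusion happens at the level of the two global summaries only, which is one simple way to realize "global information interaction" between modalities; the paper's actual fusion mechanism may differ.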