Aspect-Based Sentiment Analysis (ABSA) is a fine-grained sentiment analysis task that aims to predict the sentiment polarity towards a specific aspect occurring in a given sentence. Recently, pre-trained language models such as BERT have brought great progress to this task. However, due to the mismatch between pre-training and fine-tuning, such models still struggle with informal expressions and complex sentences, and this problem deserves further effort. To tackle it, in this paper we propose a Prompt-oriented Fine-tuning Dual BERT (PFDualBERT) model that considers complex semantic relevance and scarce data samples simultaneously. To reduce the impact of the pre-training/fine-tuning mismatch, we design a ProBERT module inspired by the idea of prompt learning. In addition, we design a SemBERT module to capture semantic correlations, and we refit SemBERT with aspect-based self-attention. Experimental results on three datasets show that our PFDualBERT model outperforms state-of-the-art methods, and further analysis substantiates that our model exhibits stable performance in low-resource settings.
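To make the idea of aspect-based self-attention concrete, the following is a minimal sketch in which an aspect vector acts as the attention query over the sentence's token representations, so the pooled sentence vector emphasizes tokens related to the aspect. All names, shapes, and the single-head dot-product formulation are illustrative assumptions, not the exact PFDualBERT implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - x.max())
    return e / e.sum()

def aspect_attention(tokens, aspect):
    """Aspect-conditioned attention (illustrative sketch).

    tokens: (seq_len, d) token representations, e.g. BERT hidden states
    aspect: (d,) representation of the aspect term
    Returns an aspect-aware sentence vector and the attention weights.
    """
    d = tokens.shape[-1]
    scores = tokens @ aspect / np.sqrt(d)   # (seq_len,) scaled dot products
    weights = softmax(scores)               # distribution over tokens
    return weights @ tokens, weights        # weighted sum of token vectors

# Toy usage with random vectors standing in for encoder outputs.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(5, 8))   # 5 tokens, hidden size 8
aspect = rng.normal(size=8)        # aspect representation
sentence_vec, weights = aspect_attention(tokens, aspect)
```

The pooled `sentence_vec` could then feed a sentiment classifier head; tokens whose representations align with the aspect receive larger weights.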