Acquiring large volumes of annotated medical data is impractical due to time, financial, and legal constraints. Consequently, few-shot medical image segmentation is emerging as a prominent research direction. Medical scenarios pose two major challenges: (1) intra-class variation caused by diversity between the support and query sets; (2) extreme inter-class imbalance resulting from background heterogeneity. Existing prototypical networks struggle to tackle these obstacles effectively. To this end, we propose a Dual Interspersion and Flexible Deployment (DIFD) model. Drawing inspiration from military interspersion tactics, we design a dual interspersion module that generates representative basis prototypes from the support features. These basis prototypes then interact deeply with the query features. Furthermore, we introduce a fusion factor to fuse and refine the basis prototypes. Finally, we seamlessly integrate and flexibly deploy the basis prototypes to facilitate correct matching between the query features and basis prototypes, thereby improving segmentation accuracy. Extensive experiments on three publicly available medical image datasets demonstrate that our model significantly outperforms state-of-the-art methods (2.78% higher Dice score on average across all datasets), achieving a new level of performance. The code is available at: https://github.com/zmcheng9/DIFD.
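For context, the prototypical-network baseline that the abstract refers to typically extracts a class prototype from support features via masked average pooling and matches it against query features by cosine similarity. The sketch below illustrates that standard baseline only, not the DIFD model itself; the function names, toy tensor shapes, and threshold are illustrative assumptions.

```python
import numpy as np

def masked_average_prototype(feat, mask):
    """Masked average pooling: mean of foreground feature vectors.
    feat: (C, H, W) support feature map; mask: (H, W) binary foreground mask."""
    w = mask.astype(feat.dtype)
    denom = max(w.sum(), 1e-6)  # avoid division by zero for empty masks
    return (feat * w[None]).sum(axis=(1, 2)) / denom

def cosine_similarity_map(feat, proto):
    """Per-pixel cosine similarity between query features and a prototype.
    feat: (C, H, W) query feature map; proto: (C,) class prototype."""
    f = feat / (np.linalg.norm(feat, axis=0, keepdims=True) + 1e-6)
    p = proto / (np.linalg.norm(proto) + 1e-6)
    return np.einsum('chw,c->hw', f, p)

# Toy example: foreground pixels carry a distinct feature direction.
rng = np.random.default_rng(0)
C, H, W = 8, 4, 4
support = rng.standard_normal((C, H, W))
mask = np.zeros((H, W))
mask[:2, :2] = 1
support[:, :2, :2] += 5.0          # shift foreground features
proto = masked_average_prototype(support, mask)
query = support.copy()             # reuse the support map as a stand-in query
sim = cosine_similarity_map(query, proto)
pred = (sim > 0.5).astype(int)     # threshold similarity to segment the query
```

A single averaged prototype like this discards intra-class variation within the support foreground, which is precisely the limitation that motivates multi-prototype designs such as the basis prototypes described above.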