Breast cancer remains a leading cause of mortality among women, and ultrasound imaging is a widely used technique for detecting breast abnormalities. In this paper, we introduce a novel two-stage neural network model to classify breast cancer in ultrasound images. In the first stage, a fully convolutional network (FCN) performs image segmentation, learning to predict segmentation masks that delineate tumor regions in the breast ultrasound images. In the second stage, a convolutional neural network (CNN) classifies the tumor type, leveraging both the tumor masks generated by the first stage and the original ultrasound images. Results demonstrate the added value of the two-stage approach: our proposed model achieves a classification accuracy of 92.41%, consistently surpassing baseline models that rely solely on CNNs for breast ultrasound image classification.
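To make the two-stage design concrete, the sketch below shows one way such a pipeline could be wired up in PyTorch. It is an illustrative approximation under stated assumptions, not the authors' implementation: the layer sizes, the `SimpleFCN` and `MaskGuidedCNN` module names, and the choice to fuse the mask with the image by channel concatenation are all hypothetical.

```python
# Minimal sketch (assumed, not the paper's code): stage 1 predicts a tumor mask,
# stage 2 classifies the tumor from the image concatenated with that mask.
import torch
import torch.nn as nn

class SimpleFCN(nn.Module):
    """Stage 1 (assumed): fully convolutional network predicting a per-pixel tumor mask."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.head = nn.Conv2d(32, 1, kernel_size=1)  # per-pixel mask logits

    def forward(self, x):
        return torch.sigmoid(self.head(self.encoder(x)))  # mask probabilities in [0, 1]

class MaskGuidedCNN(nn.Module):
    """Stage 2 (assumed): CNN classifier over image + predicted-mask channels."""
    def __init__(self, num_classes=2):  # number of tumor classes is an assumption
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, image, mask):
        x = torch.cat([image, mask], dim=1)  # fuse grayscale image with stage-1 mask
        return self.classifier(self.features(x).flatten(1))

# Usage: run stage 1, then feed its mask alongside the original image to stage 2.
fcn, cnn = SimpleFCN(), MaskGuidedCNN()
image = torch.randn(4, 1, 128, 128)  # batch of grayscale ultrasound images
mask = fcn(image)                    # stage-1 predicted tumor masks
logits = cnn(image, mask)            # stage-2 class logits
```

In this sketch the mask acts as an extra input channel that points the classifier at the tumor region; other fusion strategies (e.g., masking out background pixels) would fit the same two-stage structure.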