Organoids have great potential as ex vivo disease models for drug discovery and personalized drug screening. Accurate segmentation of individual organoids can provide fundamental indicators of drug response, such as morphology, number, and size. However, for organoid microscopy images, existing methods cannot automatically and accurately segment organoids due to challenges such as strong adhesion, high background noise, and blurred boundaries. To bridge this gap, we propose a novel unified scheme (PTMBNet), driven by positionable texture and multi-level boundary cues, for accurate organoid segmentation. In particular, based on a learnable texture quantification method, we introduce a Texture Feature Extraction Module (TFM) to capture enhanced texture quantification information and a Texture Positioning Module (TPM) to localize segmentation targets under high background noise. Subsequently, we design a Multi-level Boundary Feature Extraction Module (MBFM) to extract multi-dimensional semantics associated with organoid boundaries. A specially crafted Boundary Restraint Module (BRM) is then leveraged to seamlessly extend the positional boundary features to the global context and refine the organoid boundaries. Furthermore, we present a Boundary-Texture Consistency loss (BTC) that jointly supervises boundary prediction and texture segmentation outcomes. As part of this study, we manually annotate a substantial, high-quality dataset of lung cancer organoid (LCO) microscopy images. Compared with state-of-the-art methods, the proposed PTMBNet achieves superior segmentation results on the LCO dataset, improving Dice by 3.4% and IoU by 4.9%.