In this paper, we utilize the edge-computing capability of the Jetson Nano J1010 development board, together with MediaPipe and an acupoint-mapping algorithm, to display acupoints on screen. Using a custom deep neural network (DNN) hand-recognition model, we achieve augmented reality (AR) interaction with acupoints: through gestures, users can view the locations of acupoints on the body along with their descriptions. We also develop a mobile app that communicates with the system via Bluetooth; users select the symptoms they wish to understand, and the system displays acupoints that can alleviate those symptoms. In addition, we build an AI-based diagnostic system on the open-source conversational AI framework Rasa, which uses natural language understanding (NLU) to determine the user's intent from input entered in the mobile app. When the intent concerns medical content, the message is passed to a BERT model for diagnostic analysis, which identifies possible diseases or symptoms and recommends corresponding acupoints for relief, helping users make a preliminary judgment about their own condition. All AI diagnostic information is transmitted to the server, where physicians can review user information through a website; by visualizing the data with charts, physicians can quickly grasp a user's condition, further saving medical resources. This acupoint-based healthcare system offers a safe, convenient, and efficient solution for health management, and is particularly valuable for individuals who lack medical experience and knowledge.
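To illustrate the acupoint-mapping step, the sketch below interpolates an acupoint position between two hand landmarks in the 21-landmark layout produced by MediaPipe Hands. The specific landmark indices and blend weight are illustrative assumptions, not the paper's actual mapping algorithm:

```python
def estimate_acupoint(landmarks, a, b, alpha=0.5):
    """Interpolate an acupoint between two hand landmarks.

    landmarks: mapping of landmark index -> (x, y) normalized coordinates,
    in the layout produced by MediaPipe Hands (21 landmarks per hand).
    a, b: landmark indices to interpolate between; alpha: blend weight.
    """
    xa, ya = landmarks[a]
    xb, yb = landmarks[b]
    return (xa + alpha * (xb - xa), ya + alpha * (yb - ya))

# Hypothetical example: approximating LI4 (Hegu) between the thumb MCP
# (landmark 2) and the index-finger MCP (landmark 5).
hegu = estimate_acupoint({2: (0.2, 0.6), 5: (0.4, 0.4)}, 2, 5)
```

The returned normalized coordinates can then be scaled to the frame resolution and drawn as an AR overlay on the Jetson Nano's display.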
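The intent-routing flow described above (Rasa NLU detects the intent; medical content goes to the BERT model; its output is mapped to recommended acupoints) can be sketched as follows. The intent names, the symptom-to-acupoint table, and the `classify_symptom` callback standing in for the BERT model are all hypothetical placeholders:

```python
MEDICAL_INTENTS = {"report_symptom", "ask_remedy"}  # hypothetical intent names

ACUPOINT_MAP = {  # hypothetical symptom -> recommended-acupoint table
    "headache": ["LI4 (Hegu)", "GB20 (Fengchi)"],
    "insomnia": ["HT7 (Shenmen)"],
}

def route_message(intent, text, classify_symptom):
    """Route a user message: non-medical intents stay with the dialogue
    system, while medical intents are passed to a symptom classifier
    (standing in for the BERT model) whose output is mapped to acupoints."""
    if intent not in MEDICAL_INTENTS:
        return {"handled_by": "dialogue"}
    symptom = classify_symptom(text)
    return {"symptom": symptom, "acupoints": ACUPOINT_MAP.get(symptom, [])}
```

In the deployed system the classifier's result would also be forwarded to the server for the physician-facing dashboard.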