Abstract
For the past several months, there has been increased interest in the possibility of using artificially intelligent (AI) chatbots to help craft educational, clinical and scientific documents, following the November 2022 release of the publicly accessible platform ChatGPT. ChatGPT is a large language model developed by OpenAI that uses machine learning techniques to generate human-like text. It is based on the GPT (Generative Pre-trained Transformer) architecture, which uses deep learning to analyse and understand natural language text. ChatGPT is certainly not the first chatbot supporting healthcare professionals. Xu et al. (2021), in their review of chatbots used in cancer care, identified 78 chatbots that were applied to support diagnosis, treatment, monitoring, workflow planning and health promotion. However, unlike previous chatbots that had limited capabilities and specific uses, ChatGPT has been trained on a diverse range of texts including web pages, books and conversational data. Consequently, it is highly versatile and can be used in a wide range of areas, including customer support, information retrieval, education and health care (OpenAI, 2022). We decided to examine the platform by asking the AI platform itself (https://chat.openai.com/auth/login) the following questions related to its potential use in nursing practice: Describe the potential use of ChatGPT in nursing practice. Identify the benefits and the limitations. What are the implications of using ChatGPT? The response is shown in Table 1. The generated automated response highlights several possible applications that could support nurses in their clinical practice. According to ChatGPT, it could reduce repetitive writing and administrative work, such as summarising long lists of patient information. It could provide case summaries or care plans identifying nursing interventions targeting specific patient needs.
It could enhance communication by providing conversation cues between nurses and patients, and it could generate instructions and recommendations that are easier to follow, jargon free and person/patient centred. Other potential applications include translating medical language into easier-to-understand text or translating information/instructions directly into the patients' native language. Moreover, complex instructions could be easily and efficiently simplified, thereby possibly increasing patient compliance and adherence. Obviously, these benefits make its use appealing. However, its use is not without limitations and risks. Compassionate and empathic communication is the basis of the nurse–patient relationship. It is possible that overreliance on these chatbots could lead to the deskilling of nurses. For example, providing prescriptive responses to nurse–patient conversations may make these interactions more impersonal and less therapeutic. Such a concern is supported by Parviainen and Rantala (2022), who argued that the use of AI chatbots to provide automated consultations and decision-making can have a profound influence on the nurse–patient relationship, especially regarding its effect on patients' trust. Thus, important questions to consider in relation to its use include: ‘Will patients' trust in nurses be eroded if they perceive that decisions are being taken by chatbots rather than by human beings?’ and ‘Will patients adhere to the recommendations suggested by nurses if they notice that the decisions are being supported by chatbots?’. Responses given by these chatbots may not be reliable or evidence based. In fact, the OpenAI website acknowledges that ChatGPT ‘may occasionally generate incorrect or misleading information and produce offensive or biased content. It is not intended to give advice’. However, as with other chatbots, there is a risk that technologically savvy nurses may use this tool without considering its limitations.
Doing so could increase the risk of giving inaccurate or biased information to patients or other staff. Since AI chatbots cannot take responsibility for their responses and cannot be held accountable for their actions or decisions, nurses will remain accountable for their clinical decisions, even if these have been made based on the chatbots' responses. Currently, the ChatGPT platform advises individuals and organisations not to include sensitive, confidential or identifiable information, as it is not able to ensure the confidentiality of the information shared with it. Additionally, the information processed by ChatGPT is stored temporarily on OpenAI's servers and is not guaranteed to be secure. Therefore, nurses need to take appropriate measures to protect any sensitive or confidential data related to the patient or the healthcare organisation, for example by using encryption or by not sharing such data online. Moreover, any application needs to conform to the country's data protection regulations. There is still limited evidence about the effectiveness of using chatbots to support clinical practice. Abd-Alrazaq et al. (2020), in their systematic review and meta-analysis of the effectiveness and safety of using chatbots to support interaction and communication and improve mental health, did not find conclusive evidence. A survey of physicians across the United States investigating their perceptions of healthcare chatbots found that they were ambivalent about the risks and benefits (Palanica et al., 2019). Moreover, chatbots' inability to capture the emotional state of patients and to draw on experience and clinical judgement in decision-making was a major concern to the participants in this study. This lack of clarity about the potential benefits, together with the highlighted risks, is a significant reason why healthcare professionals, including nurses, need to be cautious in adopting such technology in clinical practice.
Nurses should also be wary of potentially misleading narratives and exaggerated claims (e.g. ‘revolutionary’ or ‘game changer’) made about such technologies, and should evaluate these claims critically (Walker, 2022). Importantly, whilst chatbots may improve nurses' efficiency, nurses must recognise that they are just a tool: they cannot replace what nurses do best, providing patients with a ‘human touch’ and a ‘therapeutic environment’ through their presence and emotional connection. Finally, how well this innovative technology can be integrated into existing healthcare systems, such as electronic health records, remains to be seen, as interoperability between existing health record systems and emerging technologies remains limited. How well the nursing profession adopts chatbots depends on several factors. Nurses will need to perceive that chatbots such as ChatGPT offer a relative advantage over conventional practices, are compatible with their practice, simple to use, testable and, above all, capable of providing tangible results. Moreover, nurses' familiarity and ease with other forms of technology will influence the uptake of chatbots in their practice. Chatbots like ChatGPT are currently limited and require further development before they can effectively support the planning and delivery of nursing practice. However, it is expected that more of these generative, large language models will find their way into daily applications and become useful in different contexts, including nursing practice. The fact that over a million users experimented with the ChatGPT platform within five days of its launch (OpenAI, 2022) indicates that this innovation is of significant public interest and that its use is growing rapidly.
Interestingly, a few weeks after ChatGPT was released, Google, together with DeepMind, announced the release of Med-PaLM, a medical chatbot: a similar large language model, trained on medical and research datasets, designed to answer medical questions and support clinical decision-making. Moreover, Google has also announced that it will soon release a ChatGPT rival named Bard. Only time will tell how such innovative applications (or significantly better versions) can successfully support nursing practice. Whilst there has been discussion about the possibilities and risks of using AI chatbots in nursing education, there has been limited debate about their applicability to nursing practice. This editorial is one attempt to alert nurses to their potential benefits, limitations and risks. Of critical importance is that nurses are at the table when decisions are made about such technologies, and that nurses acknowledge the crucial role their presence plays in the provision of safe and effective care, irrespective of the technologies available to assist them.

All authors contributed to the design and/or conduct of the manuscript. All authors approved the current version of the manuscript. None. The authors have no conflicts to declare. No funding was received for this project. No primary data collected.