Throughout our professional history, genetic counselors (GCs) have adeptly adapted to new technologies and integrated them into the clinic for the benefit of patients. Most often, however, the new technology has been lab-based. Recent advances in artificial intelligence (AI) have led to the development of large language models (LLMs), like ChatGPT. These advancements bring new considerations for how GCs and the genetic counseling field approach the use and possible integration of another new technology, but this time in the clinical office setting. Genetic counseling could benefit from the integration of AI technology while also remaining aware of the potential negative consequences. These new capabilities could lead to improvements in patient care while also reducing burnout and promoting professional growth as patient advocates within the AI space.
The potential of LLMs to integrate into the healthcare setting has been shown by recent studies, given their ability to produce human-like text responses. One study found that AI chatbots, like ChatGPT, could provide high-quality and empathic responses to patient questions, even outperforming physician responses in several respects [1]. These findings raise the possibility that incorporating AI into genetic counseling could enhance the quality of information provided to patients, improve empathic responses, and support more personalized care. AI-powered apps are already in development to assist cancer patients [2] and to facilitate mental health support [3].
Utilizing LLMs in genetic counseling also has the potential to expand access to services. With just over 5,600 GCs [4] and only 51% working solely in direct patient care [5], GC resources are limited. AI chatbots may be an effective solution for meeting the increasing demand for genetic counseling services. More routine cases, such as basic results counseling, could potentially be handled by AI chatbots, allowing GCs to focus on more complex cases that require their expertise. Ultimately, this could allow more GCs to practice at the top of their scope, leading to greater career fulfillment. LLMs also allow output to be tailored by tone, language, and reading level, helping translate complex scientific concepts into responses best suited to each individual patient.
LLMs could also streamline GC workflows by automating routine tasks such as gathering patient history, assisting with scheduling, drafting notes, aiding in research, and numerous other tasks GCs perform throughout the day. The efficiency gained from AI would allow GCs to focus on the more intricate aspects of patient care and research. This reduced administrative load has the potential to decrease burnout and improve job satisfaction, supporting the overall well-being of GCs and the profession at large.
Although there are numerous benefits to utilizing AI technologies within our field, there are also negative consequences that need to be addressed, including possible job displacement. With any new technology, but especially with the rapid development of AI tools, there is a risk of job loss. By staying up-to-date on AI advancements, GCs and NSGC can identify and develop complementary skills to navigate the implications for our profession.
There are also ethical implications of utilizing AI in genetic counseling, especially regarding the bias and misinformation embedded in the datasets used to build LLMs [6]. GCs are well-positioned by our training and professional expertise to advocate that AI tools provide accurate and unbiased information. One possible initiative could involve GCs collaborating with AI developers and policymakers to improve algorithms, or using our expertise to help validate and correct AI-generated responses in the healthcare setting.
Addressing the vast implications of integrating AI technology into genetic counseling cannot rest on individual professionals alone. NSGC plays a vital role by providing guidance, education, support, and resources to help GCs navigate these changes. NSGC should incorporate AI-focused programs into continuing education opportunities, providing a space for GCs to stay current on the latest AI technologies and understand their applications, limitations, and potential biases.
NSGC should also establish a dedicated task force to fully explore the positive and negative impacts of AI in genetic counseling. The task force, in collaboration with experts in the AI community, could foster communication that influences the development of more accurate, ethical, and effective AI tools for genetic counseling. There is also a need to develop recommendations for ethical AI policies and guidelines that protect patient privacy and encourage responsible use of AI in genetic counseling. By leading this initiative, the NSGC task force can advocate for patients and the genetic counseling profession.
We are at a pivotal moment, with technology advancing in ways we may not have imagined for our profession. We have a responsibility to our profession and our patients not just to stay up-to-date on these advancements but to act. This technology is advancing, and we can advance with it, providing our expertise in advocacy and technology to ensure its integration serves the best interests of our patients and our profession. Otherwise, we risk missing this opportunity and falling behind. In the worst case, AI would be integrated into our field in ways that are unethical and could even lead to patient harm. NSGC, GCs, AI developers, and policymakers can work together to ensure the AI solutions that are ultimately developed serve all of our patients, and serve them equitably.
1. Ayers, J.W., et al. Comparing Physician and Artificial Intelligence Chatbot Responses to Patient Questions Posted to a Public Social Media Forum. JAMA Internal Medicine, 2023.
2. Belong: Beating Cancer Together. [cited 2023 May 31]. Available from: https://cancer.belong.life/.
3. Wysa - Everyday Mental Health. [cited 2023 May 31]. Available from: https://www.wysa.com/.
4. American Board of Genetic Counseling (ABGC). [cited 2023 May 31]. Available from: http://www.abgc.net.
5. NSGC Professional Status Survey. [cited 2023 May 31]. Available from: https://www.nsgc.org/Policy-Research-and-Publications/Professional-Status-Survey.
6. Luccioni, S., Mitchell, M., von Werra, L., Kiela, D. Evaluating Language Model Bias with Evaluate. 2022.
Vanessa Nitibhon, MS, CGC is a Sarah Lawrence College graduate based in Portland, OR. She brings over a decade of experience from her work as a clinical genetic counselor at OHSU and New York Presbyterian/Weill-Cornell. Vanessa currently holds the position of Senior MSL with Labcorp. A believer in continual learning, Vanessa stays up-to-date on the latest advancements in the field and aims to support an environment of collaboration, innovation, and growth.