Conversational Agents in Palliative Care: Potential Benefits, Risks, and Next Steps
Publication: Journal of Palliative Medicine
Volume 27, Issue Number 3
Abstract
Conversational agents (sometimes called chatbots) are technology-based systems that use artificial intelligence to simulate human-to-human conversations. Research on conversational agents in health care is nascent but growing, with recent reviews highlighting the need for more robust evaluations in diverse settings and populations. In this article, we consider how conversational agents might function in palliative care—not by replacing clinicians, but by interacting with patients around select uncomplicated needs while facilitating more targeted and appropriate referrals to specialty palliative care services. We describe potential roles for conversational agents aligned with the core domains of quality palliative care and identify risks that must be considered and addressed in the development and use of these systems for people with serious illness. With careful consideration of risks and benefits, conversational agents represent promising tools that should be explored as one component of a multipronged approach for improving patient and family outcomes in serious illness.
Information & Authors
Information
Published In
Journal of Palliative Medicine
Volume 27 • Issue Number 3 • March 2024
Pages: 296 - 300
PubMed: 38215235
Copyright
Copyright 2024, Mary Ann Liebert, Inc., publishers.
History
Published in print: March 2024
Published online: 28 February 2024
Published ahead of print: 12 January 2024
Accepted: 19 December 2023
Topics
Author Disclosure Statement
No competing financial interests exist.
Metrics & Citations
Metrics
Citations
Export Citation
Export citation
Select the format you want to export the citations of this publication.
View Options
Get Access
Access content
To read the fulltext, please use one of the options below to sign in or purchase access.⚠ Society Access
If you are a member of a society that has access to this content please log in via your society website and then return to this publication.