Factors Predicting Intentions of Adoption and Continued Use of Artificial Intelligence Chatbots for Mental Health: Examining the Role of UTAUT Model, Stigma, Privacy Concerns, and Artificial Intelligence Hesitancy
Publication: Telemedicine and e-Health
Volume 30, Issue 3
Abstract
Background: Artificial intelligence-based chatbots (AI chatbots) have the potential to improve mental health care, yet the factors predicting their adoption and continued use remain unclear.
Methods: We conducted an online survey of U.S. adults with symptoms of depression and anxiety (N = 393) in 2021, before the release of ChatGPT. We examined factors predicting the adoption and continued use of AI chatbots, including the constructs of the unified theory of acceptance and use of technology (UTAUT) model, stigma, privacy concerns, and AI hesitancy.
Results: Regression analyses indicated that among nonusers, performance expectancy, price value, descriptive norm, and psychological distress were positively associated with the intention to adopt AI chatbots, whereas AI hesitancy and effort expectancy were negatively associated with that intention. Among respondents with experience using AI chatbots for mental health, performance expectancy, price value, descriptive norm, and injunctive norm were positively associated with the intention to continue using AI chatbots.
Conclusions: Understanding the adoption and continued use of AI chatbots among adults with symptoms of depression and anxiety is essential given the widening gap between the supply of and demand for mental health care. AI chatbots create new opportunities for quality care by supporting accessible, affordable, efficient, and personalized services. This study provides insights for developing and deploying AI chatbots such as ChatGPT in mental health care. The findings could inform innovative interventions that encourage the adoption and continued use of AI chatbots among people who have symptoms of depression and anxiety and who have difficulty accessing care.
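The Results report the direction of each regression predictor but not the underlying analysis. As a minimal sketch only, assuming a standard ordinary least squares specification, such a model could be fit in Python with statsmodels; the file name and all column names (adoption_intention, performance_expectancy, and so on) are hypothetical placeholders, not the authors' actual variables or code.

```python
# Minimal sketch of a multiple regression predicting nonusers' intention to
# adopt AI chatbots from UTAUT constructs, AI hesitancy, and psychological
# distress. The file name and all column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

# One row per nonuser respondent; each predictor is a scale mean score.
df = pd.read_csv("survey_nonusers.csv")

model = smf.ols(
    "adoption_intention ~ performance_expectancy + effort_expectancy"
    " + price_value + descriptive_norm + injunctive_norm"
    " + ai_hesitancy + psychological_distress",
    data=df,
).fit()

# Coefficient signs in the fitted model would correspond to the positive and
# negative associations reported in the Results.
print(model.summary())
```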
Information & Authors
Copyright
Copyright 2024, Mary Ann Liebert, Inc., publishers.
History
Published online: 6 March 2024
Published in print: March 2024
Published ahead of print: 27 September 2023
Accepted: 21 August 2023
Revision received: 11 August 2023
Received: 15 June 2023
Topics
- Anxiety disorder
- Artificial intelligence
- Automation
- Cyberpsychology and research
- Cyberpsychology and social networking
- Digital media and internet
- e-health and telehealth care
- Healthcare resources
- Medicine, Surgery & Diagnosis
- Mental disorders
- Nervous system diseases
- Public health care
- Public Health Research and Practice
- Technology, engineering, and computational biology
Authors' Contributions
L.L. was involved in conceptualization, methodology, formal analysis, investigation, writing—original draft, writing—review and editing, project administration, and funding acquisition. W.P. was involved in conceptualization, methodology, writing—original draft, writing—review and editing, and supervision. M.R. was involved in conceptualization, methodology, formal analysis, writing—review and editing, and funding acquisition.
Disclosure Statement
No competing financial interests exist.
Funding Information
This study was funded by the Charles J. Strosacker Foundation Research Fund for Health and Risk Communication.