Sohail Verma¹,
Dr Pretty Bhalla²,
Simran Monga³
- ¹Assistant Professor, Department of Management, Guru Kashi University, Punjab, India
- ²Professor, Department of Management, Lovely Professional University, Punjab, India
- ³Student, Department of Management, Guru Kashi University, Punjab, India
Abstract
Artificial intelligence (AI) has rapidly evolved into a formidable instrument within mental healthcare, fundamentally altering how awareness, diagnosis, intervention and emotional regulation are understood. This narrative review explores AI’s potential to foster positive mental health through tools such as natural language processing, machine learning, deep learning and computer vision. These technologies promise earlier detection of mental disorders, customized treatment plans and responsive emotional support. Yet alongside these possibilities arise profound concerns: issues of data integrity, algorithmic bias, ethical ambiguity and cultural blindness remain unresolved. At the core of this discussion lies a fundamental principle: AI must serve as a collaborator with, not a replacement for, human judgment. The review emphasizes the indispensable role of transparency, inclusivity in training data and ethical stewardship in deploying AI responsibly. Going forward, research must prioritize the refinement of interpretability, the elimination of systemic bias and the integration of diverse cultural frameworks to realize AI’s potential in advancing global mental healthcare.
Keywords: artificial intelligence, mental healthcare, digital mental health, ethical AI, AI in healthcare
[This article belongs to Current Trends in Signal Processing]
Sohail Verma, Dr Pretty Bhalla, Simran Monga. Digital Psychiatry: A Narrative Review on AI Positive Role in Mental Health. Current Trends in Signal Processing. 2025; 15(03):1-13. Available from: https://journals.stmjournals.com/ctsp/article=2025/view=215314
References
- G. Solomonoff, “Ray Solomonoff and the Dartmouth summer research project in artificial intelligence (1956),” unpublished manuscript, 2017.
- J. Kaplan, Artificial Intelligence: What Everyone Needs to Know. New York, NY: Oxford University Press, 2016.
- J. McCarthy, “Making robots conscious of their mental states,” presented at Machine Intelligence 15 Workshop, Oxford University, UK, 1995.
- W. Ertel, N. Black and F. Mast, Introduction to Artificial Intelligence. Cham, Switzerland: Springer, 2017.
- P. McCorduck, Machines Who Think: A Personal Inquiry into the History and Prospects of Artificial Intelligence. Boca Raton, FL: CRC Press, 2004.
- P. Wang, “On defining artificial intelligence,” J. Artif. Gen. Intell., vol. 10, no. 2, pp. 1-37, 2019, doi: 10.2478/jagi-2019-0002.
- R. Anyoha, “The history of artificial intelligence,” Science in the News, Harvard University, 2017. [Online]. Available: https://sitn.hms.harvard.edu/flash/2017/history-artificial-intelligence. [Accessed: 24-Sep-2023].
- J. Moor, The Turing Test: The Elusive Standard of Artificial Intelligence, vol. 30. Norwell, MA: Springer Science & Business Media, 2003.
- J. McCarthy, “Programs with common sense,” Mechanization of Thought Processes, vol. I. London: Her Majesty’s Stationery Office, UK, 1959.
- A. Newell, J. C. Shaw and H. A. Simon, “Elements of a theory of human problem solving,” Psychol. Rev., vol. 65, pp. 151-166, 1958, doi: 10.1037/h0048495.
- E. A. Feigenbaum and P. McCorduck, The Fifth Generation. New York: Addison-Wesley, 1983.
- S. J. Russell and P. Norvig, Artificial Intelligence: A Modern Approach. London: Pearson Education Limited, 2016.
- B. LeCun, T. Mautor, F. Quessette and M. A. Weisser, “Bin packing with fragmentable items: Presentation and approximations,” Theor. Comput. Sci., vol. 602, pp. 50-59, 2015, doi: 10.1016/j.tcs.2015.08.005.
- I. Goodfellow, Y. Bengio and A. Courville, Deep Learning. Cambridge, MA: MIT Press, 2016.
- S. Kelly, S. Kaye and O. Trespalacios, “A multi-industry analysis of the future use of AI chatbots,” Hum. Behav. Emerg. Technol., vol. 2022, p. 14, 2022, doi: 10.1155/2022/2552099.
- C. Dirican, “The impacts of robotics, artificial intelligence on business and economics,” Procedia Soc. Behav. Sci., vol. 195, pp. 564-573, 2015, doi: 10.1016/j.sbspro.2015.06.134.
- A. Rajkomar et al., “Scalable and accurate deep learning with electronic health records,” NPJ Digit. Med., vol. 1, p. 18, 2018, doi: 10.1038/s41746-018-0029-1.
- F. Mumali, “Artificial neural network-based decision support systems in manufacturing processes: A systematic literature review,” Comput. Ind. Eng., vol. 165, p. 107964, 2022, doi: 10.1016/j.cie.2022.107964.
- G. Litjens, B. B. Kausika, E. Worrell and W. van Sark, “Spatial analysis of residential combined photovoltaic and battery potential: Case study Utrecht, The Netherlands,” in IEEE 44th Photovoltaic Specialist Conference (PVSC), 2017, pp. 3014-3019.
- T. H. Davenport, “Artificial intelligence for the real world,” Harvard Bus. Rev., 2018. [Online]. Available: https://hbr.org/webinar/2018/02/artificial-intelligence-for-the-real-world. [Accessed: 02-Oct-2023].
- M. Mitchell, Artificial Intelligence: A Guide for Thinking Humans. UK: Penguin Random House, 2019.
- D. Becker, “Possibilities to improve online mental health treatment: Recommendations for future research and developments,” in Future Inf. Commun. Conf., Singapore: Springer, 2018.
- B. Chen, R. Kwiatkowski, C. Vondrick and H. Lipson, “Full-body visual self-modeling of robot morphologies,” Sci. Robot., vol. 7, no. 68, p. eabn1944, 2022, doi: 10.1126/scirobotics.abn1944.
- Accenture, “From AI compliance to competitive advantage,” 2022. [Online]. Available: https://www.accenture.com/us-en/insights/artificialintelligence/ai-compliance-competitive-advantage. [Accessed: 21-Dec-2023].
- S. Graham et al., “Artificial intelligence for mental health and mental illnesses: An overview,” Curr. Psychiatry Rep., vol. 21, pp. 1-18, 2019, doi: 10.1007/s11920-019-1094-0.
- D. Raphael-Rene and J. Name, “Artificial intelligence for competitive advantage: Grata,” Grata Software, 2023. [Online]. Available: https://www.gratasoftware.com/artificial-intelligence-for-competitive-advantage-insights-for-business-leaders. [Accessed: 21-Dec-2023].
- F. Fabris, J. P. D. Magalhães and A. A. Freitas, “A review of supervised machine learning applied to ageing research,” Biogerontology, vol. 18, pp. 171-188, 2017, doi: 10.1007/s10522-017-9683-y.
- D. Bzdok, M. Krzywinski and N. Altman, “Machine learning: Supervised methods,” Nat. Methods, vol. 15, p. 5, 2018, doi: 10.1038/nmeth.4551.
- R. Miotto, L. Li, B. A. Kidd and J. T. Dudley, “Deep patient: An unsupervised representation to predict the future of patients from the electronic health records,” Sci. Rep., vol. 6, p. 26094, 2016, doi: 10.1038/srep26094.
- Y. LeCun, Y. Bengio and G. Hinton, “Deep learning,” Nature, vol. 521, pp. 436-444, 2015, doi: 10.1038/nature14539.
- R. Miotto, F. Wang, S. Wang, X. Jiang and J. T. Dudley, “Deep learning for healthcare: Review, opportunities and challenges,” Brief Bioinformatics, vol. 19, pp. 1236-1246, 2018, doi: 10.1093/bib/bbx044.
- O. Faust, Y. Hagiwara, T. J. Hong, O. S. Lih and U. R. Acharya, “Deep learning for healthcare applications based on physiological signals: A review,” Comput. Methods Programs Biomed., vol. 161, pp. 1-13, 2018, doi: 10.1016/j.cmpb.2018.04.005.
- J. Hirschberg and C. D. Manning, “Advances in natural language processing,” Science, vol. 349, pp. 261-266, 2015, doi: 10.1126/science.aaa8685.
- O. Gottesman et al., “Guidelines for reinforcement learning in healthcare,” Nat. Med., vol. 25, pp. 16-18, 2019, doi: 10.1038/s41591-018-0310-5.
- S. M. Ahmed, S. Lohit, K. C. Peng, M. J. Jones and A. K. Roy-Chowdhury, “Cross-modal knowledge transfer without task-relevant source data,” Eur. Conf. Comput. Vis., Switzerland: Springer Nature, 2022, pp. 111-127.
- D. Mowery et al., “Understanding depressive symptoms and psychosocial stressors on Twitter: A corpus-based study,” J. Med. Internet Res., vol. 19, p. e6895, 2017, doi: 10.2196/jmir.6895.
- F. Minerva and A. Giubilini, “Is AI the future of mental healthcare?” Topoi, vol. 42, pp. 1-9, 2023, doi: 10.1007/s11245-023-09932-3.
- S. Tutun et al., “An AI-based decision support system for predicting mental health disorders,” Inf. Syst. Front., vol. 25, pp. 1261-1276, 2023, doi: 10.1007/s10796-022-10282-5.
- A. N. Vaidyam et al., “Chatbots and conversational agents in mental health: A review of the psychiatric landscape,” Can. J. Psychiatry, vol. 64, pp. 456-464, 2019, doi: 10.1177/0706743719828977.
- K. Denecke, A. Abd-Alrazaq and M. Househ, “Artificial intelligence for chatbots in mental health: Opportunities and challenges,” in Multiple Perspectives on Artificial Intelligence in Healthcare: Opportunities and Challenges, M. Househ, E. Borycki and A. Kushniruk, Eds., Cham, Switzerland: Springer International Publishing, 2021, pp. 115-128.
- S. Chaudhary et al., “Domain-specific cognitive impairment in Parkinson’s patients with mild cognitive impairment,” J. Clin. Neurosci., vol. 75, pp. 99-105, 2020, doi: 10.1016/j.jocn.2020.03.015.
- A. R. Javed et al., “Artificial intelligence for cognitive health assessment: State-of-the-art, open challenges and future directions,” Cognit. Comput., vol. 42, pp. 1-46, 2023, doi: 10.1007/s12559-023-10153-4.
- Z. Zheng, P. Zheng and X. Zou, “Peripheral blood S100B levels in autism spectrum disorder: A systematic review and meta-analysis,” J. Autism Dev. Disord., vol. 51, pp. 2569-2577, 2021, doi: 10.1007/s10803-020-04710-1.
- P. P. Michel, E. C. Hirsch and S. Hunot, “Understanding dopaminergic cell death pathways in Parkinson disease,” Neuron, vol. 90, pp. 675-691, 2016, doi: 10.1016/j.neuron.2016.03.038.
- S. Klöppel et al., “Separating symptomatic Alzheimer’s disease from depression based on structural MRI,” J. Alzheimer’s Dis., vol. 63, pp. 353-363, 2018, doi: 10.3233/JAD-170964.
- A. H. Shoeb, “Application of machine learning to epileptic seizure onset detection and treatment,” Ph.D. dissertation, Massachusetts Institute of Technology, 2009.
- S. D’Mello, R. W. Picard and A. Graesser, “Toward an affect-sensitive AutoTutor,” IEEE Intell. Syst., vol. 22, pp. 53-61, 2007, doi: 10.1109/MIS.2007.79.
- A. McStay, “Empathic media and advertising: Industry, policy, legal and citizen perspectives (the case for intimacy),” Big Data Soc., vol. 3, pp. 1-11, 2016, doi: 10.1177/2053951716666868.
- A. McStay, “Emotional AI, soft biometrics and the surveillance of emotional life: An unusual consensus on privacy,” Big Data Soc., vol. 7, no. 1, 2020, doi: 10.1177/2053951720904386.
- P. M. Schulte-Frankenfeld and F. M. Trautwein, “App-based mindfulness meditation reduces perceived stress and improves self-regulation in working university students: A randomised controlled trial,” Appl. Psychol. Health Well-Being, vol. 14, pp. 1151-1171, 2022, doi: 10.1111/aphw.12328.
- L. Hides et al., “Efficacy and outcomes of a music-based emotion regulation mobile app in distressed young people: Randomized controlled trial,” JMIR Mhealth Uhealth, vol. 7, p. e11482, 2019, doi: 10.2196/11482.
- N. A. Youssef and C. L. Rich, “Does acute treatment with sedatives/hypnotics for anxiety in depressed patients affect suicide risk? A literature review,” Ann. Clin. Psychiatry, vol. 20, pp. 157-169, 2008, doi: 10.1080/10401230802177698.
- N. Cummins et al., “Artificial intelligence to aid the detection of mood disorders,” in Artificial Intelligence in Precision Health, D. Barh, Ed., Cambridge, MA: Academic Press, 2020, pp. 231-255.
- S. Abdullah et al., “Automatic detection of social rhythms in bipolar disorder,” J. Am. Med. Inform. Assoc., vol. 23, pp. 538-543, 2016, doi: 10.1093/jamia/ocv200.
- A. Anzulewicz, K. Sobota and J. T. Delafield-Butt, “Toward the autism motor signature: Gesture patterns during smart tablet gameplay identify children with autism,” Sci. Rep., vol. 6, p. 31107, 2016, doi: 10.1038/srep31107.
- F. Taffoni et al., “Sensor-based technology in the study of motor skills in infants at risk for ASD,” in 4th IEEE RAS & EMBS Int. Conf. Biomed. Robot. Biomechatronics (BioRob), 2012, pp. 1879-1883.
- G. Bedi et al., “Automated analysis of free speech predicts psychosis onset in high-risk youths,” NPJ Schizophr., vol. 1, p. 15030, 2015, doi: 10.1038/npjschz.2015.30.
- C. M. Corcoran and G. A. Cecchi, “Using language processing and speech analysis for the identification of psychosis and other disorders,” Biol. Psychiatry Cogn. Neurosci. Neuroimaging, vol. 5, pp. 770-779, 2020, doi: 10.1016/j.bpsc.2020.06.004.
- M. McCradden, K. Hui and D. Z. Buchman, “Evidence, ethics and the promise of artificial intelligence in psychiatry,” J. Med. Ethics, vol. 49, pp. 573-579, 2023, doi: 10.1136/jme-2022-108447.
- C. A. Lovejoy, “Technology and mental health: The role of artificial intelligence,” Eur. Psychiatry, vol. 55, pp. 1-3, 2019, doi: 10.1016/j.eurpsy.2018.08.004.
- M. J. Rigby, “Ethical dimensions of using artificial intelligence in health care,” AMA J. Ethics, vol. 21, pp. 121-124, 2019, doi: 10.1001/amajethics.2019.121.
- C. G. Walsh et al., “Stigma, biomarkers and algorithmic bias: Recommendations for precision behavioral health with artificial intelligence,” JAMIA Open, vol. 3, pp. 9-15, 2020, doi: 10.1093/jamiaopen/ooz054.
- D. W. Joyce, A. Kormilitzin, K. A. Smith and A. Cipriani, “Explainable artificial intelligence for mental health through transparency and interpretability for understandability,” NPJ Digit. Med., vol. 6, p. 6, 2023, doi: 10.1038/s41746-023-00751-9.
- I. Straw and C. Callison-Burch, “Artificial intelligence in mental health and the biases of language-based models,” PLoS One, vol. 15, p. e0240376, 2020, doi: 10.1371/journal.pone.0240376.
- E. E. Lee et al., “Artificial intelligence for mental health care: Clinical applications, barriers, facilitators and artificial wisdom,” Biol. Psychiatry Cogn. Neurosci. Neuroimaging, vol. 6, pp. 856-864, 2021, doi: 10.1016/j.bpsc.2021.02.001.
- N. Koutsouleris, T. U. Hauser, V. Skvortsova and M. De Choudhury, “From promise to practice: Towards the realization of AI-informed mental health care,” Lancet Digit. Health, vol. 4, pp. e829-e840, 2022, doi: 10.1016/S2589-7500(22)00153-4.
- S. G. Finlayson et al., “The clinician and dataset shift in artificial intelligence,” N. Engl. J. Med., vol. 385, pp. 283-286, 2021, doi: 10.1056/NEJMc2104626.
- S. Carr, “AI gone mental: Engagement and ethics in data-driven technology for mental health,” J. Ment. Health, vol. 29, pp. 125-130, 2020, doi: 10.1080/09638237.2020.1714011.
- L. Balcombe and D. De Leo, “Digital mental health challenges and the horizon ahead for solutions,” JMIR Ment. Health, vol. 8, p. e26811, 2021, doi: 10.2196/26811.
- A. S. Miner et al., “Key considerations for incorporating conversational AI in psychotherapy,” Front. Psychiatry, vol. 10, p. 746, 2019, doi: 10.3389/fpsyt.2019.00746.
- M. De Choudhury, S. Dutta and J. Ma, “Measuring the impact of anxiety on online social interactions,” in Proc. Int. AAAI Conf. Web Soc. Media (ICWSM), 2018, pp. 1-9.

Current Trends in Signal Processing
| Volume | 15 |
| Issue | 03 |
| Received | 27/05/2025 |
| Accepted | 23/06/2025 |
| Published | 30/06/2025 |
| Publication Time | 34 Days |