AI in Mental Health: What Does It Mean?

Image caption: A microscope, brain, textbook, and stethoscope in a modernist design reflecting the power of AI technology. Image: GatesNotes

Within the last few months, discussion of AI has surged as programs such as DALL·E 2 and ChatGPT have skyrocketed in popularity.

First came concerns about these tools in the classroom. Would students still learn anything if a bot did all of their work for them? How could a teacher tell a student’s writing apart from a bot’s? Concerns grew even more when ChatGPT passed multiple business and law school exams.

As the initial scare died down with the public release of so-called “AI detectors,” researchers devoted more of their time to seeing how far the technology could go while still providing accurate information.

As early as November 2022, one Forbes article argued that implementing AI technology in healthcare was “critical” to improving mental health and wellness.

The article’s author, Cindy Gordon, went on to explain the possible benefits of the technology: “AI can be used to not only detect depression risks, but also to treat depression with tools to manage depression symptoms by collecting feedback, providing personalized recommendations, delivering the right content, being a surrogate companion, and also ensuring employee voices are being heard and that their insights can help solve systemic work conditions.”

Gordon went on to propose an AI-based app that would track the user’s personal health and moods for as long as it is installed. She also suggested that the app could provide recommendations on “how to improve an organizational work practices” and flag inappropriate behavior patterns. This, in turn, could improve work environments and bring gains in productivity, equity, and inclusion.

Gordon concluded her article by reiterating that AI technology could be very helpful for people with mental illnesses: the technology would be improved constantly, the information it held would be private and secure, user requests would be answered in a timely manner, and it could reach people with limited access to counseling services.

However, that doesn’t mean the technology is without risks. In a more recent Axios article from March 2023, author Sabrina Moreno reported that the use of AI technology has healthcare researchers and providers worried.

This is primarily because of how new the technology is. Those researchers and providers are nervous about faulty algorithms and misuse, with many people turning to general-purpose programs like ChatGPT for counseling, even though the platform warns users it is not meant to be used that way.


Many organizations and healthcare providers are now investing time and some resources into exploring AI technology. Lawmakers have also begun examining AI-driven tools in search of ways to regulate them. Researcher Stephen Schueller, interviewed by Axios, said that people are “going to need more than the FDA” now that AI has been introduced to the masses.

As AI technology continues to develop, more and more potential risks and benefits come under scrutiny.
