PAPER: The Digital Divide: How AI Accessibility May Displace Professional Care for Marginalized Populations
The rapid integration of artificial intelligence (AI) into mental health care is often hailed as a solution to the global shortage of therapeutic resources. As mental disorders continue to surge and demand exceeds the capacity of practitioners, AI-enabled technologies offer a promising avenue for bridging the gap (Cecil et al., 2025). However, a closer examination of recent research reveals a concerning paradox: the very accessibility of AI may inadvertently encourage marginalized populations to replace professional therapy with algorithmic alternatives, potentially exposing them to inferior care and negative mental health outcomes.
The Allure of Accessibility for Vulnerable Groups
For marginalized populations, including youth students, racial minorities, low-income populations, rural residents, and groups with specific cultural needs such as refugees and undocumented immigrants, traditional mental health services often fall short. Barriers such as stigma, cost, and a lack of culturally competent care can make professional therapy difficult to access. In this context, generative AI (GenAI) chatbots present an attractive alternative. They offer immediate, real-time responses and are perceived as providing "non-judgmental" feedback, which can create a sense of security for those fearing shame or judgment (Chan, 2025).
Research indicates that youth students, a demographic often grappling with mental health challenges, value AI for its accessibility and the anonymity it provides (Chan, 2025). The availability of these tools allows users to bypass the logistical and financial hurdles of professional care. Considering that racial minorities hold a systemic and historical mistrust of the healthcare system, low-income populations lack the means to pay for care, people in rural areas may not have a nearby provider, and youth may lack the maturity or knowledge to find good help, it is easy to understand why people would turn to AI instead of a human healthcare provider. However, this convenience carries a significant risk: vulnerable individuals may opt for the "path of least resistance," relying on AI as a primary therapist rather than as a supplement to professional care (Chan, 2025).
The Illusion of Empathy and the Quality Deficit
While AI may be accessible, the quality of care it provides differs fundamentally from that of a human therapist. A 2025 study comparing AI chatbots to human psychotherapists found that while AI scored significantly higher in "perceived" empathy and perspective-taking, it lacked the nuance of human interaction (Yonatan & Brukner, 2025). AI responses were predominantly "supportive," whereas human therapists were significantly more likely to use complex and necessary intervention strategies, such as exploring dysfunctional patterns, challenging harmful ideas, and asking clarifying questions (Yonatan & Brukner, 2025). A human therapist is not just a sounding board or an unending source of validation for a client's thoughts and actions. Therapists are trained to be supportive when support is needed and corrective when correction is needed, offering a balanced, educated perspective that only a professional can provide. This is why they are so valuable and can facilitate such impactful transformations for their clients.
For marginalized individuals dealing with complex challenges that require layered solutions, AI responses are frequently insufficient and ill-informed. Given that marginalized communities already receive substantially fewer healthcare services than they need, a bad experience consulting AI could further discourage them from seeking help in the future and lead to disastrous consequences. The "Problem of a Narrowly Intelligent Therapist," as identified in student perceptions, highlights that AI is often not equipped to handle the adaptable and holistic nature of human psychotherapy (Chan, 2025). Students described AI communications as "cold words" and expressed significant concerns about the lack of genuine human connection, which is essential for a therapeutic alliance (Chan, 2025). Relying on AI means missing out on the "nuanced communication available to human therapists through speech, facial expressions, and body language" (Chan, 2025).
Negative Effects on Mental Health Outcomes
The substitution of professional care with AI can lead to tangible negative effects on overall mental health.
Superficial Treatment of Complex Issues: Human therapists are trained to foster insight and explore maladaptive patterns to promote deep change (Yonatan & Brukner, 2025). They are also trained to diagnose chronic mental health conditions that can have severe negative impacts on quality of life when not properly treated. In contrast, AI’s tendency toward basic support may leave deep-seated psychological issues unaddressed.
Risk of Harmful Interactions: There is a danger that AI's inability to adapt to emotional cues could exacerbate distress. Students have noted that if an AI uses "harmful words," it could make a user "even more anxious," potentially worsening their condition rather than solving it (Chan, 2025).
Safety Concerns: Clinicians prioritize the ability to predict and monitor high-risk behaviors, such as suicidality, and they offer resources tailored to the individual's needs and eligibility (Fischer et al., 2025). If marginalized individuals rely solely on commercial AI chatbots without professional oversight, they lose the critical safety benefits of professional therapy. AI is not trained to make referrals to specialists or local programs based on a patient's circumstances, execute life-saving crisis response, prescribe essential medication, or build comprehensive treatment plans that address the whole person (Fischer et al., 2025). It therefore falls short in many crucial areas that can be "make or break" for people's mental health.
Privacy Issues: It is also important to note that commercial AI chatbots are not bound by confidentiality laws such as HIPAA. Depending on a company's user agreements and terms and conditions, disclosures shared with an AI could theoretically be shared beyond the user with little to no consequence to the software company. This is not the case when treatment is sought from licensed professionals.
Conclusion
The integration of AI in mental health should aim to support, not replace, the human practitioner. While AI can improve efficiency for clinicians and provide a starting point for patients looking into local treatment resources, relying on it as a substitute for professional care risks creating a two-tiered mental health system: those with means access human experts capable of providing quality care, while those without are left with inadequate AI support. To prevent a decline in the mental health of vulnerable groups, it is crucial that AI remain a complement to, rather than a replacement for, the immensely intricate and intensely influential connection of human therapy.
SCIENTIFIC REFERENCES
Cecil, J., Kleine, A.-K., Lermer, E., & Gaube, S. (2025). Mental health practitioners’ perceptions and adoption intentions of AI-enabled technologies: an international mixed-methods study. BMC Health Services Research, 25(1), 1–18. https://doi.org/10.1186/s12913-025-12715-8
Chan, C. K. Y. (2025). AI as the therapist: Student insights on the challenges of using generative AI for school mental health frameworks. Behavioral Sciences, 15(3), 287. https://doi.org/10.3390/bs15030287
Fischer, L., Mann, P. A., Nguyen, M.-H. H., Becker, S., Khodadadi, S., Schulz, A., Edwin Thanarajah, S., Repple, J., Hahn, T., Reif, A., Salamikhanshan, A., Kittel-Schneider, S., Rief, W., Mulert, C., Hofmann, S. G., Dannlowski, U., Kircher, T., Bernhard, F. P., & Jamalabadi, H. (2025). AI for mental health: clinician expectations and priorities in computational psychiatry. BMC Psychiatry, 25(1), 1–8. https://doi.org/10.1186/s12888-025-06957-3
Yonatan, L. R., & Brukner, H. (2025). Comparing perceived empathy and intervention strategies of an AI chatbot and human psychotherapists in online mental health support. Counselling & Psychotherapy Research, 25(1), 1–9. https://doi.org/10.1002/capr.12832