Clinical Work in an AI World: A Practitioner’s View
- Lynn Doris, MBA, LCSW, M-CASAC
The age of Artificial Intelligence (AI) is upon us, and by all accounts, its impact is — and will continue to be — astounding. Revolutionary may not be too strong a word. We are already seeing major changes across the workplace (automation), healthcare (diagnostics), education (learning platforms), creativity (AI-generated art), scientific discovery (accelerated research), and more. The political, economic, and ethical implications could fill volumes — and they have.
But what about addiction treatment and behavioral health?

For one, AI is unlikely to pose a major threat of job elimination in the clinical arena. While reductions are occurring in fields like accounting, data processing, manufacturing, and transportation, addiction counselors, social workers, and behavioral health practitioners remain largely insulated. The reason is simple: complexity and human connection.
Clinical work is defined by the bond between a skilled, compassionate professional and a patient struggling with addiction or mental illness. It requires listening, empathizing, and responding with both heart and intellect. Clinicians create a space of psychological safety where patients can confront deep struggles — something no computer can replicate.
While clinical roles appear safe, significant changes in how we do the work should be expected — both positive and negative.
Opportunities: How AI Can Enhance Clinical Work
Here are a few ways AI-based tools are already improving the field, many of which I've personally used with success:
Improved Documentation: Clinicians can enter progress notes, narratives, or letters (with no patient-identifying information) into tools like ChatGPT and receive polished, professional text in seconds.
Treatment Planning: AI tools trained on clinical sources (DSM-5-TR, ICD-10, SAMHSA, NIDA, ASAM, NAADAC, APA, NASW) can support clinicians in formulating and tailoring treatment plans — while always requiring human judgment.
Supervision & Training: AI can instantly generate PowerPoint slides, handouts, and training materials on clinical topics — a time-saving tool for educators and supervisors.
Translation: Recently, in preparing materials for our new Spanish-speaking group, I used ChatGPT to translate intake forms — which our Spanish-speaking counselor then verified as accurate.
AI offers countless time-saving and quality-enhancing opportunities in clinical and administrative work.
Risks & Cautions
However, there are risks to navigate carefully:
Data Privacy: Patient-identifying information must never be submitted to AI tools. Federal confidentiality and HIPAA compliance remain paramount.
Over-reliance: Clinicians must avoid replacing nuance and human connection with AI-generated recommendations. The quality of treatment depends on authentic relationships.
Accuracy & Bias: AI tools are not infallible. Their models may be flawed, incomplete, or outdated — accuracy must always be verified.
Technology has brought enormous advantages — ask any of us who remember carrying mountains of patient charts pre-electronic records! Still, we must stay vigilant in protecting what matters most: the human connection at the heart of healing.
To close, here’s a little humor: Full disclosure — I used ChatGPT to check a few items in this article. I could have entered the topic and a few ideas and instantly received a complete article — probably a better one than this.
But it wouldn’t have been mine. 😊