Published on Feb 7, 2026
Can Therapists Use AI for Progress Notes Ethically and Safely?
As artificial intelligence becomes more common in healthcare software, many therapists are asking a cautious but necessary question: Is it ethical—and safe—to use AI for therapy progress notes?
The short answer is yes, potentially—but only when AI is used in a clearly limited, transparent, and privacy-preserving way. The longer answer depends on how the technology is designed and where clinical data is processed.
Why Therapists Are Right to Be Cautious About AI
Therapy documentation is not just administrative work. Progress notes contain sensitive, identifiable, and often deeply personal information. Ethical frameworks across psychology, counselling, and social work consistently emphasize:
Client confidentiality
Professional responsibility for records
Clear limits on third-party access to data
Informed use of technology
Introducing AI into this workflow without careful consideration can raise concerns about data exposure, loss of control, and blurred lines around clinical responsibility.
Professional organizations have echoed this caution:
American Psychological Association – Ethical Principles of Psychologists and Code of Conduct:
https://www.apa.org/ethics/code
Canadian Psychological Association – Ethical Use of Technology:
https://cpa.ca/docs/File/Ethics/CPAe-therapyGuidelinesUpdate2020.pdf
The Key Ethical Distinction: Assistance vs. Judgment
One of the most important distinctions in ethical AI use is the difference between documentation support and clinical judgment.
Ethical AI tools:
Assist with drafting, formatting, or organizing notes
Reduce repetitive administrative work
Leave all interpretive and diagnostic decisions to the clinician
Unethical or high-risk tools:
Analyze client psychology or “detect” progress
Suggest interventions or treatment decisions
Replace clinician oversight with automated outputs
Any AI system used in therapy should clearly state what it does not do.
Cloud-Based AI and Ethical Complexity
Many AI documentation tools rely on cloud infrastructure. While some advertise compliance with healthcare regulations, cloud-based systems still introduce ethical complexity:
Session data must leave the clinician’s device
Third parties may process or store information
Ongoing privacy depends on vendor policies and contracts
Breaches or misuse can still occur despite safeguards
For therapists, this means ethical responsibility remains even when data handling is outsourced.
The U.S. Department of Health & Human Services notes that covered entities retain HIPAA responsibility even when they use cloud service providers:
https://www.hhs.gov/hipaa/for-professionals/special-topics/cloud-computing/index.html
Local, On-Device AI as an Ethical Alternative
Local AI systems approach the problem differently. Instead of transmitting data to external servers, all processing happens directly on the clinician’s device.
From an ethics standpoint, local-only AI offers several advantages:
Client data never leaves the clinician’s computer
No third-party access to session content
Reduced exposure to breaches or misuse
Clear alignment with confidentiality principles
Easier justification during audits or supervision
Local processing does not eliminate professional responsibility—but it does reduce unnecessary risk.
What Ethical AI Documentation Should Look Like
When evaluating AI tools for progress notes, therapists may want to ask:
Does the tool store or transmit client data externally?
Does it clearly limit itself to documentation support?
Can I use it offline?
Who has access to my data?
Can I explain its use to a supervisor, client, or licensing body?
If the answers are unclear, that ambiguity itself may be a red flag.
A Privacy-First Example
Some newer tools are being built with these ethical considerations in mind. SessionWise, for example, is a macOS application designed to support therapy documentation using on-device AI only. There are no servers, no user accounts, and no data collection. Notes, transcripts, and drafts remain stored locally on the clinician’s Mac, and AI assistance is limited to drafting and organization rather than interpretation or judgment.
For therapists who want help with documentation while maintaining clear ethical boundaries, local-first tools like this represent a more conservative and defensible approach.
Learn more at https://sessionwise.app