The Ethical and Responsible Use of AI in Sessions Health

The use of artificial intelligence (AI) in healthcare, particularly in mental health, offers incredible promise. From streamlining administrative tasks to assisting with clinical documentation, AI has the potential to give therapists and practice owners more time and energy to focus on their clients and themselves.

But with that potential comes responsibility. As AI becomes more embedded in clinical workflows, important questions emerge around privacy, data stewardship, transparency, and environmental impact.


Our approach to ethical AI

At Sessions Health, we believe the power of AI must always be matched with a deep commitment to ethics, transparency, and trust.

Our approach is grounded in a simple principle: AI should serve clinicians while protecting the privacy of the clients they care for. That means setting clear boundaries around how data is used, prioritizing transparency over convenience, and rejecting practices that treat sensitive health information as a commodity.

Trust is at the foundation of everything we do, and AI is no exception.


Why we build AI features

At Sessions Health, our mission is to help mental health professionals spend less time on administrative work and more time with their clients.

When thoughtfully designed, AI can play a meaningful role in that mission. It can reduce documentation burden, support accuracy, and give clinicians back their time. Our goal is to help clinicians sustain long, healthy careers without burnout.

While many stand-alone AI tools exist, they often require duplicate data entry or copying and pasting sensitive information between systems. By integrating AI directly into Sessions Health, we’re able to streamline that process, securely leveraging the information already in your account to reduce friction and save time.

Your trust remains essential. That’s why every AI feature we develop is guided by the same principles that define our platform: ethics, transparency, inclusivity, and a deep commitment to protecting client privacy.


Prioritizing privacy with HIPAA-compliant AI

Mental health professionals are entrusted with some of the most sensitive information a person can share. That trust must extend to the technology they use. That’s why our AI-powered features are built with privacy at the foundation.

Today, many clinicians are unknowingly putting themselves at risk by entering client information into AI tools that are not covered by a Business Associate Agreement (BAA). Without that protection, well-intentioned use can lead to HIPAA violations, civil penalties, licensure issues, and potential legal exposure.

Sessions Health removes that risk so you remain HIPAA-compliant. But we also go further to ensure your data is handled with the highest level of care:

  • Our AI models do not retain or learn from your data. Your data will never be used to train models, contribute to a broader dataset, or feed an AI processor's or company-built repository.
  • Although a BAA legally permits processing unmasked data, our tools mask sensitive identifiers first. This extra measure protects PII (Personally Identifiable Information) and PHI (Protected Health Information) before your data reaches any AI processor.
  • We destroy all session recordings immediately after a transcript is generated.
  • We destroy all transcripts immediately after the note is signed, or after 7 days if the note remains unsigned.

Sessions Health is a steward of your data, not its owner. Whether or not you use our AI features, we will never use your data for training, resale, or any other commercial reuse.


The problem with some AI products and EHRs

Not all AI solutions are built with protection in mind. Some EHR platforms include terms that allow them to use clinician notes and client data for their own purposes, including the ability to:

  • Use clinician-authored notes to train future AI models
  • Retain sensitive session data longer than necessary or indefinitely
  • Combine user data for commercial AI development
  • Share or sell your data to third parties for additional monetization

These practices raise ethical concerns, introduce risk, and undermine the trust between clinician and client.

To be clear, some EHRs do a strong job of protecting clinician and client data. But the differences often live in the fine print and are easy to miss.

At Sessions Health, your data is not a product, and our role is to protect it, not profit from it.


Flexible, clinician-centered AI Assist

AI Assist isn’t a black box that takes over your work. It’s a helpful assistant that works the way you work. We designed it to be flexible, customizable, and fully supportive of the individual needs of each practice.

AI Assist helps throughout your sessions:

Pre-session summaries: Help you quickly prepare for upcoming appointments by generating a concise overview of recent client context.

Multiple ways to generate notes:

  • Type a written summary of your session
  • Transcribe a live telehealth session directly within Sessions Health
  • (Coming soon) Transcribe in-person sessions
  • (Coming soon) Upload recordings from in-person or telehealth sessions

Once your input is provided, AI Assist generates a progress note in your preferred format:

  • SOAP (Subjective, Objective, Assessment, Plan)
  • DAP (Data, Assessment, Plan)
  • BIRP (Behavior, Intervention, Response, Plan)
  • Narrative notes

You stay in control of how your notes sound and what they include. Customize:

  • How you’re referenced (e.g., therapist, clinician, doctor)
  • How your client is referenced (e.g., client, patient)
  • Whether to include pronouns
  • Level of detail (concise, intermediate, detailed)
  • Therapy modalities used (CBT, DBT, EMDR, etc.)
  • Additional custom instructions

AI Assist also learns from your edits, adapting over time to better reflect your clinical voice, documentation style, and therapeutic approach.

The result? Faster documentation that still reflects your voice and meets your standards.


Ethical AI also means environmental responsibility

AI has a real-world environmental footprint. The computing power required to run AI systems consumes energy, contributes to carbon emissions, and places demand on water resources. While the impact of a single AI-generated note may be tiny, the cumulative effect across millions of interactions and across hundreds of platforms adds up quickly. At Sessions Health, we believe responsible AI includes accountability for these impacts.

As part of our commitment, we actively measure the energy usage associated with AI across our platform and have partnered with ClimeCo to offset our carbon footprint. While fully quantifying and reducing AI’s environmental impact is complex and evolving, that complexity isn’t an excuse for inaction.

We believe technology companies have a responsibility to take meaningful, measurable steps toward sustainability. Clinicians and practice owners concerned about AI’s environmental impact can feel confident knowing our AI features are designed with sustainability in mind and guided by a commitment to a more responsible future.


Please see our FAQs for more details about AI Assist. Do you have thoughts or suggestions on how Sessions Health can expand the ethical use of AI? We’d love to hear them! You can email us at support@sessionshealth.com.
