
A pediatric practice signs up for an AI medical scribe tool. The office manager types patient names and symptoms into the tool to generate visit notes faster. Three months later, the practice discovers the AI vendor stores all inputs on servers in a country with no data protection laws — and the vendor never signed a Business Associate Agreement.
Every patient interaction entered into that tool is now an unprotected HIPAA breach. And the practice had no idea.
AI adoption in healthcare is accelerating. A 2025 AMA survey found that 66% of physicians see advantages in using AI for healthcare, and 46% of medical practices now use at least one AI-powered tool. But compliance practices have not kept pace with adoption. The short answer to "Is AI HIPAA compliant?" is: AI itself is not compliant or non-compliant. The way your practice uses AI determines compliance.
HIPAA was written in 1996, decades before generative AI existed. The law does not specifically address artificial intelligence, machine learning, or large language models. But HIPAA does regulate any technology that creates, receives, maintains, or transmits protected health information — and that includes AI tools when they touch patient data.
The rules that apply to AI are the same rules that apply to every other technology in your practice: the Privacy Rule governs how PHI may be used and disclosed, the Security Rule requires safeguards for electronic PHI, and the Breach Notification Rule applies when PHI is exposed.
The proposed 2026 HIPAA Security Rule update would add a technology asset inventory requirement that explicitly includes AI systems. Your practice would need to document every AI tool that accesses PHI, map the data flows, and verify each vendor's security posture.
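An inventory like the proposed requirement describes can start as one structured record per tool. A minimal sketch in Python — the tool names and fields here are illustrative, not a prescribed format:

```python
from dataclasses import dataclass, field

@dataclass
class AIAssetRecord:
    """One entry in a practice's AI technology asset inventory (illustrative fields)."""
    tool_name: str            # the AI product in use
    vendor: str
    touches_phi: bool         # does any PHI flow into this tool?
    baa_signed: bool          # Business Associate Agreement on file?
    data_flows: list = field(default_factory=list)  # systems PHI moves between
    data_residency: str = "unknown"                 # where inputs are stored/processed

def inventory_gaps(assets):
    """Flag tools that process PHI without a signed BAA."""
    return [a.tool_name for a in assets if a.touches_phi and not a.baa_signed]

# Example: a hypothetical scribe tool with no BAA on file
assets = [
    AIAssetRecord("ScribeBot", "ExampleVendor", touches_phi=True, baa_signed=False,
                  data_flows=["EHR -> vendor cloud"], data_residency="unknown"),
    AIAssetRecord("WebsiteFAQBot", "ExampleVendor2", touches_phi=False, baa_signed=False),
]
print(inventory_gaps(assets))  # -> ['ScribeBot']
```

Even a spreadsheet with these columns satisfies the spirit of the requirement; the point is that every AI tool, its PHI exposure, and its BAA status are written down somewhere auditable.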
Category 1: AI that never touches PHI. Some AI tools operate without any patient data. AI-powered scheduling optimization that works on anonymized time slots. AI website chatbots that answer general questions about your services. AI tools that generate marketing content or manage your social media. These tools do not create HIPAA obligations because they do not involve PHI.
Category 2: AI that processes PHI under your control. AI medical scribes, AI receptionists, AI billing tools, and AI clinical decision support systems all process PHI. These vendors are business associates under HIPAA. You need a BAA, you need to verify their security practices, and you need to ensure they meet every HIPAA requirement for handling ePHI.
Category 3: Consumer AI tools used informally with PHI. This is where the risk explodes. Staff members typing patient information into ChatGPT, Gemini, or Claude. Uploading clinical images to a general-purpose AI for analysis. Copying lab results into a free AI summarization tool. These consumer tools are not designed for HIPAA compliance, do not sign BAAs, and may use your input data for training. Every interaction involving PHI is a potential breach.
A 2024 Bain & Company survey found that 75% of AI adoption in companies happens outside of IT oversight — what researchers call "shadow AI." In healthcare, shadow AI is shadow HIPAA liability.
Before adopting any AI tool that will process patient data, verify these requirements:
1. The vendor signs a BAA. Non-negotiable. If the vendor will not sign a Business Associate Agreement, do not use the tool with PHI. Some consumer AI companies explicitly state in their terms that they are not HIPAA compliant and will not sign BAAs. OpenAI offers a HIPAA-eligible API tier with a BAA, but the free ChatGPT consumer product does not. Google offers a BAA for Google Cloud AI services, but not for the free Gemini consumer product. Know the difference.
2. Data is encrypted at rest and in transit. The AI vendor must encrypt all PHI using AES-256 (or equivalent) at rest and TLS 1.2+ in transit. Ask specifically about how data moves between your practice, the AI inference servers, and any storage systems.
3. Your data is not used for model training. Many AI companies use customer input data to improve their models. This means your patient's symptoms, diagnoses, and treatment details could become part of the AI's training dataset — accessible indirectly through future outputs. The BAA and service agreement must explicitly state that PHI will not be used for model training, fine-tuning, or any purpose beyond providing the contracted service.
4. Access controls and audit logs exist. The AI system must support role-based access controls, multi-factor authentication, and detailed audit logging. You need to know who accessed what patient data, when, and what the AI did with it.
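The audit-logging requirement boils down to recording who touched which patient's data, what was done, and when. A sketch of the kind of record to look for in a vendor's logs — the field names here are illustrative, not a standard schema:

```python
import json
from datetime import datetime, timezone

def audit_event(user_id: str, role: str, patient_id: str, action: str) -> str:
    """Build one append-only audit log entry as a JSON line: who accessed
    which patient's PHI, what the system did with it, and when (UTC)."""
    return json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,       # who (tied to a unique login, per access controls)
        "role": role,             # role-based access: what this user may see
        "patient_id": patient_id, # whose PHI was touched
        "action": action,         # what the AI or user did with it
    })

entry = json.loads(audit_event("jdoe", "front_desk", "PT-1042", "generated_visit_summary"))
print(entry["action"])  # -> generated_visit_summary
```

If a vendor cannot export records at roughly this level of detail, you cannot answer a breach investigator's most basic questions.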
5. Data residency is defined. Where is your patient data stored and processed? HIPAA does not prohibit cloud storage or international data processing, but your risk assessment must account for the jurisdictions involved. Many healthcare practices require U.S.-only data residency in their BAAs.
6. The vendor has relevant certifications. Look for SOC 2 Type II, HITRUST CSF, or ISO 27001 certifications. These third-party audits verify that the vendor's security practices meet established standards. A SOC 2 Type II report covers a 12-month audit period — not just a point-in-time assessment.
7. Clear data retention and deletion policies. How long does the AI vendor retain your patient data? Can you request deletion? When the contract ends, is all PHI destroyed? The BAA should address all of these scenarios.
Here is the HIPAA readiness of common AI categories used by small practices:
AI medical scribes (Nuance DAX, Abridge, DeepScribe): The major medical scribe vendors offer HIPAA-compliant configurations with signed BAAs. Nuance DAX (now part of Microsoft) operates under Microsoft's HIPAA framework. Abridge and DeepScribe both sign BAAs and provide SOC 2 Type II reports. These are purpose-built for healthcare and generally safe — but verify the specific product tier you are purchasing.
AI receptionists and phone systems: HIPAA-compliant AI receptionist services exist and can sign BAAs. They handle appointment scheduling, call routing, and after-hours answering with PHI protections built in. Verify that call recordings and transcripts are encrypted and that the vendor's data processing agreements explicitly cover voice data as PHI.
AI billing and coding tools: Vendors like Waystar and other revenue-cycle platforms offer HIPAA-compliant AI billing solutions with BAAs. These tools process claims data containing PHI and must meet the same standards as any business associate. Check whether the AI component processes data on the vendor's servers or on-premises.
General-purpose AI assistants (ChatGPT, Gemini, Claude): The consumer versions of these products are not HIPAA compliant. However, enterprise and API versions may be. OpenAI's API has a HIPAA-eligible tier. Anthropic offers a HIPAA-eligible API. Google Cloud offers a BAA for its AI services. The key distinction: the free consumer product versus the paid enterprise/API product with a BAA.
Shadow AI is the biggest AI-related HIPAA risk for small practices. It happens when staff use unauthorized AI tools because they are convenient: typing patient details into a consumer chatbot to draft a note, uploading a clinical image to a free AI for a quick read, or pasting lab results into an AI summarizer.
Each of these shares PHI with an unauthorized third party — a HIPAA violation regardless of intent. And because consumer AI tools are free and require no IT setup, staff can use them without anyone knowing.
Address shadow AI with three steps: inventory the AI tools your staff actually use (not just the ones IT approved), set a written policy that spells out which tools are permitted to touch PHI, and provide approved, BAA-covered alternatives for the tasks staff are currently solving with consumer AI.
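One inexpensive technical control — assuming your practice routes staff AI use through a shared gateway or browser extension — is a crude pattern check that flags obvious identifiers before text leaves the practice. A sketch; the patterns here are illustrative and nowhere near complete, and real DLP products go much further:

```python
import re

# Crude, illustrative patterns for obvious identifiers -- NOT a complete PHI detector.
PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "mrn": re.compile(r"\bMRN[:\s]*\d{5,}\b", re.IGNORECASE),
    "dob": re.compile(r"\bDOB[:\s]*\d{1,2}/\d{1,2}/\d{2,4}\b", re.IGNORECASE),
}

def flag_possible_phi(text: str) -> list:
    """Return which identifier patterns appear in text bound for a consumer AI tool."""
    return [name for name, pattern in PHI_PATTERNS.items() if pattern.search(text)]

print(flag_possible_phi("Summarize: MRN 4821907, DOB: 04/12/1981, cough for 3 days"))
# -> ['mrn', 'dob']
```

A check like this catches careless pastes, not determined misuse — which is why the policy and approved-alternative steps matter more than any filter.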
In December 2023, HHS published guidance on HIPAA and AI, clarifying how the existing rules apply when AI tools create, receive, maintain, or transmit PHI.
HHS also signaled that additional AI-specific guidance is forthcoming. The agency is watching how healthcare organizations adopt AI and will likely issue more detailed rules as the technology matures.
Before signing up for any AI tool that will touch patient data, put the seven requirements above to the vendor as direct questions: Will you sign a BAA? How is data encrypted at rest and in transit? Is our data used for model training? What access controls and audit logs do you provide? Where is our data stored and processed? What third-party certifications do you hold? What are your retention and deletion policies?
If the vendor cannot answer every question clearly, they are not ready for healthcare data.
AI can transform your practice — saving hours on administrative tasks, reducing no-shows, and improving patient communication. But every AI tool that touches PHI must meet the same HIPAA standards as your EHR, your email provider, and your billing company.
The question is not "Is AI HIPAA compliant?" The question is: "Is this specific AI tool, configured this way, with this BAA, compliant for the way my practice plans to use it?" The answer depends on the vendor, the product tier, the contract terms, and your implementation.
Start with your current AI inventory. Identify every AI tool your staff uses — including the ones IT did not approve. Then verify BAAs, confirm data handling practices, and create a policy that keeps your practice protected as you adopt new tools.
Book a free IT assessment to audit your AI tools for HIPAA compliance. We will map your AI usage, identify shadow AI risks, verify vendor agreements, and help you adopt AI safely. Explore our HIPAA compliance and managed IT services to see how we keep practices compliant while they grow.