GLACIS·EU AI Act series·EU AI Act vs HIPAA·Updated April 2026

EU AI Act vs HIPAA.

A side-by-side reference for healthcare and life-sciences operators with obligations on both sides of the Atlantic. The two regimes were drafted for different purposes — PHI protection (HIPAA) and AI-system safety and rights (EU AI Act) — and neither substitutes for the other. This crosswalk reads them in parallel across scope, trigger, evidence, penalties, enforcement, and AI-specific provisions.

Scope
HIPAA: PHI handling in the US · EU AI Act: AI systems used in the EU
Penalty ceiling
HIPAA: $63,973 per violation, $1.92M annual · EU: €35M or 7% turnover
Enforcement
HHS OCR · National competent authorities + AI Office
Status (April 2026)
HIPAA: active OCR caseload · EU AI Act: no public enforcement yet
What changed in April 2026

The Digital Omnibus on AI is in trilogue with proposed delays to 2 December 2027 (stand-alone) and 2 August 2028 (embedded). MDR/IVDR-regulated medical AI typically falls under the embedded category, so the practical EU compliance window for healthcare AI may extend to 2 August 2028 if the Omnibus is adopted. Until adoption, prudent dual-jurisdiction programmes continue planning to the original 2 August 2026 baseline.

HHS OCR has continued enforcement against healthcare AI use cases under existing HIPAA authority — most notably privacy and tracking-technology guidance. The Sharp HealthCare ambient-scribe class action filed November 2025 is still active. Organisations running dual programmes are now treating the Article 12 EU AI Act log and the HIPAA audit log as one evidence trail rather than two.

By Joe Braidwood·14 min read·Updated April 24, 2026

Executive summary

HIPAA (the Health Insurance Portability and Accountability Act, 1996; HITECH 2009) governs privacy and security of Protected Health Information in the United States. The EU AI Act (Regulation 2024/1689) governs AI-system safety, transparency, and fundamental rights across the 27 EU member states. The two regimes are complementary, not substitutive — HIPAA compliance does not satisfy EU AI Act requirements, and vice versa.

For an organisation running an AI system that touches both PHI and EU data subjects, both regimes apply simultaneously. The practical opportunity is that they share evidence shape: cryptographically attested logs, control-execution receipts, and post-incident artefacts that satisfy HIPAA’s audit-controls requirement under 45 CFR §164.312(b) and Article 12 of the AI Act in the same record.
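As an illustration of that shared evidence shape, here is a minimal Python sketch of a single log record carrying both HIPAA audit-control fields and Article 12 operational fields, made tamper-evident with an HMAC. Field names and the key handling are illustrative assumptions, not a GLACIS schema or a regulatory format.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone

# Illustrative only: field names and key handling are assumptions,
# not a GLACIS schema or a regulatory format.
SIGNING_KEY = b"replace-with-a-managed-signing-key"

def make_evidence_record(user_id, phi_resource, action, model_version, input_ref):
    """Build one record carrying both HIPAA audit-control fields
    (who / what / when / action) and Article 12 operational fields."""
    record = {
        # HIPAA 45 CFR 164.312(b): who accessed which PHI, when, doing what
        "user_id": user_id,
        "phi_resource": phi_resource,
        "action": action,
        "timestamp": datetime.now(timezone.utc).isoformat(),
        # EU AI Act Article 12: operational trace of the AI system
        "model_version": model_version,
        "input_ref": input_ref,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    # An HMAC over the canonicalised record makes it tamper-evident
    record["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_record(record):
    """Recompute the HMAC over everything except the signature;
    any edited field breaks verification."""
    body = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(record["signature"], expected)
```

The point of the sketch is the shape: one signed record answers both the OCR question (who touched which PHI) and the Article 12 question (what the system did).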

This crosswalk reads the two regimes in parallel across ten dimensions, identifies what HIPAA does not cover (AI-specific obligations) and what the AI Act does not cover (US PHI-handling), and shows how GLACIS produces a single evidence trail that satisfies both audit reviews.


Why this crosswalk matters now: Sharp HealthCare (Nov 2025)

A proposed class action filed in November 2025 against Sharp HealthCare alleges that an ambient AI scribe recorded an estimated 100,000+ patients without proper consent and that false consent statements appeared in medical records, citing California privacy law and HIPAA-adjacent consent requirements.

Healthcare AI operators with dual obligations face compounding liability — neither HIPAA documentation nor AI-Act documentation alone produces the evidence the class-action discovery process will demand.

The relationship between HIPAA and the EU AI Act

Understanding how HIPAA and the EU AI Act relate requires recognizing their fundamentally different origins and objectives. These frameworks emerged from different regulatory traditions to address different risks—yet both apply to healthcare AI systems operating across the Atlantic.

HIPAA: protecting health information

HIPAA, enacted in 1996 and substantially updated through the HITECH Act (2009), focuses on protecting the privacy and security of individually identifiable health information. Its core concern is preventing unauthorized access, use, and disclosure of Protected Health Information (PHI). HIPAA applies to “covered entities” (healthcare providers, health plans, clearinghouses) and their “business associates” who handle PHI on their behalf.

For AI systems, HIPAA asks: Is PHI adequately protected from unauthorized access, modification, or disclosure?

EU AI Act: ensuring AI safety and rights

The EU AI Act, enacted in 2024, focuses on ensuring AI systems are safe, respect fundamental rights, and operate transparently. Its core concern is preventing AI systems from causing harm to health, safety, or fundamental rights. The AI Act applies to providers and deployers of AI systems based on the risk level of the AI application, regardless of what data the AI processes.

For AI systems, the EU AI Act asks: Is this AI system designed and operated to prevent harm to individuals and society?

Complementary, not overlapping

These frameworks operate in different dimensions:

  • HIPAA is sector-specific: It applies only to healthcare and only to health information, but it applies regardless of what technology is used.
  • EU AI Act is technology-specific: It applies only to AI systems, but it applies across all sectors and regardless of what data is processed.

A healthcare AI system processing PHI for EU patients must comply with both frameworks. HIPAA governs how the system handles patient data; the EU AI Act governs how the AI system behaves and is governed. Neither substitutes for the other.

Side-by-side: ten dimensions

Primary focus
EU AI Act: Safety, transparency, and fundamental rights of AI systems · HIPAA: Privacy and security of Protected Health Information (PHI)

Jurisdiction
EU AI Act: European Union (27 member states); extraterritorial — output used in the EU triggers obligations even where provider and deployer are outside the EU · HIPAA: United States; covered entities and business associates handling PHI

Scope trigger
EU AI Act: Provision or deployment of an AI system in the EU; provider/deployer status determined by Articles 3, 25 and 26 · HIPAA: Handling of PHI by a covered entity or business associate; sector-specific (healthcare)

Risk classification
EU AI Act: Four-tier system — prohibited (Article 5), high-risk (Annex I or III), limited-risk (Article 50 transparency), minimal-risk · HIPAA: No formal tiers; all PHI must be protected; safe harbour for de-identified data under §164.514

Enforcement bodies
EU AI Act: National competent authorities in each member state plus the EU AI Office for GPAI; market surveillance under Article 74 · HIPAA: HHS Office for Civil Rights (OCR); state attorneys general; civil-monetary-penalty tiers by culpability

Penalty ceiling
EU AI Act: €35M or 7% of global turnover (prohibited practices); €15M or 3% (other non-compliance); €7.5M or 1% (incorrect information) · HIPAA: $63,973 per violation; $1,919,173 annual cap per identical provision (2025 figures, indexed)

Evidence requirement
EU AI Act: Article 11 technical documentation per Annex IV; Article 12 automatic logs; Article 9 risk-management records; Article 14 oversight traces · HIPAA: 45 CFR §164.312(b) audit controls; §164.308(a)(1)(ii)(D) information system activity review; risk analysis under §164.308(a)(1)(ii)(A)

Conformity assessment
EU AI Act: Article 43 — internal control (most Annex III systems) or notified-body assessment (biometric ID, certain medical AI); EU declaration of conformity · HIPAA: Self-attestation; no third-party certification required; OCR audits and breach investigations are reactive

Documentation retention
EU AI Act: 10 years after the AI system is placed on the market or put into service (Article 18); logs retained "appropriate to intended purpose" · HIPAA: 6 years from creation or last effective date (whichever is later) under §164.530(j)

Breach / incident notification
EU AI Act: 15 days for serious incidents to the competent authority (Article 73); shorter for fundamental-rights breaches · HIPAA: 60 days to affected individuals; annual to HHS for breaches under 500 individuals; 60 days to HHS and media for breaches of 500+
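The EU penalty ceiling is the fixed amount or the turnover percentage, whichever is higher. A small sketch of that calculation for a non-SME; the tier labels are our own shorthand, and SME-specific caps (which can work the other way) are ignored:

```python
def eu_ai_act_max_fine(tier: str, global_turnover_eur: float) -> float:
    """Upper bound of the administrative fine for a non-SME: the fixed
    amount or the turnover percentage, whichever is higher.
    Tier labels are shorthand for the three bands in the table above."""
    tiers = {
        "prohibited": (35_000_000, 0.07),   # prohibited practices (Article 5)
        "other": (15_000_000, 0.03),        # most other non-compliance
        "information": (7_500_000, 0.01),   # incorrect or misleading information
    }
    fixed, pct = tiers[tier]
    return max(fixed, pct * global_turnover_eur)
```

For example, a provider with €1B global turnover faces a ceiling of €70M for a prohibited practice (7% exceeds the €35M floor), while one with €100M turnover faces the €35M floor.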

Key differences

While both frameworks aim to protect individuals, their approaches differ substantially in several critical areas.

1. Focus: data vs system

HIPAA: Data-Centric

HIPAA’s requirements center on PHI—the 18 identifiers that make health information individually identifiable. The same AI system processing de-identified data faces no HIPAA requirements, while processing PHI triggers full compliance obligations. HIPAA is technology-agnostic: the data determines applicability.

EU AI Act: System-Centric

The EU AI Act’s requirements center on the AI system itself—its intended purpose, deployment context, and potential for harm. An AI diagnostic tool is high-risk regardless of whether it processes personal data, anonymous data, or synthetic data. The system’s function determines applicability.

2. Sector specificity vs technology specificity

HIPAA is sector-specific: It applies only within healthcare contexts. An AI system doing sentiment analysis on customer service calls faces no HIPAA requirements—unless those calls involve healthcare providers and patient health information.

The EU AI Act is technology-specific: It applies whenever AI technology is used, across all sectors. The same AI diagnostic tool faces high-risk requirements whether deployed by a hospital, an insurance company, or a pharmaceutical research lab.

3. Enforcement mechanisms


HIPAA

Enforced by HHS OCR through complaint investigations and compliance audits. Enforcement has historically focused on breach response and egregious violations. Civil monetary penalties are tiered by culpability level (unknowing, reasonable cause, willful neglect). State attorneys general can also enforce.

EU AI Act

Enforced by national market surveillance authorities in each EU member state, with coordination by the EU AI Office. Enforcement includes product market access (CE marking required), operational restrictions, and administrative fines. The AI Office oversees General Purpose AI model compliance directly.

4. Extraterritorial application

HIPAA applies to covered entities and business associates in the United States, plus foreign entities that handle PHI on behalf of US covered entities through BAAs. The trigger is the business relationship with US healthcare entities.

The EU AI Act has explicit extraterritorial reach similar to GDPR. It applies to any provider placing AI systems on the EU market or putting them into service in the EU, regardless of where the provider is located. It also applies when AI output is used within the EU, even if both provider and deployer are outside the EU.

Detailed control mapping

Despite their different focuses, HIPAA and the EU AI Act share some underlying control requirements. Organizations can leverage these overlaps to build efficient, unified governance.

Privacy and data governance

Privacy Controls Mapping

Data Minimization
HIPAA: Minimum Necessary (45 CFR 164.502(b)) · EU AI Act: Article 10(3) – training data limited to what is necessary · Synergy: High

Data Quality
HIPAA: 45 CFR 164.530(c) – reasonable accuracy · EU AI Act: Article 10(2) – training data must be relevant, representative, free of errors · Synergy: High

Data Governance
HIPAA: Policies and procedures for PHI handling · EU AI Act: Article 10 – comprehensive data governance for training, validation, testing · Synergy: Medium

Individual Rights
HIPAA: Access, amendment, accounting of disclosures · EU AI Act: Article 86 – right to explanation for high-risk AI decisions · Synergy: Medium

Consent/Authorization
HIPAA: Authorization required for non-permitted uses · EU AI Act: Transparency required; consent handled under GDPR · Synergy: Low

Security controls

Security Controls Mapping

Access Controls
HIPAA: 164.312(a)(1) – unique user IDs, emergency access · EU AI Act: Article 9(4)(b) – access controls for authorized personnel · Synergy: High

Audit Logging
HIPAA: 164.312(b) – activity logging for PHI access · EU AI Act: Article 12 – automatic logging of system operation · Synergy: Medium

Integrity Controls
HIPAA: 164.312(c)(1) – protect ePHI from alteration · EU AI Act: Article 15 – accuracy, robustness, cybersecurity · Synergy: High

Transmission Security
HIPAA: 164.312(e)(1) – encryption in transit · EU AI Act: Article 15(4) – cybersecurity appropriate to risks · Synergy: High

Risk Assessment
HIPAA: 164.308(a)(1)(ii)(A) – security risk analysis · EU AI Act: Article 9 – risk management system · Synergy: Medium

Documentation requirements

Documentation Requirements Mapping

Policies & Procedures
HIPAA: Required for all safeguards (164.530(i)) · EU AI Act: Required as part of the QMS (Article 17)

Risk Documentation
HIPAA: Risk analysis and management (164.308(a)(1)) · EU AI Act: Risk management system documentation (Article 9)

Technical Documentation
HIPAA: System documentation for security controls · EU AI Act: Comprehensive technical documentation per Annex IV

Training Records
HIPAA: Workforce training documentation (164.530(b)) · EU AI Act: AI literacy training records (Article 4)

Vendor Agreements
HIPAA: Business Associate Agreements (164.308(b)) · EU AI Act: Contracts with deployers/downstream providers (Article 25)

Retention Period
HIPAA: 6 years from creation or last effective date · EU AI Act: 10 years after the system is placed on the market

Audit Trail Requirements

Both frameworks require audit trails, but with different focuses:

HIPAA Audit Controls

  • Who accessed PHI (user identification)
  • What PHI was accessed (records, data elements)
  • When access occurred (timestamps)
  • What action was taken (read, write, delete)
  • 6-year retention requirement

EU AI Act Article 12 Logging

  • Duration of each use (start/stop times)
  • Reference database used for input data
  • Input data that triggered search/match
  • Natural persons involved in verification
  • Logs for market surveillance inspection

Key insight: Healthcare AI systems need both types of logging. HIPAA logging tracks data access for privacy protection; EU AI Act logging tracks system behavior for safety and accountability. These are complementary, not duplicative.
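One way to make that combined trail tamper-evident for either review is a hash chain in which each entry commits to its predecessor, so an edit or deletion anywhere breaks verification of everything after it. A minimal Python sketch, not a GLACIS implementation:

```python
import hashlib
import json

class EvidenceTrail:
    """Append-only, hash-chained trail: each entry commits to the previous
    entry's hash, so tampering anywhere is detectable on verification."""

    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []
        self._prev = self.GENESIS

    def append(self, entry: dict):
        """Append a HIPAA-shaped or Article 12-shaped entry to the chain."""
        body = dict(entry, prev_hash=self._prev)
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        body["hash"] = digest
        self.entries.append(body)
        self._prev = digest

    def verify(self) -> bool:
        """Walk the chain, recomputing every hash and link."""
        prev = self.GENESIS
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if body.get("prev_hash") != prev:
                return False
            if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

Because both record types live on one chain, a single verification pass covers the PHI-access entries an OCR reviewer wants and the operational entries a market-surveillance authority wants.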

Gap analysis: what HIPAA does not cover (EU AI Act-specific)

Organizations with mature HIPAA compliance programs will find significant gaps when applying EU AI Act requirements. These are requirements with no HIPAA equivalent.

EU AI Act Requirements Not Covered by HIPAA

1. Risk Classification and Prohibited AI

HIPAA has no concept of AI risk tiers or prohibited AI practices. The EU AI Act’s Article 5 bans certain AI uses outright (social scoring, manipulative AI, untargeted facial recognition). HIPAA-compliant AI could be entirely prohibited under EU AI Act.

2. Conformity Assessment and CE Marking

HIPAA requires no third-party certification. The EU AI Act requires high-risk systems to undergo conformity assessment (internal control or notified body), maintain technical documentation per Annex IV, and affix CE marking before market placement.

3. Human Oversight Requirements

Article 14 requires high-risk AI systems to be designed for effective human oversight, including the ability to interrupt or override. HIPAA has no specific human oversight requirements for automated systems.

4. Transparency Disclosures

Article 13 requires instructions for use, intended purpose, capabilities and limitations, human oversight measures, and performance metrics. HIPAA’s Notice of Privacy Practices doesn’t cover AI-specific disclosures.

5. Post-Market Monitoring

Article 72 requires systematic post-market monitoring to collect and analyze data on AI system performance throughout its lifecycle. HIPAA has no equivalent ongoing monitoring requirement.

6. Fundamental Rights Impact Assessment

Article 27 requires deployers of high-risk AI to conduct fundamental rights impact assessments before deployment. HIPAA risk assessments focus on privacy and security, not broader rights impacts.

Gap analysis: what the EU AI Act does not cover (HIPAA-specific)

Conversely, organizations with mature EU AI Act compliance will find gaps when applying HIPAA requirements.

HIPAA Requirements Not Covered by EU AI Act

1. Protected Health Information Definition

HIPAA’s specific definition of PHI (18 identifiers) and de-identification standards (Safe Harbor, Expert Determination) have no EU AI Act equivalent. The AI Act defers to GDPR for personal data, which uses different criteria.

2. Business Associate Agreements

HIPAA’s BAA requirements specify contractual obligations for PHI handling by vendors. The EU AI Act has provider-deployer contracts (Article 25) but not specific data processing agreements—those fall under GDPR.

3. Administrative Safeguards

HIPAA’s detailed administrative safeguards (security official, workforce clearance procedures, information access management, security awareness training) have no direct EU AI Act equivalent.

4. Physical Safeguards

HIPAA’s physical safeguards (facility access controls, workstation security, device and media controls) are not addressed in the EU AI Act, which focuses on system behavior rather than infrastructure.

5. Individual Rights Specific to Health Data

HIPAA provides specific rights: access to medical records, amendment of records, accounting of disclosures, restrictions on use. The EU AI Act’s Article 86 right to explanation is narrower than HIPAA’s health-specific rights.

6. Breach Notification Specifics

HIPAA’s breach notification (60-day window, media notification for 500+ individuals, annual HHS reporting) differs from the EU AI Act’s 15-day serious-incident reporting. Both may apply independently to the same event.
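The two clocks can be tracked from a single incident record. A sketch assuming only the baseline windows above; shorter Article 73 windows apply in some cases, and state-law or GDPR clocks may run in parallel:

```python
from datetime import date, timedelta

def notification_deadlines(incident_date: date,
                           involves_phi: bool,
                           serious_ai_incident: bool) -> dict:
    """Baseline clocks only: HIPAA individual notice (60 days) and
    EU AI Act Article 73 authority report (15 days). Shorter Article 73
    windows and parallel state-law or GDPR clocks are out of scope here."""
    deadlines = {}
    if involves_phi:
        deadlines["hipaa_individual_notice"] = incident_date + timedelta(days=60)
    if serious_ai_incident:
        deadlines["ai_act_authority_report"] = incident_date + timedelta(days=15)
    return deadlines
```

An AI incident involving PHI returns both deadlines, which is exactly the case a dual-jurisdiction response team has to plan for.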

Evidence requirements comparison

Both frameworks require organizations to maintain evidence of compliance. Understanding what evidence satisfies which framework is essential for efficient governance.

Evidence Requirements by Framework

Risk Assessments
HIPAA: Security risk analysis documentation · EU AI Act: Risk management system outputs (Article 9) · Unified: Integrated risk framework covering both data and AI risks

Policy Documentation
HIPAA: Written policies and procedures · EU AI Act: QMS procedures per Article 17 · Unified: Single policy set with framework-specific sections

Access Logs
HIPAA: PHI access audit trails · EU AI Act: System operation logs per Article 12 · Unified: Comprehensive logging covering both access and operations

Vendor Agreements
HIPAA: Signed BAAs · EU AI Act: Provider-deployer contracts · Unified: Combined agreements addressing both requirements

Training Records
HIPAA: HIPAA training completion · EU AI Act: AI literacy training per Article 4 · Unified: Combined training program with both modules

Incident Records
HIPAA: Breach investigation documentation · EU AI Act: Serious incident reports per Article 73 · Unified: Unified incident management with dual reporting paths

Technical Documentation
HIPAA: System security documentation · EU AI Act: Annex IV technical file · Unified: Comprehensive technical documentation satisfying both

Compliance strategy for dual-jurisdiction operators

Organizations operating healthcare AI in both US and EU markets should adopt a unified approach rather than maintaining parallel compliance programs.

GLACIS Dual-Jurisdiction Strategy

Building a Unified Compliance Framework

1. Adopt a Base Framework

Start with ISO 42001 or NIST AI RMF as your foundational AI governance framework. These provide comprehensive structures that can accommodate both HIPAA and EU AI Act requirements. Map control requirements from both regulations to your base framework.

2. Implement the Stricter Requirement

Where requirements overlap but differ in stringency, implement the stricter one. EU AI Act’s 10-year documentation retention exceeds HIPAA’s 6-year requirement—use 10 years. HIPAA’s PHI access logging may be more specific than AI Act logging—use HIPAA’s specificity plus AI Act’s operational logging.

3. Address Framework-Specific Requirements

Build additional controls for requirements unique to each framework. EU AI Act requires conformity assessment, human oversight design, and post-market monitoring. HIPAA requires BAAs, specific individual rights handling, and healthcare-specific breach notification. These don’t overlap—you need both.

4. Create Unified Documentation

Maintain a single technical documentation set that satisfies both Annex IV requirements and HIPAA system documentation needs. Include framework-specific sections where required. Use consistent terminology and cross-references to demonstrate how controls map to both frameworks.

5. Establish Dual Reporting Paths

Implement incident management that can trigger both HIPAA breach notification (HHS OCR, 60 days) and EU AI Act serious incident reporting (national authority, 15 days). Train your response team on both pathways. An AI incident involving PHI may trigger both.

Strategic advantage: Organizations that build unified compliance frameworks position themselves for faster market entry in both jurisdictions, reduced audit burden, and more efficient ongoing governance. The investment in unified infrastructure pays dividends across all regulated markets.
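Step 2's "implement the stricter requirement" applies directly to retention: keep documentation until the later of the two expiry dates. A sketch, with the leap-day handling as an illustrative choice:

```python
from datetime import date

def add_years(d: date, years: int) -> date:
    """Shift a date by whole years; a Feb 29 start rolls to Mar 1."""
    try:
        return d.replace(year=d.year + years)
    except ValueError:
        return d.replace(year=d.year + years, month=3, day=1)

def retain_until(doc_created: date, market_placement: date) -> date:
    """HIPAA: 6 years from creation or last effective date.
    EU AI Act: 10 years after the system is placed on the market.
    Stricter requirement = the later of the two dates."""
    return max(add_years(doc_created, 6), add_years(market_placement, 10))
```

Note that which rule is "stricter" depends on the dates: documentation created well after market placement can outlive the AI Act's 10-year window under HIPAA's 6-year clock.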

How GLACIS helps satisfy both frameworks

GLACIS provides continuous attestation infrastructure that generates cryptographic evidence that AI controls execute correctly. This evidence maps to requirements from both HIPAA and the EU AI Act.

HIPAA Evidence

  • Access control verification for ePHI
  • Audit logging with tamper-evident records
  • Encryption status attestation
  • Security control execution proof

EU AI Act Evidence

  • Article 12 automatic logging attestation
  • Risk management system execution proof
  • Human oversight control verification
  • Post-market monitoring attestation

Unlike policy-based compliance tools that document what should happen, GLACIS generates cryptographic proof that controls actually execute. This evidence satisfies both HIPAA Security Rule requirements for demonstrable controls and EU AI Act requirements for operational logging and monitoring.

Frequently asked questions

Does HIPAA compliance satisfy EU AI Act requirements?

No. HIPAA and the EU AI Act have different focuses and requirements. HIPAA addresses privacy and security of Protected Health Information (PHI), while the EU AI Act addresses AI system safety, transparency, and fundamental rights. HIPAA compliance provides a foundation for some data governance and security requirements, but does not satisfy EU AI Act obligations for risk management, conformity assessment, technical documentation, human oversight, or transparency disclosures.

If my healthcare AI is FDA-cleared, do I still need EU AI Act compliance?

Yes. FDA clearance does not satisfy EU AI Act requirements. However, if the AI system qualifies as a medical device under EU MDR (2017/745), the EU AI Act provides for coordinated conformity assessment through existing notified-body processes. You must still meet the AI Act's high-risk requirements (Articles 9–15), but the conformity assessment can be integrated with MDR certification. Embedded medical AI devices have an extended deadline of 2 August 2027 under the current text, proposed to move to 2 August 2028 by the Digital Omnibus.

What documentation is required for both frameworks?

Both frameworks require extensive documentation, but with different focuses. HIPAA requires policies and procedures, risk assessments, BAAs, training records, and audit logs. The EU AI Act requires technical documentation (Annex IV), risk management records, data governance documentation, quality management system records, and conformity assessment documentation. Organizations should create unified documentation that satisfies both frameworks.

How do logging requirements compare?

HIPAA requires audit controls recording PHI access (who accessed what, when). The EU AI Act Article 12 requires automatic logging of AI system operation including inputs, outputs, and events relevant to identifying risks. EU AI Act logging is more specific to AI behavior (model version, inference details, decision traces), while HIPAA focuses on data access. Healthcare AI systems need both types of logging.

Which framework has stricter penalties?

The EU AI Act has higher maximum penalties. HIPAA civil penalties reach approximately $1.92 million per year per identical provision for willful-neglect violations (2025 indexed figure). EU AI Act penalties reach €35 million or 7% of global annual turnover for prohibited AI practices, and €15 million or 3% of turnover for other non-compliance. Both frameworks can result in operational restrictions and reputational damage beyond financial penalties.

Can I use a single governance framework for both?

Yes, and this is recommended. A unified AI governance framework based on ISO 42001 or the NIST AI RMF can address requirements from both HIPAA and the EU AI Act. The key is mapping controls to both frameworks, implementing the stricter requirement where they overlap, and adding controls for the gaps. GLACIS helps organizations maintain a single source of truth for evidence that maps to multiple frameworks.

One trail, two regimes

Unified evidence for dual-jurisdiction healthcare AI.

The Glacis Agent Runtime Security & Evidence Sprint produces a single set of signed evidence receipts that satisfies HIPAA’s audit-controls requirement under §164.312(b) and Article 12 of the EU AI Act in one capture. Runtime controls run inside your infrastructure with zero sensitive-data egress. 10 business days, one named workflow, signed evidence pack on day ten.

Book the Agent Runtime Security Sprint →
See a sample evidence pack →

Related guides

  • EU AI Act series hub: articles, penalty structure, GLACIS coverage map.
  • Full compliance guide: risk categories, Articles 9–15, GPAI, conformity assessment.
  • For CMIOs: clinical AI overlay (MDR/IVDR, Annex III §5, Article 12, Article 50).
  • For General Counsel: liability allocation, vendor and deployer contracts, extraterritorial scope.
  • HIPAA-compliant AI: PHI handling, BAAs, the Security Rule.
  • ISO 42001 guide: AI management-system standard.