HIPAA Encryption Requirements for 2026

In 2020, Lifespan Health System wrote a check to the Office for Civil Rights for $1.04 million. The reason: an unencrypted laptop was stolen from an employee’s car. The laptop contained records for 20,431 patients. Names, dates of birth, Social Security numbers, diagnoses, medications — all of it sitting on a hard drive with no encryption.

Here’s the part that should keep every practice administrator up at night: if that laptop had been encrypted, HIPAA’s safe harbor provision would have meant no breach notification was required. No OCR investigation. No settlement. No press coverage. No Wall of Shame listing. The entire $1.04 million penalty traces back to one missing setting that would have cost nothing to enable.

Lifespan isn’t an outlier. Children’s Medical Center of Dallas paid $3.2 million in 2017 over unencrypted devices containing patient data. MD Anderson Cancer Center was hit with $4.3 million in penalties for the same issue — unencrypted laptops and thumb drives — even though its own internal policy required encryption; it was simply slow to implement it. (A federal appeals court vacated that penalty in 2021, but only after years of litigation.) The pattern repeats across years of enforcement history: organizations that skip encryption pay for it, and they pay enormously more than encryption ever would have cost.

This article breaks down exactly what HIPAA requires for encryption, what’s changing under the proposed 2026 Security Rule, and what you need to do right now to protect your practice and your patients.

What HIPAA Actually Says About Encryption

The HIPAA Security Rule addresses encryption in two places:

  • 45 CFR 164.312(a)(2)(iv) — encryption of electronic protected health information (ePHI) at rest. This covers data stored on hard drives, servers, backup media, USB drives, and any other storage medium.
  • 45 CFR 164.312(e)(2)(ii) — encryption of ePHI in transit. This covers data moving across networks: emails, file transfers, connections between your systems and your vendors.

Both of these specifications are currently classified as “addressable” under the Security Rule. And this is where most of the confusion — and most of the fines — originate.

“Addressable” does not mean optional. It never has. What it means is: you must evaluate whether the safeguard is reasonable and appropriate for your organization. If it is, you implement it. If it genuinely isn’t, you must document why and implement an equivalent alternative that provides the same level of protection. The third option — just skipping it — isn’t on the menu.

We wrote an entire article about this distinction because it’s the most expensive misunderstanding in HIPAA compliance: Why “Addressable” Doesn’t Mean “Optional”.

In practice, OCR treats encryption as the expected standard. When investigators show up after a breach and find unencrypted devices with no documentation explaining why encryption wasn’t implemented, the conversation goes badly. The burden is on you to justify the absence of encryption, not on OCR to prove you should have had it.

Encryption at Rest: Protecting Stored Patient Data

Encryption at rest means that data stored on any device or medium is unreadable without the proper decryption key. If a laptop is stolen, a server is compromised, or a backup drive is lost, encrypted data is useless to whoever finds it.

Full-Disk Encryption on Workstations and Laptops

This is the lowest-hanging fruit in all of healthcare security. Both major operating systems include free, built-in full-disk encryption:

  • BitLocker for Windows (included in Windows 10/11 Pro and Enterprise editions)
  • FileVault for Mac (included in every version of macOS)

Turning either of these on takes minutes. Once active, the entire hard drive is encrypted automatically. The user experience doesn’t change — staff log in the same way they always have. The only difference is that if the device is stolen or lost, the data on it is unreadable.

There is virtually no defensible reason for any practice to have unencrypted laptops or workstations in 2026. BitLocker and FileVault are free, built into the operating system, and transparent to end users. Every enforcement case involving an unencrypted stolen device is a case that didn’t need to happen.
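Verifying is better than assuming. As a minimal sketch, the encryption status of a machine can be checked from the command line using the built-in admin tools mentioned above — BitLocker’s manage-bde on Windows and FileVault’s fdesetup on macOS. The script below only constructs and runs the OS-native command; run it with administrator rights to get real output.

```python
# Sketch: report full-disk encryption status on the current machine.
# Assumes the built-in tools named above: BitLocker's manage-bde on
# Windows and FileVault's fdesetup on macOS. Illustrative only.
import platform
import subprocess

def fde_status_command(system=None):
    """Return the OS-native command that reports disk-encryption status."""
    system = system or platform.system()
    if system == "Windows":
        return ["manage-bde", "-status"]   # BitLocker, all volumes
    if system == "Darwin":
        return ["fdesetup", "status"]      # FileVault
    raise NotImplementedError(f"No built-in FDE check scripted for {system}")

if __name__ == "__main__":
    cmd = fde_status_command()
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout or result.stderr)
```

On an encrypted Mac, fdesetup status typically reports that FileVault is on; manage-bde reports protection status per volume. Either way, the output belongs in your risk-assessment documentation.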

Database and Application Encryption

If your practice runs on-premises servers or stores patient data in local databases, those databases should be encrypted using AES-256 — the NIST-approved encryption standard (FIPS 197), ideally implemented in modules validated under FIPS 140-2 or its successor, FIPS 140-3. Support varies by database: SQL Server and Oracle offer transparent data encryption (TDE) that encrypts the entire database without requiring application changes, MySQL provides tablespace encryption in InnoDB, and PostgreSQL has no built-in TDE, so practices running it typically rely on full-disk or filesystem-level encryption underneath the database.

For cloud-based EHR systems, your vendor handles this — but you should verify it. Ask your EHR vendor: “Is our data encrypted at rest? What encryption standard do you use?” The answer should be AES-128, AES-192, or AES-256. If they can’t answer, that’s a red flag worth digging into during your next risk assessment.

Mobile Devices

iPhones and iPads encrypt data by default when a passcode is set. This is one of the few areas where the default settings actually get you to compliance. The key is making sure every device that accesses patient data has a passcode enabled — no exceptions.

Android devices vary by manufacturer and version. Most modern Android devices (version 10 and later) encrypt by default, but older devices may not. If your practice allows staff to use personal Android phones to access email or patient portals, you need a policy requiring encryption and a way to verify it.

Backup Encryption

This is the one that gets overlooked. Practices diligently encrypt their laptops and servers, then back everything up to an unencrypted external hard drive that sits in an unlocked closet. Or they send unencrypted backups to a cloud service without verifying that the data is encrypted in storage.

Your backup media — whether it’s external drives, tape, or cloud storage — needs to be encrypted with the same rigor as your primary systems. An unencrypted backup is a copy of everything you worked so hard to protect, sitting in a format anyone can read.

USB Drives and Portable Media

The safest policy is to prohibit USB drives entirely for anything involving patient data. If your practice needs portable media for legitimate purposes, those drives must be encrypted. Manufacturers like Kingston and Apricorn sell hardware-encrypted USB drives that meet FIPS 140-2 standards. They cost more than a basic thumb drive from an office supply store, but they cost infinitely less than a breach.

Encryption in Transit: Protecting Data in Motion

Encryption in transit protects data while it’s moving between systems — across your office network, over the internet, or between your practice and your vendors.

TLS 1.2 or Higher for All Data Transmission

TLS (Transport Layer Security) is the protocol that encrypts data moving across networks. The current minimum acceptable version is TLS 1.2. Older versions (TLS 1.0, TLS 1.1, SSL) have known vulnerabilities and should be disabled on all systems.

If your EHR, patient portal, or billing system connects over the internet, verify that it’s using TLS 1.2 or higher. Most modern systems do this by default, but legacy systems may not. Your IT person can check this in a few minutes.
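That check can also be scripted. Here is a minimal sketch using Python’s standard ssl module; the hostname is a placeholder, not a real system.

```python
# Sketch: enforce a TLS 1.2 floor and report what a server negotiates.
# The hostname below is a placeholder for your own portal or EHR endpoint.
import socket
import ssl

def strict_tls_context():
    """A client context that refuses TLS 1.0, TLS 1.1, and SSL outright."""
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    return ctx

def negotiated_tls_version(host, port=443):
    """Connect with the strict context and return the negotiated version."""
    with socket.create_connection((host, port), timeout=5) as sock:
        with strict_tls_context().wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()

if __name__ == "__main__":
    print(negotiated_tls_version("portal.example.com"))
```

If the connection fails with a handshake error, the server is offering nothing newer than TLS 1.1 — exactly the legacy configuration that should be retired.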

HTTPS for Web-Based Systems

Every web-based system that handles patient data — your patient portal, your cloud EHR, your practice management system — must use HTTPS, not HTTP. HTTPS is simply HTTP with TLS encryption layered on top. If the URL in your browser bar doesn’t start with “https://” when you’re accessing a system that handles ePHI, something is wrong.

This applies to your own systems and to your vendors’ systems. If a vendor’s portal doesn’t use HTTPS, that’s a compliance conversation you need to have with them immediately.

Email Encryption

Email is one of the most common ways patient data leaks, and one of the hardest to secure properly.

At a minimum, your email system should use TLS encryption between mail servers. Most major email providers (Microsoft 365, Google Workspace) do this by default when both the sending and receiving servers support it. But TLS between servers is opportunistic — if the receiving server doesn’t support TLS, the email goes out unencrypted and nobody gets notified.

For better protection, consider:

  • Portal-based secure messaging: The recipient gets a notification email and clicks a link to read the secure message in a web portal. This is how most HIPAA-compliant email services work (Paubox, Virtru, Hushmail for Healthcare).
  • End-to-end encryption: The message is encrypted from the sender’s device to the recipient’s device. More secure, but harder to implement and use.

The practical takeaway: if your practice sends patient information by email, you need an encryption solution beyond basic TLS. “We use Gmail” or “We use Outlook” is not sufficient documentation of your email encryption controls.
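To illustrate the fail-closed behavior that opportunistic TLS lacks, here is a hedged sketch using Python’s standard smtplib. The host and addresses are placeholders, and enforcing transport encryption this way is a floor, not a substitute for a HIPAA-compliant email service.

```python
# Sketch: refuse to send unless the relay completes STARTTLS at TLS 1.2+.
# Host and addresses are placeholders. This enforces transport encryption
# only; it is not, by itself, a HIPAA-compliant email solution.
import smtplib
import ssl
from email.message import EmailMessage

def build_message(sender, recipient, subject, body):
    msg = EmailMessage()
    msg["From"], msg["To"], msg["Subject"] = sender, recipient, subject
    msg.set_content(body)
    return msg

def send_with_mandatory_tls(host, msg, port=587):
    ctx = ssl.create_default_context()
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    with smtplib.SMTP(host, port, timeout=10) as smtp:
        # starttls() raises SMTPNotSupportedError if the server does not
        # offer TLS, so the message is never sent in the clear.
        smtp.starttls(context=ctx)
        smtp.send_message(msg)
```

The design choice worth noting: opportunistic TLS falls back to plaintext silently, while this sketch raises an error and sends nothing — the failure mode you want when the payload is patient data.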

VPN for Remote Access

If staff access practice systems remotely — from home, from a satellite office, from a hospital — they should be using a VPN (Virtual Private Network) that encrypts the connection between their device and your network. This is especially important when staff connect from public Wi-Fi at coffee shops, airports, or hotels.

Wireless Network Encryption

Your office Wi-Fi network should use WPA3 encryption (preferred) or WPA2 at minimum. WPA and WEP are insecure and should never be used. Your clinical network (the one your EHR and workstations connect to) should be separate from your guest network, and both should require passwords.

The Safe Harbor Provision — Why Encryption Is Your Best Insurance

This is the section that makes the business case for encryption more clearly than any technical argument ever could.

Under the Breach Notification Rule (45 CFR 164.402), notification obligations apply only to “unsecured” protected health information — and PHI that has been encrypted in accordance with NIST guidance, where the encryption key has not been compromised, is not “unsecured.” A loss or theft of properly encrypted data is therefore not a reportable breach.

Read that again. If an encrypted laptop is stolen from your employee’s car, and the encryption meets NIST standards (AES-128, AES-192, or AES-256 under FIPS 140-2), you do not have to:

  • Report the incident to OCR
  • Send notification letters to every affected patient
  • Notify the media (required for breaches affecting 500+ individuals)
  • Appear on OCR’s Breach Portal (the “Wall of Shame”)
  • Undergo an OCR investigation

The financial implications are staggering. Breach notification alone costs an estimated $100-$400 per affected record when you factor in mailing costs, call center setup, credit monitoring services, legal review, and public relations. For a breach affecting 10,000 patients, that’s $1 million to $4 million just in notification costs — before any fine is assessed.
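The arithmetic is simple enough to sanity-check. A quick sketch using the per-record estimates above:

```python
# Back-of-the-envelope notification cost, using the article's $100-$400
# per-record estimate. These are estimates, not quotes.
def notification_cost_range(records, low_per_record=100, high_per_record=400):
    return records * low_per_record, records * high_per_record

low, high = notification_cost_range(10_000)
print(f"${low:,} to ${high:,}")  # prints "$1,000,000 to $4,000,000"
```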

According to IBM’s annual Cost of a Data Breach report, the average healthcare breach cost $9.8 million in 2024 — down from $10.9 million the year before, yet healthcare remained the most expensive industry for breaches for the fourteenth consecutive year.

Encryption costs virtually nothing by comparison. BitLocker and FileVault are free. TLS is built into every modern system. Database encryption is a configuration setting. The entire cost of encrypting a small practice’s infrastructure might be a few hundred dollars in IT time.

Look at it from the other direction: Children’s Medical Center of Dallas paid $3.2 million in fines over unencrypted devices. The cost of encrypting every device in their organization would have been a fraction of 1% of that fine. Encryption is the single most cost-effective compliance investment any healthcare organization can make.

What the Proposed 2026 Security Rule Changes

Everything above describes the current rules. What’s coming is even more straightforward.

The proposed HIPAA Security Rule update, published as a Notice of Proposed Rulemaking on January 6, 2025, eliminates the “addressable” category entirely. Under the proposed rule, encryption of ePHI at rest and in transit moves from “addressable” to “required”. No more documented exceptions. No more “we evaluated it and decided it wasn’t reasonable.” You encrypt, or you’re noncompliant.

The proposed rule specifically calls out AES-256 for data at rest and TLS 1.2 or higher for data in transit.

The final rule is expected around May 2026, with a 240-day compliance window after publication. That puts the hard deadline somewhere around January 2027.

Organizations that implement encryption now are ahead of the mandate. Organizations that wait will be scrambling to encrypt every device, every database, every email system, and every data connection in their environment before the clock runs out. That’s not a project you want to rush.

If you haven’t reviewed the full scope of what’s changing, our breakdown covers all seven major changes: The New HIPAA Security Rule Is Coming: 7 Major Changes for 2026.

Encryption Checklist for Small Practices

Use this checklist to audit your current encryption posture. For each item, document whether it’s in place, and if not, what your remediation plan is. This documentation becomes part of your risk assessment.

Laptops and Desktops

  • BitLocker enabled on every Windows workstation and laptop
  • FileVault enabled on every Mac
  • No unencrypted devices that store or access ePHI

Mobile Devices

  • Passcode (and therefore encryption) enforced on every iPhone and iPad that accesses patient data
  • Android devices verified as encrypted, not assumed
  • Written policy covering personal devices that access email or patient portals

Email

  • TLS enabled between mail servers
  • Encryption solution beyond basic TLS — portal-based secure messaging or end-to-end encryption — for any email containing patient information

Cloud Storage and EHR

  • Written confirmation from each vendor that data is encrypted at rest (AES-128, AES-192, or AES-256)
  • Vendor portals accessed only over HTTPS

Backups

  • All backup media — external drives, tape, cloud — encrypted
  • Cloud backup provider’s at-rest encryption verified, not assumed

USB Drives and Portable Media

  • USB drives prohibited for patient data, or
  • Hardware-encrypted, FIPS 140-2 validated drives used exclusively

Wireless Networks

  • Office Wi-Fi secured with WPA3 (preferred) or WPA2 at minimum
  • WEP and original WPA disabled everywhere
  • Clinical network separated from the guest network, both password-protected

Remote Access

  • VPN required for all remote connections to practice systems
  • Policy covering staff use of public Wi-Fi

Data in Transit

  • TLS 1.2 or higher on every system that transmits ePHI
  • TLS 1.0, TLS 1.1, and SSL disabled
  • HTTPS on every web-based system that handles patient data

Every unchecked box on this list is a gap in your security posture — and a potential finding in an OCR investigation. The good news is that most of these items are straightforward to implement, and many are free. The hardest part is doing the inventory and making sure nothing gets missed.



Need help getting your encryption and security controls in order? One Guy Consulting offers affordable HIPAA compliance packages for practices of all sizes. One Guy Consulting HIPAA services