The Importance of Cybersecurity in Healthcare with AI and Connected Devices


Key Takeaways on the Importance of Cybersecurity in Healthcare

  • AI and connected devices expand cyber risk — every system that processes patient data creates potential entry points for attacks.
  • Vendor security practices form the foundation of healthcare cybersecurity — HIPAA compliance, encryption, certifications, and BAAs help protect PHI when using AI tools.
  • Staff access controls and device policies play a critical role in security — permissions-based access, MFA, and managed devices reduce risk from phishing and user error.
  • Organizational policies and preparedness strengthen cybersecurity resilience — clear data governance, audit logs, incident response plans, and external support help manage AI-related threats.

Connected devices, as part of connected care and AI, are now common across the healthcare industry in both urban and rural areas. But they open new windows for security attacks. This article breaks down the importance of cybersecurity in healthcare today with the rise of AI and connected devices.

AI can play two different roles in your healthcare organization. You can use stand-alone AI tools or integrate tools with your EHR or imaging machines to add efficiency. Staff can also access AI tools on their own: think ChatGPT or Gemini.

AI, Connected Devices, and Cybersecurity in Healthcare

AI can process a lot of patient data. But every AI touch point can become an entry point for attacks on sensitive patient data. And attacks have consequences: 84% of ransomware attacks on rural hospitals from 2016 to 2021 caused serious operational disruption, including system downtime, scheduling interruptions, and more. Add connected tablets, notebooks, and phones, and security gets even more complex.

Where typical breaches often involve stealing data, AI can let attackers do even more, from automating attacks targeted at staff to cracking passwords to impersonating real users. It can even corrupt data through what's known as model poisoning: attackers tamper with training data so a model delivers false results, which can affect patient health and safety, billing accuracy, and more.

And hackers and scammers often see smaller practices, rural hospitals, and clinics as easy targets.

But data security challenges from AI and connected devices are manageable. Azalea Director of Infrastructure and Security Hector Valera shares what to know and what you can do to prepare for and tackle cybersecurity challenges in the age of AI and connected devices.

The Importance of Cybersecurity in Healthcare Starts with Your Vendor

AI systems can process a lot of patient data, especially when looking for trends across a database. Keeping data safe starts with ensuring your AI tool or vendor is compliant and doing its own due diligence against potential threats. Hector suggests:

  • Make sure your vendor’s technology and processes are HIPAA-compliant. That’s true even if you’re letting staff use ChatGPT or Gemini (more below). 
  • Use encrypted data storage and transmission. Whether you store data onsite or transmit it offsite, use modern encryption and a secure web connection. Ask vendors about their at-rest and in-transit encryption practices. Some cloud services, including Azalea Health, let you manage your own encryption keys, meaning only you can unlock your data; even the cloud provider can't.
  • Make sure your vendor has security certifications, specifically SOC 2, HITRUST, and ONC, and that they update, monitor and validate their model(s) regularly to avoid model poisoning.
  • Make sure to have a business associate agreement (BAA) in place with any AI vendor whose tool interacts with PHI. That's true even if you let staff use PHI in ChatGPT or another solution. Getting a BAA with ChatGPT or another vendor requires added steps — and costs — beyond the normal monthly subscription. Be sure to ask what's needed.

Ensure Security at the Staff and Device Level

Hector says that one of the best ways to keep patients and your facility safe when using AI, and in general, is to stay in control. Limit access to data to the bare minimum. That’s true for users and systems. 

Use only AI and other tools that include permissions-based access controls. For example, a nurse doesn't need to see a patient's billing information, and front-office staff don't need to see a patient's medical history. Your solution should let you control who can see which data and what they can do with it. Your vendor should be able to help set this up or, at the very least, train you to do it.

Many breaches result from user error or phishing attacks. Don't let staff and providers access hospital tools from their own devices without controls in place, such as secure vendor apps, multifactor authentication (MFA), or mobile device management software run by your IT team. Only let people who need it access facility tools on their own devices. And make sure they understand that if they use their own ChatGPT or other tool, they can't put PHI in it.

Also make sure staff and providers understand the risks and why it's important to protect data. Make sure they're aware of ransomware and phishing threats that might come through their work and personal devices. The AHA, CISA, HHS 405(d), and others offer free security training resources, and Microsoft offers a free Cybersecurity Program for Rural Hospitals.

Mind Your Organizational Ps and Qs

Along with training staff and setting rigid controls, have organizational best practices in place. Hector breaks best practices into five key areas:

  1. Define your regulatory and liability issues — clearly establish who owns a failure: you or the AI vendor. If it's the vendor, make sure that responsibility is outlined in your contract. Also make sure you and your vendor stay current on evolving regulations around healthcare and the use of AI, such as those set by NIST and HHS. Note that the HIPAA Security Rule may change in 2026 to include stricter cybersecurity requirements; the proposal mandates MFA, encryption, enhanced BAAs, annual audits and self-assessments, and more.
  2. Document data governance — write and maintain a detailed policy on who owns, accesses, and manages data used by your systems, devices, staff, and AI tools. 
  3. Use audit logs or ensure your AI vendor does — ensure all interactions that your AI tools, staff, and providers have with PHI are logged and can be reviewed whenever needed.
  4. Create an incident response plan that includes AI-related threats, and test your response to a potential breach or malfunction against your plan so you're prepared. As part of your plan, include how you'll notify patients about a breach and the steps you'll take. Share this plan on your website or in your privacy policy.
  5. Have cyber liability insurance as part of your overall security strategy. Insurance can help cover you and your patients after breaches or ransomware attacks and cover your regulatory fines, legal costs, and more.

Manage the Challenge of Cybersecurity in Healthcare

This all sounds like a lot, and it can be, especially for ambulatory clinics and rural hospitals with single-person or small IT teams. A few ideas Hector shares to make preparing for and managing AI-based cyberthreats easier:

  • Prioritize cybersecurity basics, such as patching and backing up local hardware and devices, using MFA for all logins, and training staff and providers to recognize attacks.
  • Lean on technology grants for rural and small providers. Google offers its Rural Healthcare Cybersecurity Initiative, and Microsoft offers its Rural Health Resiliency Program for rural hospitals. Your state Office of Rural Health (SORH) may be able to point you to more options as well.
  • Consider turning to a managed security service provider (MSSP) or regional health IT coalition for support. Ask your HIMSS chapter, regional extension center, state or local HIE, or state department of health for possible options and organizations.

Azalea Understands the Importance of Cybersecurity

The Azalea Hospital EHR and Ambulatory EHR undergo annual third-party security audits, including SOC 2 and HITRUST certification reviews. Azalea also takes multiple steps to ensure HIPAA compliance, including access controls, encryption, firewalls, regular security patches, audit trails, and a standing business associate agreement (BAA).

And Azalea AI Clinical Assistant and AI Billing Assistant both operate safely inside the Azalea EHR, so they use the same protections. 

Get More Like This in The Definitive Guide to Rural Healthcare