Sarah Anderson

Finding Room for Artificial Intelligence Within HIPAA


Artificial intelligence (AI) is transforming medicine. According to two recent studies published in Nature Medicine and Cell, AI learned to accurately read lung CT scans as a quick diagnostic tool for detecting COVID-19 infection in patients. According to the Nature Medicine study, the AI system also “improved the detection of patients who were positive for COVID-19… after radiologists classified all of these patients as COVID-19 negative.”


The Health Insurance Portability and Accountability Act of 1996 (HIPAA) was enacted twenty years before artificial intelligence was meaningfully combined with medicine. Accordingly, and in many ways, AI and HIPAA can be unintentionally at odds with each other.


Well-meaning AI developers seek to provide healthcare professionals and their patients with faster and more accurate medical diagnoses, as well as less expensive procedures and diagnostic testing. However, to accomplish this mission, AI developers must have access to large quantities of data.


Machine learning and deep learning algorithms require large quantities of images, videos, or other real-world data to learn to find patterns. In healthcare, these images, videos, and laboratory data are often considered “protected health information” (PHI) or its electronic equivalent (ePHI), which HIPAA protects from unauthorized disclosure to third parties. Regardless of their intent or the public benefit of their products, developers of medical AI struggle to access the data required to build the necessary algorithms.


At this time, available solutions for AI developers to obtain the necessary quantities of PHI/ePHI are limited:


  1. De-identify the Data & Execute a Business Associate Agreement: Data may be shared with AI developers absent specific patient authorization and consent if the data can be anonymized or de-identified – meaning it can no longer be traced back to the individual it describes. The developer must execute a “business associate” agreement with the “Covered Entity” (hospital, physician, health plan, or healthcare provider), in which the developer agrees to abide by the HIPAA Security and Privacy Rules to protect the PHI. Still, the Covered Entity may limit the developer’s use of the PHI under the business associate agreement as it sees fit. (A de-identification sketch in code follows this list.)

  2. Use Written Authorizations: The HIPAA Privacy Rule requires an individual’s written authorization for any use of PHI not otherwise expressly permitted or required by the Rule. For example, authorizations are generally not required to disclose PHI between treating physicians of the same patient, because the Covered Entity is permitted to use and disclose PHI for that specific purpose (a radiologist sharing CT results with an oncologist). However, a Covered Entity’s sharing of data with software developers to train new technology to spot early signs of malignancies is not a permitted disclosure under the Privacy Rule and requires a separate authorization from each individual whose data the developer seeks. Because obtaining those authorizations is often a herculean task, some organizations that partner with hospitals and healthcare centers include such authorizations in their standard registration paperwork to ensure all patients lawfully agree to donate their anonymized data to the AI developer.

  3. Consistent ePHI Access for Product Support: To support ongoing refinement of algorithms supporting patient health, developers may require consistent access to ePHI on a Covered Entity’s network. Again, such access will likely require a business associate agreement, along with the additional HIPAA Security Rule requirements specific to the confidentiality, integrity, and availability of ePHI. “Confidentiality” means ensuring that ePHI is not disclosed without proper patient authorizations in place. “Integrity” requires that ePHI not be accessed, altered, or destroyed except by appropriate and authorized parties (see the second sketch after this list), and the “availability” requirement ensures that patients can consistently access their ePHI (a right also protected in the Cures Act Final Rule).
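
By way of illustration, here is a minimal sketch of what Safe Harbor-style de-identification (option 1 above) can look like in code. The record layout and field names are hypothetical, and the identifier list is abbreviated – the Safe Harbor method actually enumerates eighteen identifiers – so this is a sketch, not a compliance tool.

```python
# Minimal Safe Harbor-style de-identification sketch.
# Field names are hypothetical; real EHR exports will differ, and this
# covers only a few of the eighteen Safe Harbor identifiers.

DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "email",
    "ssn", "medical_record_number",
}

def deidentify(record: dict) -> dict:
    """Drop direct identifiers and coarsen quasi-identifiers."""
    clean = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Dates more specific than the year must go.
    if "birth_date" in clean:
        clean["birth_year"] = clean.pop("birth_date")[:4]
    # Ages over 89 are aggregated into a single "90 or older" category.
    if isinstance(clean.get("age"), int) and clean["age"] > 89:
        clean["age"] = "90+"
    # Keep only the first three ZIP digits (and Safe Harbor permits even
    # that only where the three-digit area is sufficiently populous).
    if "zip" in clean:
        clean["zip"] = clean["zip"][:3] + "00"
    return clean

record = {
    "name": "Jane Doe", "ssn": "123-45-6789", "birth_date": "1932-04-01",
    "age": 91, "zip": "30301", "ct_scan_id": "scan-0042",
}
print(deidentify(record))
# {'age': '90+', 'zip': '30300', 'ct_scan_id': 'scan-0042', 'birth_year': '1932'}
```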
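
As a second sketch, one common way to support the Security Rule’s “integrity” requirement (option 3 above) is to record a cryptographic hash of ePHI at ingest and verify it before use, so unauthorized alteration is detectable. The file name below is hypothetical, and this is one possible approach rather than a prescribed method.

```python
# Minimal integrity-check sketch: hash ePHI at ingest, verify before use.
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify(path: Path, expected_digest: str) -> bool:
    """True if the file still matches the digest recorded at ingest."""
    return fingerprint(path) == expected_digest

scan = Path("ct_scan_0042.dcm")        # hypothetical ePHI file
digest_at_ingest = fingerprint(scan)   # stored alongside audit logs
# ... later, before an algorithm consumes the scan ...
if not verify(scan, digest_at_ingest):
    raise RuntimeError("ePHI was altered outside authorized channels")
```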


The Health Information Technology for Economic and Clinical Health (HITECH) Act, enacted in 2009, was intended to promote and expand the adoption of health information technology, specifically the use of electronic health records (EHRs) by healthcare providers. Unfortunately, HITECH potentially made HIPAA more punitive against AI developers.


HITECH called for mandatory penalties against HIPAA-covered entities and business associates that willfully neglect the HIPAA Rules, with a maximum penalty of $1.5 million per calendar year for violations of an identical provision. The Department of Health and Human Services (“DHHS”) was given the authority to assess the alleged violator’s level of knowledge of the HIPAA Rules in deciding whether the conduct constituted “willful neglect.”


Furthermore, the HITECH Act created a new HIPAA Breach Notification Rule, under which covered entities must notify affected individuals within sixty days of discovering a breach of PHI. Breaches of 500 or more records must also be reported to the DHHS within sixty days of discovery, while smaller breaches must be reported to the DHHS within sixty days of the end of the calendar year in which the breach occurred. The HITECH Act also permits the DHHS Office for Civil Rights to publish a summary of healthcare data breaches reported by HIPAA-covered entities and their business associates – a list often referred to as the “Wall of Shame.” A sketch of the notification deadlines follows.
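
To make the deadlines concrete, here is a minimal sketch of the notification clock as described above, assuming for simplicity that the breach occurred in the same calendar year it was discovered.

```python
# Minimal sketch of HITECH breach-notification deadlines.
from datetime import date, timedelta

def notification_deadlines(discovered: date, records_affected: int) -> dict:
    """Deadlines for notifying affected individuals and the DHHS."""
    # Affected individuals: within 60 days of discovery, regardless of size.
    individuals = discovered + timedelta(days=60)
    if records_affected >= 500:
        # Large breaches: DHHS within 60 days of discovery.
        dhhs = discovered + timedelta(days=60)
    else:
        # Smaller breaches: within 60 days of the end of the calendar year
        # (assuming the breach occurred in the year it was discovered).
        dhhs = date(discovered.year, 12, 31) + timedelta(days=60)
    return {"individuals": individuals, "dhhs": dhhs}

print(notification_deadlines(date(2021, 3, 15), records_affected=120))
# {'individuals': datetime.date(2021, 5, 14), 'dhhs': datetime.date(2022, 3, 1)}
```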


Between the steep penalties, the public “Wall of Shame,” and the breach notification requirements (which are particularly difficult to satisfy for anonymized data), HIPAA and the HITECH Act can inadvertently discourage AI innovation in healthcare.
