AI Productivity Tools and PHI

Jan. 19, 2024

With the rise of artificial intelligence (AI) productivity tools, COM ITS is also seeing a rise in requests to install them.  As an officially designated Healthcare Component (HCC), COM-T and UAHS are required to follow federal HIPAA regulations.  Among HIPAA's many requirements is conducting a risk assessment on any software that stores, processes, transmits, or accesses ePHI, and having a Business Associate Agreement (BAA) in place with the vendor.  This includes AI productivity tools.  A HIPAA third-party risk assessment is a detailed analysis of the product's and vendor's security procedures and controls, and it demonstrates the UA's due diligence in ensuring, with reasonable certainty, that the product and vendor satisfy the data protection requirements of the HIPAA Security Rule.

Many AI tools' terms of service allow the vendor to access, store, and even share the user's data with other partners/platforms in order to provide additional capabilities and to build on AI and machine learning (ML) technologies at partner organizations.  In practice, this usually means the vendor downloads, stores, and processes the user's data to provide the expected functionality, or offloads some of that data to its partners for similar purposes.

The University's HIPAA Privacy Program (HPP) Office has not conducted any risk assessments of AI products as of January 2024.  This includes technologies such as Microsoft Copilot.

The UA Chief Privacy Officer is aware of the increasing usage of AI products across campus.  Some guidance on the use of AI is available on the UA Artificial Intelligence website: https://artificialintelligence.arizona.edu. The Resources and FAQ page contains the current general information and guidance.


Of note on this page is the following FAQ: 

How are federal agencies reacting to the use of generative AI in research proposal review?  

Currently, the National Institutes of Health (NIH), among other federal agencies, does not allow the use of generative AI for grant proposals or peer reviews. Researchers are urged to stay updated on federal agencies' current policies regarding the use of generative AI. This ban hinges on the confidential nature of the peer review process; uploading proposal information to a generative AI tool violates that confidentiality. Other federal agencies are expected to follow suit.


As the University begins to formalize the AI Access & Integrity Working Group, Steering Committee, and Working Groups, additional guidance is expected.  For now, most AI tools will not be authorized for installation due to the data privacy requirements and the required risk assessments, and the HPP Office is the lead on all AI-related HIPAA assessments.