ChatGPT, an AI-powered chatbot that can converse naturally, is undoubtedly a powerful productivity tool. But while the software has taken several industries by storm, it’s important to know if it can be used to handle protected health information (PHI) in compliance with HIPAA regulations.
Let’s discuss ChatGPT and HIPAA compliance in depth.
Why HIPAA Compliance Matters for Chatbots
Ever since an early demo of ChatGPT launched on November 30, 2022, nearly every update to and controversy surrounding the AI tool has made the news. Fears have arisen that artificial intelligence may soon replace doctors, so much so that the issue merited a study in Digital Health entitled Artificial Intelligence in Healthcare: Complementing, not replacing, doctors and healthcare providers.
Still, there’s no denying that ChatGPT is revolutionary and could be the next major transformative technology after the Internet. It has already reshaped workflows across industries, increasing efficiency and productivity. Forbes lists several important uses of the AI-powered chatbot in healthcare, including virtual assistance, clinical decision support, medical translation, and disease surveillance.
However, all healthcare professionals must exercise caution when using the AI chatbot. As the University of Southern California warns, doctors who enter electronic PHI into ChatGPT may unknowingly violate HIPAA.
Add to that the large-scale compromise of ChatGPT accounts between June 2022 and May 2023: as Search Engine Journal reported, more than 100,000 accounts were compromised and sold on the dark web. OpenAI, the developer of ChatGPT, addressed the incident, but you should still be careful and learn all you can about the tool’s HIPAA compliance before using it in your practice.
Is ChatGPT HIPAA Compliant?
Yes, ChatGPT can be used in a HIPAA-compliant way, but with one important caveat: you must secure a Business Associate Agreement (BAA) with OpenAI, which is offered for ChatGPT Enterprise. Unfortunately, this is where it gets tricky.
According to the ChatGPT Enterprise FAQs, OpenAI can sign a BAA in support of HIPAA compliance. However, several users have complained in OpenAI’s community discussions that they cannot get hold of the sales team to obtain a BAA (see: Help with OpenAI HIPAA Compliance and HIPAA consent).
ChatGPT Enterprise has several privacy measures supporting HIPAA compliance, such as zero data retention (ZDR), limited and controlled access, enterprise-level authentication, and data encryption at rest (AES-256) and in transit (TLS 1.2). ChatGPT Enterprise and the API also don’t use any of your data to train the underlying models. But without a BAA, this discussion is moot: you cannot achieve HIPAA compliance with ChatGPT if OpenAI does not sign one.
Using ChatGPT for HIPAA Compliance Without a BAA
Unfortunately, some users cannot secure a BAA from OpenAI’s sales team. Is there a way to use ChatGPT in a HIPAA-compliant manner even without one? There are a few workarounds, but note that each comes with limitations:
Look for a HIPAA-compliant third-party provider that hosts OpenAI’s models
Some HIPAA-compliant third-party services host the same GPT models that power ChatGPT. Try getting a BAA from these providers instead. For instance, EricGT posted in the OpenAI forum that he was able to get a BAA for a Microsoft Azure AI Studio account that uses GPT-4 (a sketch of what calling such a deployment looks like follows below).
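If you go this route, the client code is nearly identical to calling OpenAI directly; only the endpoint, credentials, and deployment name change, and the BAA in this scenario is with the hosting provider (here, Microsoft) rather than OpenAI. Below is a minimal, hypothetical sketch using the official openai Python package against an Azure OpenAI resource; the endpoint, deployment name, and API version are placeholders you would replace with the values from your own resource.

```python
# Hypothetical sketch: calling a GPT-4 deployment hosted on Azure OpenAI
# (covered by a BAA with Microsoft) instead of OpenAI's own endpoint.
# The endpoint, deployment name, and API version below are placeholders.
import os

from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<your-resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",  # use the API version your resource supports
)

response = client.chat.completions.create(
    model="gpt-4",  # the *deployment* name you created in Azure, not just the model family
    messages=[
        {"role": "system", "content": "You draft patient-facing education material."},
        {"role": "user", "content": "Explain post-operative wound care in plain language."},
    ],
)
print(response.choices[0].message.content)
```

Even with a BAA from the hosting provider, you remain responsible for configuring the service itself (access controls, logging, data retention) in line with HIPAA.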
Avoid entering any form of electronic PHI
Refrain from entering data that could be classified as protected health information. This includes patient names, billing information, email addresses, lab results, telemedicine consultation records, and so on. You can use data anonymization techniques to protect this information before it ever reaches ChatGPT (see the sketch below). However, without complete input, you may not always get the most accurate responses from the AI.
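As a rough illustration of what basic anonymization can look like, the hypothetical Python sketch below strips a few obvious identifiers (emails, phone numbers, dates, and a made-up MRN format) with regular expressions before any text would be sent to a chatbot. It is deliberately simplistic: HIPAA’s Safe Harbor de-identification standard covers 18 categories of identifiers, including names and addresses, and catching those reliably generally requires NER tooling or expert determination, not regexes alone.

```python
# Hypothetical sketch of scrubbing obvious identifiers from text before it is
# sent to a chatbot. These simple regexes catch emails, phone numbers, dates,
# and one made-up MRN format; they are NOT a substitute for full HIPAA Safe
# Harbor de-identification.
import re

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "phone": re.compile(r"(?:\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}\b"),
    "date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "mrn": re.compile(r"\bMRN[:\s]*\d+\b", re.IGNORECASE),
}

def scrub(text: str) -> str:
    """Replace matched identifiers with bracketed placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label.upper()} REDACTED]", text)
    return text

note = "Patient seen on 03/14/2024, MRN 448291, call back at (555) 123-4567 or jane.doe@example.com."
print(scrub(note))
# -> Patient seen on [DATE REDACTED], [MRN REDACTED], call back at [PHONE REDACTED] or [EMAIL REDACTED].
```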
Follow HIPAA protocols in your healthcare practice
Adhering to HIPAA rules also entails implementing all mandated administrative, physical, and technical safeguards. You cannot achieve compliance just by using HIPAA-compliant tools or signing a BAA; you also have to enforce strict data privacy and security measures according to the guidelines set by federal law.
ChatGPT Alternatives for Ensuring HIPAA Compliance
The popular AI chatbot supports HIPAA compliance for Enterprise and API customers. However, some users report difficulty securing a BAA with OpenAI. Without this legal agreement, you cannot and should not use ChatGPT to store or handle any PHI. If a data breach occurs and you don’t have a BAA in place, you expose your practice to far greater legal risk.
Protecting a patient’s private data should take priority over efficiency or productivity in healthcare. Ideally, you should be able to provide both. But if you have to choose, choose data privacy first.
You can also explore chatbot alternatives to ChatGPT that provide HIPAA compliance and respond promptly to requests for a business associate agreement. Remember, it’s always better to err on the side of caution than to risk exposing your patients’ data to cybercriminals and suffering the harsh penalties of a HIPAA violation.