How data privacy impacts your copay programs
Legal experts talk privacy laws and considerations for healthcare data and copay models.
Privacy laws are complicated, and there are tons of them in the US that apply to copay programs, according to Jami Vibbert, chair, privacy and cyber, at Arnold & Porter.
“There are general laws that get applied to privacy. The FTC enforces its unfair and deceptive acts and practices laws in the privacy and data security space,” she told the audience at the Copay, Reimbursement and Access Congress in Philadelphia. In addition to those general laws, there are sector-specific privacy laws such as HIPAA and COPPA, along with their state-level counterparts.
“And then lucky us, we now have 19 different states that have broad consumer privacy laws that will apply to your copay program,” Vibbert said, adding that more states would join the list by the end of 2024.
Jon Westlund, senior privacy counsel at Pfizer, noted that just three years ago these state-level privacy laws did not exist.
“I think the big picture for this is things are changing especially in the United States at a rather rapid rate,” he said.
HIPAA
Westlund spoke about HIPAA and how widely the law is misunderstood, explaining that it is an insurance discrimination law that applies to healthcare providers, and their vendors, who bill insurance.
“It was never intended to be a privacy law. The privacy protections in HIPAA were intended so that insurance couldn’t discriminate against you, and that’s why it operates differently than any other privacy law that I've ever seen anywhere across the globe.”
To fill in the gaps left by HIPAA, states have enacted their own privacy laws, with provisions such as explicit consent, data transparency, and individual rights, he explained.
Granular consent: The new standard
Companies need clear consent to collect user data necessary for administering copay programs.
State laws are shifting away from the “all or nothing” approach when it comes to data processing, according to Westlund, moving toward granular consent.
Regulations now demand specific individual consent for each action involving user data, such as collection, sharing, and marketing communication.
“If you want to process sensitive data, you must take some type of affirmative act. You can't have a pre-check box,” he said.
“The person has to go in there and actually do something, like ‘I read this, I understand this.’ Whether or not they actually read it or understand it, that’s on the individual user, but it’s on us, at least industry-wide as the business, to make sure that that person’s choice is respected.” This ties into privacy rights: the person has the right to know what information is held about them and the ability to request that the data be deleted or erased.
“And on the privacy rights, it’s important to understand what those rights are when you’re building your data repository,” Westlund said.
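To make those ideas concrete, here is a minimal sketch of what a per-purpose consent record might look like in code. Everything in it, the purpose names, the fields, the class itself, is hypothetical rather than drawn from any specific program; it simply illustrates the points above: nothing is pre-checked, each opt-in is a separate affirmative act, and the record can be exported to honor the right to know.

```python
# Hypothetical sketch of a granular consent record, assuming the
# per-purpose, affirmative-action model described above.
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative purposes only; a real program would list the processing
# activities disclosed in its own privacy notice.
PURPOSES = ("program_administration", "third_party_sharing", "marketing")


@dataclass
class ConsentRecord:
    user_id: str
    # No pre-checked boxes: every purpose starts as not consented.
    granted: dict = field(
        default_factory=lambda: {p: False for p in PURPOSES}
    )
    granted_at: dict = field(default_factory=dict)

    def record_opt_in(self, purpose: str) -> None:
        """Record one affirmative act by the user for one purpose."""
        if purpose not in PURPOSES:
            raise ValueError(f"unknown purpose: {purpose}")
        self.granted[purpose] = True
        self.granted_at[purpose] = datetime.now(timezone.utc)

    def export(self) -> dict:
        """Right to know: report what consent the program holds."""
        return {"user_id": self.user_id, "granted": dict(self.granted)}


# A user who opts in to marketing has not thereby consented to sharing.
record = ConsentRecord(user_id="u-123")
record.record_opt_in("marketing")
assert record.granted["third_party_sharing"] is False
```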
Consent is generally not required for sharing data with vendors who are helping provide the programs, he noted.
However, “if you have a collaboration partner, you’re pairing a device with another device, or you have a diagnostic, you have some sort of a partner, you probably need consent to disclose to that third party.”
He also noted that consent shouldn’t last forever. Westlund suggested a one-year consent period as a reasonable starting point. “Two years, you’re probably OK. Three years, you’ve probably tipped it,” he cautioned, noting that regulators scrutinize companies that collect and store user data indefinitely.
“That’s the mentality that the regulators don't want to see. They want to see that this consent is only good for the next 30 days,” he said.
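A sketch of how that lifetime might be enforced appears below, using the one-year starting point Westlund suggested; the 365-day figure and the function name are illustrative assumptions, not a legal standard.

```python
# Hypothetical consent-expiry check, assuming a one-year lifetime.
from datetime import datetime, timedelta, timezone

CONSENT_LIFETIME = timedelta(days=365)  # illustrative starting point


def consent_is_current(granted_at: datetime) -> bool:
    """Treat consent older than the configured lifetime as expired."""
    return datetime.now(timezone.utc) - granted_at < CONSENT_LIFETIME
```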
The role of AI
When using AI or analytic tools with copay data, there may be additional factors to consider, according to Vibbert.
She said only one state currently has an AI rule, but globally, “there’s more AI regulation … Whether enforcement will still happen at the federal level is unsure, but it will definitely happen at the state level.
“So you need to be a little careful about your use of AI with these data sets,” she continued. “There’s a lot of potential, but there’s risk, and that risk is manageable in the same way that privacy risk is handled.”
Some of the considerations Vibbert noted include:
• The need to provide notice
• The need to put specific contractual provisions and protections in place with the AI or analytics vendor
• The need to conduct certain governance activities
Vibbert also emphasized the importance of transparency: organizations should inform users, offer opt-out options, and potentially obtain separate consent.
“You got to tell people. Nobody’s going to read it, but you got to tell them anyway,” she said.
From a privacy perspective, Vibbert said, “De-identification and anonymization of personal data prior to ingesting it in the AI is going to be the key to reducing risk and facilitating additional use without consent.”
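As a loose illustration, removing direct identifiers before records reach an AI pipeline might look like the sketch below. The field names are hypothetical, and dropping a handful of fields is not, on its own, formal de-identification; HIPAA’s Safe Harbor method, for example, enumerates 18 categories of identifiers that must be removed.

```python
# Hypothetical pre-ingestion scrub: drop direct identifiers before a
# record is passed to an AI or analytics tool. Field names are assumed.
DIRECT_IDENTIFIERS = {"name", "email", "phone", "address", "member_id"}


def strip_identifiers(record: dict) -> dict:
    """Return a copy of the record without direct identifiers."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}


claims = [{"name": "Jane Doe", "member_id": "M-001", "copay": 25.0}]
deidentified = [strip_identifiers(r) for r in claims]  # [{"copay": 25.0}]
```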
Quotes have been lightly edited for clarity.