The promise of big data may be more critical than ever with the COVID-19 scourge ravaging the globe, but collecting those data requires the cooperation of ordinary citizens and their smart devices. A recent Senate hearing raised the question of whether privacy and confidentiality are at risk when software is installed on these devices for disease surveillance purposes, but there may be no absolute guarantee of confidentiality, jeopardizing the goodwill of citizens who are wary of big government.
Americans split on value of tracking
A poll conducted by the Pew Research Center suggests that Americans are less than overwhelmingly optimistic that tracking positive diagnoses via cell phones will do much to curb the pandemic. Roughly 60% of respondents said such tracking would not make much of a difference, while only 16% believed it would make a substantial difference.
When asked whether it is acceptable for government to conduct such tracking, respondents were even more sharply split, with 48% finding such tracking unacceptable and 52% finding it acceptable. Acceptance ebbs even further for tracking of those who have not been diagnosed: when respondents were asked whether tracking is acceptable merely for ensuring that citizens are complying with social distancing guidelines, 62% found it unacceptable.
An April 9 hearing of the Senate Commerce Committee took up the issue of the role of big data in the fight against the SARS-CoV-2 virus, and was appropriately conducted as a paper-only hearing. The witnesses offered their opening statements and responses in a written Q&A format, with Sen. Roger Wicker (R-Miss.) noting that Congress has authorized the Centers for Disease Control and Prevention to develop “a modern data surveillance and analytics system” to keep tabs on the proliferation of the virus.
Wicker pointed out that the private sector is already using mobile data to track the spread of the pandemic in the U.S., and that European Union nations have likewise expressed an interest in such electronic tracking. While the protections provided by the EU’s General Data Protection Regulation (GDPR) do not apply to anonymized data, EU officials are said to have committed to deleting all such data after the crisis has passed.
Still, Wicker said, uniform national privacy legislation is needed for matters not governed by the Health Insurance Portability and Accountability Act (HIPAA), although he stopped short of citing California as a state that has passed legislation dealing with these questions. The California Consumer Privacy Act was enacted in 2018 and took effect in January 2020.
Graham Dufault, senior director of public policy for ACT | The App Association, said in response to the committee’s queries that contact tracing does indeed present some privacy challenges, in part because the device that carries the tracking application can be associated with someone who has been diagnosed as COVID-positive. Dufault said the Massachusetts Institute of Technology (MIT) has developed a Bluetooth-enabled system that would cause the cell phone of a person with a positive diagnosis to send a “chirp” to the likewise-enabled phones of others who come within a certain proximity of that test-positive user.
There is a hazard that geolocation data generated in this way could be abused by those in government, but federated learning and differential privacy are two approaches that could be deployed to ensure the user remains anonymous, Dufault said. In his written testimony, he said that some of these data collection and usage questions fall outside the borders of HIPAA, suggesting a need for national legislation, particularly given that state law is already moving ahead.
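To illustrate the differential privacy approach Dufault mentioned, the sketch below adds calibrated Laplace noise to an aggregate count (say, the number of app users in a neighborhood) before it is released, so that no individual’s presence can be inferred from the published figure. The function names and the epsilon value are hypothetical illustrations, not part of any system discussed at the hearing; real deployments involve considerably more machinery.

```python
import random

def laplace_noise(scale: float) -> float:
    # The difference of two i.i.d. exponential draws is Laplace-distributed
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def noisy_count(true_count: int, epsilon: float = 0.5) -> float:
    """Release a count under epsilon-differential privacy.

    A counting query changes by at most 1 when one person is added or
    removed (sensitivity 1), so Laplace noise with scale 1/epsilon
    suffices. Smaller epsilon means stronger privacy and noisier output.
    """
    return true_count + laplace_noise(1.0 / epsilon)
```

Agencies would publish only `noisy_count(...)` values; repeated queries consume privacy budget, which is why the epsilon parameter matters in practice.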
Dufault told BioWorld that data collected in this manner will probably be of much use only with both large volumes and participation by a significant portion of the population. The underlying task is to generate “a good sense of how quickly positive cases are spreading,” he said, adding that the task of enrolling sufficient numbers “is going to be a huge challenge.”
Skepticism, but justification
“I think you’re right, that there is a skepticism of government tracking of citizens’ movements,” Dufault said, but he pointed out that emergency situations have prompted Congress to pass legislation, such as the Patriot Act, that U.S. citizens more or less accepted without an overwhelming level of controversy. “People are willing to make allowances in terms of giving government access to certain pieces of information” when circumstances warrant, Dufault said, but this requires measures to ensure those data do not remain permanently available when their collection was authorized only for an emergency.
Dufault said there is something of an unspoken contract between consumers and developers of software used on computers and mobile devices. “There’s a general rule that the consumer is going to get something in return” for the data, a rule of thumb that applies in all societies, he said, though this transactional expectation may be greater in the U.S. than in other nations.
Dufault said there are two different types of apps under discussion: a location tracking app and a Bluetooth-enabled app such as the one developed by MIT. The location tracking app might keep location data on the device and transmit it in an anonymized manner. In this scenario, the user’s phone might ping other phones to let them know that someone else nearby also uses the app. Whether this includes a user’s diagnostic status depends on the developer’s objectives. Conversely, the type of Bluetooth app developed by MIT might enable the user to determine the duration of their proximity to another user who has tested positive, but again, this is up to the developer of the app.
Even in the case of the Bluetooth app, signal attenuation – the user’s body and the walls of buildings are two sources – may degrade the signal and consequently undermine the duration-of-proximity function.
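The duration-of-proximity idea described above can be sketched as follows: a phone logs anonymous “chirps” it hears, discards those too weak to indicate close contact (a crude proxy for the attenuation problem), and sums time spent near a given peer. The field names, thresholds, and overall logic here are hypothetical simplifications for illustration, not MIT’s actual protocol.

```python
from dataclasses import dataclass

@dataclass
class Chirp:
    peer_id: str      # rotating anonymous identifier, not the user's identity
    timestamp: float  # seconds since epoch when the chirp was heard
    rssi: int         # received signal strength in dBm

RSSI_THRESHOLD = -70   # hypothetical cutoff approximating "within a few meters"
MAX_GAP = 30.0         # seconds of silence that ends an exposure window

def exposure_seconds(chirps: list[Chirp], peer_id: str) -> float:
    """Total time spent near one peer, counting only strong-signal chirps
    and only gaps short enough to be one continuous encounter."""
    times = sorted(c.timestamp for c in chirps
                   if c.peer_id == peer_id and c.rssi >= RSSI_THRESHOLD)
    total, prev = 0.0, None
    for t in times:
        if prev is not None and t - prev <= MAX_GAP:
            total += t - prev
        prev = t
    return total
```

A phone whose owner later tests positive could compare accumulated exposure times against a notification threshold; the RSSI cutoff is exactly where body and wall attenuation would scramble the estimate, as the article notes.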
Dufault said Apple and Google have both decided to limit developer access to the application programming interfaces on their phones to health care systems and government agencies in a public health setting. This, he said, is motivated in part by an interest in limiting the risk of compromised privacy/confidentiality, but Apple and Google may also want to avoid presenting the consumer with an unwieldy number of options. “You want to be able to present them with a trusted source,” rather than include every developer who can slap together an app, he said.
Still, Dufault cautioned, “you’re going to have to make people aware of the app, because you don’t want a situation where the only people who use the app are those who have tested positive.” He said, “It will be interesting to see how that’s carried out,” but it will, at a minimum, require some sort of prompt for people to download the app even in the absence of a positive diagnosis.
Not informed consent, but clarity about data to be gathered
Dufault said the process of downloading the app will never be confused with the type of consent given for participation in clinical trials, but the download process should be explicit about the nature of the information that will be collected. This is one of the reasons the group has emphasized the need for legislation that gives the Federal Trade Commission more specific authority to handle non-HIPAA privacy breach questions.
“What we’re encountering is that there is an increasing complexity in the Internet ecosystem” in terms of both data processing and data sharing relations between companies, Dufault said. More and more data are generated for health care in a manner that is not addressed by HIPAA, but the prospect of multiple state regulatory requirements is a source of concern for developers.
“We’re supportive of a provision that expressly states that provisions of federal law preempt provisions of state law,” Dufault said, and while that preemptive function cannot be absolute, it should suffice to prevent the states from overriding FTC’s enforcement work. “You just need that level of certainty that what you’re doing as a small company is compliant with privacy requirements” in all 50 states, he said.