Report cites "inexcusable" failings in how UK health data was handled in DeepMind-NHS partnership

A report published Thursday in Health and Technology examining a partnership between Google's DeepMind subsidiary and the Royal Free London NHS Foundation Trust said "the failure on both sides to engage in any conversation with patients" regarding transparency and oversight when handling sensitive medical information "is inexcusable." The authors noted that while the collaboration was initially "received with great enthusiasm, [it] has suffered from a lack of clarity and openness, with issues of privacy and power emerging as potent challenges as the project has unfolded."

DeepMind first made the partnership public in February 2016, saying it was working with the NHS to build the Streams smartphone app to help clinicians manage acute kidney injury (AKI), but made no mention of the volume or type of data included in the transfer, which the report says amounted to "millions of identifiable personal medical records." In fact, the authors found that "sensitive medical data on millions of Royal Free's patients started flowing into third-party servers contracted by Google to process data on behalf of DeepMind" as early as November 2015. An investigation by New Scientist last year estimated that the agreement allowed DeepMind to analyse sensitive information on about 1.6 million patients each year, including HIV status, details about drug overdoses and abortions, as well as records of routine hospital visits.

Julia Powles, who co-authored the report, said the partnership "betrays a level of naivety regarding how public sector organisations set up data-sharing arrangements with private firms, and it demonstrates a major challenge for the public and public institutions." She pointed out that "in this case DeepMind, a machine learning company, had to make the bizarre promise" that it would not yet use any machine learning or artificial intelligence techniques to build the Streams app "in order to engender trust."

The authors noted that while DeepMind's access likely did not pose a data security risk, the terms of the agreement were questionable: they lacked transparency and provided an inadequate legal and ethical basis for Trust-wide access to data. They also raised doubts about the claim by DeepMind and the NHS that the collaboration was founded entirely on so-called "direct care" for AKI, the notion that an "identified individual" has given implied consent for their information to be shared for uses involving the prevention, investigation or treatment of illness. According to the report, no patient whose data were shared with DeepMind was ever asked for consent. Further, the authors pointed out that, according to the Royal Free and DeepMind's own announcements, only one in six of the records DeepMind accessed would have involved AKI patients.

Last November, DeepMind and the NHS announced that they had expanded their joint project into a five-year partnership, adding that they expected the first version of the Streams app to be rolled out across Royal Free hospital sites early this year. Meanwhile, the original deal is under investigation by the UK's Information Commissioner's Office, which has yet to report any findings publicly, and the UK's National Data Guardian is also continuing to examine the arrangement.

Commenting on the Health and Technology report, DeepMind and the NHS issued a joint response claiming that the paper misrepresented the NHS's use of technology to process data and that the analysis contained several errors.
