An NHS hospital did not do enough to protect the privacy of patients when it shared data with Google, the UK’s Information Commissioner’s Office (ICO) has ruled. The ICO censured the Royal Free NHS Foundation Trust over data handed over during trials of a novel way to detect acute kidney injury.
Among other failings, the ICO said the hospital did not tell patients enough about the way their data was used. Details of about 1.6 million patients were provided to Google’s DeepMind division during the medical trial. The information was used to develop and refine an alert, diagnosis and detection system for acute kidney injury. The trust said it would tackle “shortcomings” in its data handling.
Elizabeth Denham, UK Information Commissioner, said: “I believe it is important to underline that as national data guardian, I am a strong advocate of work to develop new technology which can improve care and save lives. In this case, the Royal Free and DeepMind developed an app to alert hospital nurses and doctors to inpatients who might have acute kidney injury, a very serious condition, which can be hard to diagnose but can develop rapidly.
“The issue that concerned my panel members and me was not that innovation was taking place to help patients affected by this condition. Far from it; it was the legal basis on which the Royal Free had shared data that could identify more than 1.6 million patients with DeepMind.
“In this instance the Royal Free shared the information on the basis of ‘implied consent for direct care’. This is a legal basis that doctors, nurses and care professionals rely on every day to share information in order to make sure the individuals they are looking after receive the care they need. However, it is my view that this legal basis cannot be used to develop or test new technology, even if the intended end result is to use that technology to provide care.
“I’m afraid that a laudable aim – in this case developing and testing life-saving technology – is not enough legally to allow the sharing of data that identifies people without asking them first. We need to reassure the public there are always strong safeguards in place to make sure that confidential information will only ever be used transparently, safely and in line with the law and regulatory framework.”
In a blog post, Denham outlines four lessons other health trusts can learn: ‘It’s not a choice between privacy or innovation,’ ‘Don’t dive in too quickly,’ ‘New cloud processing technologies mean you can, not that you always should,’ and ‘Know the law, and follow it’.
Following the ICO investigation, the Trust has been asked to:
- establish a proper legal basis under the Data Protection Act for the Google DeepMind project and for any future trials;
- set out how it will comply with its duty of confidence to patients in any future trial involving personal data;
- complete a privacy impact assessment, including specific steps to ensure transparency; and
- commission an audit of the trial, the results of which will be shared with the Information Commissioner, and which the Commissioner will have the right to publish as she sees appropriate.