The ICO has found shortcomings in patient data privacy in a case involving the Royal Free NHS Foundation Trust’s use of Google DeepMind.
In a statement, the ICO said that the Royal Free NHS Foundation Trust failed to comply with the Data Protection Act when it provided the details of around 1.6 million patients to Google DeepMind as part of a trial to test an alert, diagnosis and detection system for acute kidney injury.
The details of the incident only became public knowledge in February 2016, around three months after the data sharing had occurred. In an undertaking signed by Sir David Sloman, chief executive of the Royal Free London NHS Foundation Trust, the ICO noted that while only “partial patient records” were shared, they contained “sensitive identifiable personal information held by the Trust.”
The undertaking stated: “All development and functional testing of the application and the related technology platform was undertaken by DeepMind using synthetic, non-personally identifiable, data. Pseudonymisation of the patient identifiable data was not undertaken for clinical safety testing. This is because the Trust was (and remains) of the view that it needed access to patient records in the application and technology platform in order to undertake clinical safety testing.”
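For context, pseudonymisation typically means replacing direct identifiers with tokens that cannot be reversed without a separately held key, while the clinical content of each record stays intact. A minimal sketch of the idea, using a keyed hash over an NHS number; the function and field names below are illustrative, not a description of DeepMind’s or the Trust’s actual implementation:

```python
import hmac
import hashlib

# Illustrative secret key; in practice this would be held separately
# from the data itself, e.g. in a key management service.
PSEUDONYM_KEY = b"replace-with-a-securely-stored-secret"

def pseudonymise_id(nhs_number: str) -> str:
    """Replace a direct identifier with a stable, non-reversible token.

    An HMAC-SHA256 keyed with a secret produces the same token for the
    same patient every time (so records can still be linked), but the
    token cannot be reversed without access to the key.
    """
    return hmac.new(PSEUDONYM_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()

# Hypothetical record: the identifier is swapped out, the clinical
# values are left untouched.
record = {"nhs_number": "9434765919", "creatinine_umol_l": 182}
safe_record = {**record, "nhs_number": pseudonymise_id(record["nhs_number"])}
print(safe_record)
```

The Trust’s position, per the undertaking, was that this kind of substitution was incompatible with clinical safety testing, which in its view required access to real patient records.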
Among the failings, the ICO determined that patients were not adequately informed about the use of their data. Elizabeth Denham, information commissioner, said: “There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights.
“Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening.”
Rowenna Fielding, data protection lead at Protecture, told Infosecurity that the case highlights an interesting intersection between data protection law and confidentiality law: even where a suitable legal basis for processing has been identified to satisfy data protection law, there is still a requirement to provide data subjects with adequate information about how their personal data will be used.
“Where the data includes medical records, the duty of confidentiality (expressed in healthcare by the Caldicott Principles) requires that consent is needed for any use of the data for non-direct-care purposes,” Fielding added.
“So, there are two flavors of consent in play here: consent for processing the data, which may not be needed if there is a suitable alternative legal basis, and consent for disclosure to satisfy the duty of confidentiality. According to the Undertaking, both the ICO and the National Data Guardian are unsatisfied that there is a suitable legal basis for processing (whether consent, which was not obtained, or any other basis), that insufficient privacy information was given to patients whose data was used, and that consent for re-use of the information for a non-direct-care purpose (which was needed to satisfy the duty of confidentiality) was not obtained when it should have been.”
Fielding noted that consent for processing data may look like the "easier" option, but it actually requires considerable administrative and technical overhead to obtain, maintain and refresh in order to meet the standards of the law. It may also be unsuitable where there is a legal requirement to process the data, where there is an imbalance of power between the organization and the data subject, or where it would be too difficult to honor a withdrawal of consent.
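To illustrate the overhead Fielding describes, a consent record typically needs to capture what was consented to, when, and whether it has since been withdrawn or gone stale, so that every processing decision can be checked against it. A minimal sketch under those assumptions, with hypothetical field names not drawn from any specific system:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

CONSENT_VALIDITY = timedelta(days=365)  # Illustrative refresh interval.

@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str                        # e.g. "clinical-safety-testing"
    granted_at: datetime
    withdrawn_at: Optional[datetime] = None

    def is_valid(self, at: datetime) -> bool:
        """Consent counts only if granted, not withdrawn, and not stale."""
        if self.withdrawn_at is not None and self.withdrawn_at <= at:
            return False
        return at - self.granted_at <= CONSENT_VALIDITY

def may_process(record: ConsentRecord, purpose: str, at: datetime) -> bool:
    # Processing must match the specific purpose the subject consented to;
    # consent for direct care does not cover re-use for other purposes.
    return record.purpose == purpose and record.is_valid(at)
```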
Fielding concluded: “Organizations that wish to re-use sensitive/special categories of personal data need to carefully examine the purposes of processing, identify a suitable legal basis and design their systems, processes and forms to support their compliance requirements.”