Building trust whilst safeguarding individuals

1st November 2017

We recently published Intelligent Sharing: unleashing the potential of health and care data in the UK to transform outcomes – a report examining how the UK might support data-driven research and innovation to transform health and care. It makes plain that, to achieve this, the UK needs to blaze a trail in the development of ‘data ethics’ to proactively build trust whilst safeguarding individuals.

Data Protection Bill

We welcome the Government’s new Data Protection Bill which aims to modernise UK data protection laws to make them fit for purpose in our increasingly digital economy and society, whilst applying the standards outlined in the EU General Data Protection Regulation.

We would, however, wish to draw attention to a number of issues where aspects of the Bill would benefit from further interrogation and/or amendment:

1. Tackling data-driven exploitation and discrimination to build public trust

A. Tackling data-driven exploitation

During the Bill’s Second Reading, Peers helpfully commented upon its relative complexity. Baroness Lane-Fox of Soho spoke about the need for data subjects to comprehend the new rights the Bill will afford them, if they are to be meaningfully exercised in future. Meanwhile, Lord Stevenson of Balmacara pointed out that in addition to questions appertaining to age in respect of protections for vulnerable children, there is no consideration of ‘capacity’ in the Bill as currently drafted.

There is an issue surrounding the lack of a suitable right of redress for data subjects, provided for in Article 80 of the General Data Protection Regulation, which a growing number of organisations have raised with MPs and Peers. Like them, FCC regards provisions enabling qualified non-profit organisations to pursue data protection infringements independently of any action pursued by individual data subjects as essential if the Bill is to build public trust.

The Bill should do more to protect vulnerable adults who might lack the capacity to provide explicit consent for their data to be collected and/or processed by virtue of a learning disability or cognitive impairment such as dementia. The SafeSurfing project revealed that most people with a learning disability are unaware of the dangers they face when sharing their personal information online, whilst others report difficulties in understanding terms and conditions and fully comprehending the services they are subscribing to (Mencap). Here, the Bill should have regard to the Mental Capacity Act (2005), but it could go further to promote inclusivity and established accessibility standards. We believe the Bill could also usefully include measures to help address the challenges that carers increasingly face in liaising with service providers on behalf of those they care for.

B. Tackling data-driven discrimination

In our report Intelligent Sharing: unleashing the potential of health and care data in the UK to transform outcomes (2017), we acknowledged that governments around the world are endeavouring to strike a balance between individual privacy rights and protections on the one hand, and organisational permissions to facilitate the creation of social, economic and environmental value from broad-ranging data on the other. Technological advancements have rendered data rights a matter of critical importance.

The tension between the two is particularly evident where health and care data is concerned. Individuals are broadly content with anonymised data from their medical records being used for public benefit but are, understandably, anxious about the implications of the most intimate aspects of their lives being hacked or else shared without their knowledge or consent. This anxiety follows high-profile data breaches and privacy incursions, such as the recent case of the Royal Free NHS Foundation Trust and Google DeepMind, but it centres upon concerns surrounding the potential for commercial exploitation or discrimination by, for example, pharmaceutical, marketing and insurance companies. As such, we recommended the Government consider whether measures to tackle specific forms of data-driven discrimination could further safeguard individuals and build public trust.

Related to this, in our report Securing the Future: planning health and care for every generation (2017), Dr Bertie Müller’s expert contribution focuses on technological advancements that are expected to transform services and improve health and care outcomes. His assessment of developments as broad-ranging as artificial intelligence, robot surgeons, nano-implants and automated vehicles suggests that technology harbours significant disruptive potential. In some instances, the technological advancements he highlights are to be welcomed – in particular, for the improvements in self- and community-based care they are expected to help realise. We are, nonetheless, encouraged to keep in mind a ‘darker side’ to these developments and to prioritise planning ahead, such that the general public and ethical frameworks keep pace with them. There is, otherwise, the potential for health and care to become characterised by surveillance, he suggests, which could rapidly take us from a social contract premised upon tax and entitlements to one that is machine-tested (and verified) in future.

The House of Lords Delegated Powers and Regulatory Reform Committee draws attention to an issue surrounding Clause 9(6) – Power to add, vary or omit conditions or safeguards applying to the processing of sensitive personal data – in its 6th Report of Session 2017-19: Data Protection Bill (published 24/10/17). Like the Committee, FCC takes the view that ‘the memorandum does not adequately justify the breadth of the power in clause 9(6) of the Bill’, and recommends its removal from the Bill in the interests of safeguarding individuals from data-driven discrimination. FCC also supports the amendment to P114, line 16, proposed by Lord Stevenson of Balmacara.

There are further issues surrounding the safeguards attaching to automated decision-making in the Bill as drafted which, in the context of health and care data, surely resonate with the idea of ‘no decision about me without me’. Specifically, we believe they are insufficient if the intention is to guard against ‘machine-testing’ access rights to health and care products and services in future. FCC is supportive of calls for further concrete safeguards authorised by law such as full information about the logic involved and likely consequences of the decision (cf. Article 13(2)(f) of GDPR). We would also welcome provisions to ensure that the processing of personal data by automated or structured processing is never the sole means of determining a person’s eligibility for health or social care services.

2. Expanding the opportunities for data subjects to contribute health and care data to pertinent data sharing initiatives on a philanthropic basis

In our report Intelligent Sharing: unleashing the potential of health and care data in the UK to transform outcomes (2017), we made a number of recommendations appertaining to the promotion of ‘data philanthropy’, in recognition of the tangible benefits that could flow from a growth in trusted health and care data processing by a range of organisations.

Our recommendations included:

  • expanding the opportunity for data subjects to contribute health and care data to records and other data sharing initiatives;
  • establishing a new National Health and Care Data Donor Bank, to coordinate data from the public and help improve the alignment of research to clinical need; and
  • exploring the development of a ‘gift-aid’ style scheme for health and care data, encouraging individuals to make health and care data donations to better enable research and innovation.

During the Bill’s Second Reading, Baroness Neville-Jones helpfully asked the Government to “think about the possibility that they should allow for the creation of governance and accountability regimes that will fit special circumstances” – adding that “the existence of the Information Commissioner should not result just in enforcing the law effectively and well; it should provide an opportunity for creativity under her auspices and the ability to create variations on governance regimes where they are needed”. FCC supports this sentiment and provisions that would render feasible the creation of what we referred to in our report as ‘data cooperatives’, ‘data communities’ and ‘data collaboratives’ for health and care, underpinned by ‘data philanthropy’ and creative approaches to lawful consent.

There is a further issue here surrounding the scope to facilitate ‘data philanthropy’ through the introduction of a ‘gift aid’ style scheme, such that FCC would be supportive of additional measures to supplement the data portability provisions in the Bill. Specifically, we would welcome measures enabling individual data subjects to donate onwards, to circumscribed third parties, the data they consent to provide to a service provider, in the interests of furthering research and innovation in health and care. At the very least, FCC believes the ICO should be required to investigate the circumstances in which it may be appropriate to invite the giving of explicit consent to the processing and pooling of personal data for the purposes of health or social care.

3. Protecting data of national significance in the public interest and for public benefit

The UK will require an adequacy decision from the EU in order to transfer data between EU member states, EEA member countries and selected territories post-Brexit. The adequacy decision will be critical, for example, if the UK is to retain easy access to the 24 European Reference Networks for rare diseases and the EU’s European Medicines Agency (EMA) network, which covers more than 500 million people. This issue of adequacy has been the subject of extensive discussion by both MPs and Peers. However, the UK also faces the prospect of having to forge further data sharing agreements with non-EEA countries. This includes the US, where President Trump signed an Executive Order, ‘Enhancing Public Safety in the Interior of the United States’, in January 2017. The Order has called into question the efficacy of the current EU-US Privacy Shield agreement, because EU nationals may no longer be extended the benefits of the US Privacy Act or granted access to US courts for data protection matters.

A dedicated health and care data privacy shield could serve to bolster public trust in the UK. Hence, in our report Intelligent Sharing: unleashing the potential of health and care data in the UK to transform outcomes (2017), we recommended:

“The new Chief Data Officer and National Data Guardian should be tasked by Government with contributing to the development of a strengthened and/or dedicated ‘data privacy shield’ for health and care data, applicable to any future trade negotiations outside Europe, to safeguard the public whilst improving the UK’s competitiveness”.

The House of Lords Delegated Powers and Regulatory Reform Committee draws attention to an issue surrounding Clause 17 – Power to make provision in respect of transfers of personal data to third countries and international organisations – in its 6th Report of Session 2017-19: Data Protection Bill (published 24/10/17). Like the Committee, FCC takes the view that ‘the negative procedure is not the appropriate procedure for regulations specifying that the transfer of data to third countries or international organisations is to be regarded as necessary in the public interest’ – save where time-critical bio-hazardous and/or pandemic situations liable to impact public health are concerned.

There is, nonetheless, a further issue here surrounding the scope to safeguard and, even, bolster the UK’s future competitiveness. In our report, we recommended:

“The Government should explore the scope to introduce incentives for businesses prepared to enter into Joint Ventures with a National Health and Care Data Donor Bank – the aim: to help de-risk the discovery of new treatments and technologies using its health and care data, better align research to need, and secure preferential terms for the deployment of innovations flowing from the same”.

This sentiment was subsequently echoed by Sir John Bell – author of the sector-led Life Sciences Industrial Strategy (August 2017) and of a recent Guardian article in which he urged the Government to secure NHS data for the benefit of the British public, and the Department of Health to retain ownership of algorithms developed using NHS data.

Specifically, he said:

“The government must act urgently to ensure that patients and UK taxpayers – not just tech companies – gain from new commercial applications of NHS data … the most significant value lies in the datasets used to train algorithms on tasks ranging from speech recognition to diagnosing diseases. As the world’s largest publicly funded health service, the NHS has one of the most comprehensive health datasets in existence. What Google’s doing in [other sectors], we’ve got an equivalent unique position in the health space … most of the value is the data. The worst thing we could do is give it away for free.”

In a similar vein, during the Bill’s Second Reading, Lord Mitchell helpfully suggested that “NHS patient data are a massive national asset that should be protected … [it] should not be sold outright in an uncontrolled form to third parties … the NHS should have equity participation in the profits generated by the application of this information.”

Lord Mitchell concluded by asking how we might protect key strategic data and, with this in mind, FCC would welcome additional provisions in the Bill to protect data of national significance – both in the public interest and for public benefit. In future, this could perhaps draw upon kindred legislation applied to ‘tangible assets’ as per those ‘listing and protective’ mechanisms and measures which guard our built heritage and environment. Alternatively, it may be prudent to explore the scope to designate and safeguard ‘datasets of special scientific interest’ so that our data advantage is maintained for public benefit long into our digital future.

These are matters for further consideration by others, but if the ‘genie is allowed out of the bottle’ at this critical juncture, we will have passed the point of no return. That is why we are recommending that the ICO be required to investigate, keep under review and, as may be appropriate, produce and publish written guidance on all of these issues.

  • Proposed amendments (pdf, 217.98 KB)