This is the second guest blog from Lord Clement-Jones on data.
I’m an enthusiast for the adoption of new technology in healthcare, but it is concerning when a body such as Axrem, which represents a number of health tech companies, says that, while there is much interest in pilots and proof-of-concept projects, the broad adoption of AI is still problematic for many providers. Among the reasons it cites is the fact that “some early healthcare AI projects have failed to manage patient data effectively, leading to scepticism and concern among professionals and the public.”
I share this concern – especially when we know that some big tech and big pharma companies seem to have a special relationship with the DHSC (Department of Health and Social Care) – and in the light of the fact that one of the Government’s 10 new priorities is:
“Championing free and fair digital trade: As an independent nation with a thriving digital economy, the UK will lead the way in a new age of digital trade. We will ensure our trade deals include cutting-edge digital provisions, as we did with Japan, and forge new digital partnerships and investment opportunities across the globe”
The question is: what guarantee do we have that our health data will be used in an ethical manner, assigned its true value and used for the benefit of UK healthcare?
Back in April 2018, in our House of Lords AI Select Committee report, ‘AI in the UK: Ready, Willing and Able?’, we identified the issue.
This received the bland government response:
“We will continue to work with ICO, NDG, regulatory bodies, the wider NHS and partners to ensure that appropriate regulatory frameworks, codes of conduct and guidance are available.”
Since then, of course, we have had a whole series of documents designed to reassure on NHS data governance. But all of them lack assurance on the mechanisms for oversight and compliance.
Then in July last year the CDEI (Centre for Data Ethics and Innovation) published “Addressing trust in public sector data use” which gives the game away. They said:
“Efforts to address the issue of public trust directly will have only limited success if they rely on the well-trodden path of developing high-level governance principles and extolling the benefits of successful initiatives.
“While principles and promotion of the societal benefits are necessary, a trusted and trustworthy approach needs to be built on stronger foundations. Indeed, even in terms of communication there is a wider challenge around reflecting public acceptability and highlighting the potential value of data sharing in specific contexts.”
So the key question is, what is actually happening in practice?
We debated this during the passage of both the Trade Bill and the Medicines and Medical Devices Bill and the results were not reassuring. In both bills we tried to safeguard state control of policy-making and the use of publicly funded health and care data as a significant national asset.
As regards the Japan/UK Trade Agreement, for example, the Government Minister, when pressed at Report Stage, said it “removes unjustified barriers to data flows to ensure UK companies can access the Japanese market and provide digital services. It does this by limiting the ability for governments to put in place unjustified rules that prevent data from flowing and create barriers to trade.”
But as Lord Freyberg rightly said at the time, there is widespread recognition that the NHS uniquely controls nationwide longitudinal healthcare data, which has the potential to generate clinical, social and economic development as well as commercial value. He argued that the Government should take steps to protect and harness the value of that data and, in the context of the Trade Bill, ensure that the public can be satisfied that that value will be safeguarded and, where appropriate, ring-fenced and reinvested in the UK’s health and care system.
In a Medicines Bill debate in January, Lord Bethell employed an extraordinarily circular argument:
“It is important to highlight that we could only disclose information under this power where disclosure is required in order to give effect to an international agreement or arrangement concerning the regulation of human medicines, medical devices or veterinary medicines. In that regard, the clause already allows disclosure only for a particular purpose. As international co-operation in this area is important and a good, even necessary, thing, such agreements or arrangements would be in the public interest by default.”
So it is clear we still do not have adequate provisions regarding the international exploitation of health data, which, according to a report by EY, could deliver around £10 billion a year in benefit.
We were promised the arrival of a National Health and Care Data Strategy last autumn. In the meantime, trade agreements are made, Medicines Bills are passed, and we have little transparency about what is happening as regards NHS data – especially in terms of contracts with companies like Palantir and Amazon.
The Government is seeking to champion the free flow of data almost as an ideology. This is clear from the replies we received during the passage of the Trade Bill and the Medicines and Medical Devices Bill, and indeed from a recent statement by John Whittingdale, the Minister for Media and Data. He talks about the:
“…UK’s new, bold approach to international data transfers”, and says:
“Our international strategy will also explore ways in which we can use data as a strategic asset in the global arena and improve data sharing and innovation between our international partners.”
“Our objective is for personal data to flow as freely and as safely as possible around the world, while maintaining high standards of data protection.”
What do I prescribe?
At the time when these issues were being debated, I received an excellent briefing from Future Care Capital which proposed that “Any proceeds from data collaborations that the Government agrees to, integral to any ‘replacement’ or ‘new’ trade deals, should be ring-fenced for reinvestment in the health and care system, pursuant with FCC’s long-standing call to establish a Sovereign Health Fund.”
This is an extremely attractive concept. Retaining control over our publicly generated data, particularly health data, for planning, research and innovation is vital if the UK is to maintain its position as a leading life science economy and innovator.
Furthermore, with a new National Data Strategy in the offing, there is now an opportunity for the Government to maximise the opportunities afforded by the collection of data and to position the UK as a leader in data capability and data protection.
We can do this, and restore credibility and trust, by guaranteeing greater transparency about how patient data is handled, where it is stored, with whom it is shared and what it is being used for, especially through vehicles such as data trusts and social data foundations.
As the Understanding Patient Data and Ada Lovelace Institute report ‘Foundations of Fairness’, published in March 2020, said:
“Public accountability, good governance and transparency are critical to maintain public confidence. People care about NHS data and should be able to find out how it is used. Decisions about third party access to NHS data should go through a transparent process and be subject to external oversight.”
This needs to go together with ensuring that the use of data demonstrably benefits the public.
As the report “NHS Data: Maximising its impact on the health and wealth of the United Kingdom”, published last February by Imperial College’s Institute of Global Health Innovation, said:
“Proving that NHS and other health data are being used to benefit the wider public is critical to retaining trust in this endeavour.”
At the moment that trust is being lost.
Lord Clement-Jones was made CBE for political services in 1988 and a life peer in 1998. He is the Liberal Democrat House of Lords spokesperson for Digital (2017-), previously spokesperson on the Creative Industries (2015-17). He is the former Chair of the House of Lords Select Committee on Artificial Intelligence, which sat from 2017 to 2018, and Co-Chairs the All-Party Parliamentary Group on AI. He is a founding member of the OECD Parliamentary Group on AI and a member of the Council of Europe’s Ad-hoc Committee on AI (CAHAI). He is a former member of the House of Lords Select Committees on Communications and the Built Environment. Currently, he is a member of the House of Lords Select Committee on Risk Assessment and Risk Planning. He is a Consultant of global law firm DLA Piper, where previous positions held included London Managing Partner (2011-16), Head of UK Government Affairs, Chairman of its China and Middle East Desks, International Business Relations Partner and Co-Chairman of Global Government Relations. He is Chair of Ombudsman Services Limited, the not-for-profit, independent ombudsman service providing dispute resolution for the communications, energy, property and copyright licensing industries. He is Chair of the Council of Queen Mary University of London and Chairs the Advisory Council of the Institute for Ethical AI in Education, led by Sir Anthony Seldon. He is a Senior Fellow of the Atlantic Council’s GeoTech Center, which focuses on technology, altruism, geopolitics and competition. He is President of Ambitious About Autism, an autism education charity and school.