The Centre for Data Ethics and Innovation (CDEI) recently released its AI (artificial intelligence) Barometer, and our Director of Policy and Research, Annemarie Naylor, supported its launch. The CDEI provides a connection between industry, government and the public to ensure adequate governance is put in place for emerging data-driven technologies. It sits alongside the Office for AI and maintains an independent, impartial expert advisory role.
AI, as a field, is advancing rapidly and warrants careful consideration of the way it is technically developed as well as how it is deployed in real-world settings. Three technical areas in particular have rapidly improved the performance of AI.
The AI Barometer work from the CDEI sits at this interface between technical development and real-world deployment of data-driven technology. The Barometer was developed after a series of workshops and round tables discussing the opportunities and risks of the technology. I helped contribute to these when I was working with AI startups at Digital Catapult, and it is great to stay involved in my new role at Future Care Capital (FCC). The workshops drew on a range of industry perspectives as well as citizen focus groups, and FCC offered expert input to the preparation of the health and social care section of the report.
Where data is available for analysis and insight, different forms of AI technology can be implemented for human benefit, particularly in health and social care. However, there is a real asymmetry at present between health and social care in access to the data needed to stimulate AI development in each field. We need government intervention to improve our social care data infrastructure and associated analytics if a care tech sector fit for purpose in the 21st century is to grow.
Below is a summary of the themes from the Barometer related to health and social care:
Theme 1: Mis/disinformation – The line between lifestyle products and medical devices can be blurred at times. The distinction between insight and intervention is not always immediately clear, and for elderly or vulnerable groups this is a real concern. When AI is implemented in a health or social care product, accuracy and clear messaging are crucial, and verified, accurate information must be a priority.
Theme 2: Bias in algorithmic decision making – If the data a technology relies on is not representative of its users, those users will have a poor experience and may find that the products do not work accurately for them. Crucially, unrepresentative data can lead to extremely detrimental decisions being made. In healthcare, poor data can lead to missed diagnoses and death; in social care, adults and children may not be adequately safeguarded. Hence it is important to get the foundations of data right, to ensure that datasets and the algorithms built on them do not endanger lives.
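The effect of unrepresentative data can be illustrated with a deliberately simplified sketch (a hypothetical example, not drawn from the Barometer itself): a naive "model" that always predicts the most common outcome in its training data will look highly accurate for the well-represented group while failing entirely for the under-represented one.

```python
from collections import Counter

def train_majority_model(labels):
    """A toy 'model': always predict the most common label in training data."""
    return Counter(labels).most_common(1)[0][0]

def accuracy(prediction, true_labels):
    """Fraction of cases where the fixed prediction matches the true label."""
    return sum(1 for y in true_labels if y == prediction) / len(true_labels)

# Hypothetical training set: 95% of records come from group A (typical
# outcome 0) and only 5% from group B (typical outcome 1).
train_labels = [0] * 95 + [1] * 5
prediction = train_majority_model(train_labels)

# Evaluated separately per group, the imbalance becomes obvious:
group_a_test = [0] * 100  # outcomes typical of the well-represented group
group_b_test = [1] * 100  # outcomes typical of the under-represented group

print(accuracy(prediction, group_a_test))  # 1.0 — looks perfect for group A
print(accuracy(prediction, group_b_test))  # 0.0 — fails entirely for group B
```

Overall accuracy across the pooled test set would still look respectable, which is why evaluating performance per subgroup, rather than in aggregate, matters in safety-critical settings such as health and social care.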
Theme 3: Under-use in social care – Health care is ahead of social care in developing this technology. Accurate products and devices require high-quality data, which is often captured by care providers or sensors, and better digital skills and infrastructure are needed in social care to enable this. Without them, there is a risk to the provision of up-to-date, high-quality social care.
Many examples of healthcare AI are widely known, such as the analysis of medical scans to detect tumours or cancerous tissue. However, social care applications are still emerging and lag a little behind in deployment and market readiness. Below are three examples of startups making use of AI technology to improve the lives of people receiving care in a domiciliary or care home setting:
Where data is lacking, products cannot readily be developed, and the quality of care suffers as a result. The current context of COVID-19 is putting pressure on the NHS, and new ways of using technology to ease that pressure are promising, but they come with real risks and concerns. The way digital technology is built is not generally aligned with how clinical standards are set. Agile methodology and the "move fast and break things" approach, while less favoured than before, are not appropriate where wellbeing and lives are at risk. The Medicines and Medical Devices Bill is currently making its way through Parliament to ensure patient safety is prioritised, but a great deal of work is needed to ensure this is reflected in practice, for both health and social care.