
Developing data analytics to improve how social care is monitored and understood for those with the worst outcomes

8th April 2021

Guest blog for the Community of Practice for Social Care Analytics by Arnie King, Equalities Lead, Manor Community. Manor Community are one of the award holders for the Strengthening Social Care Analytics programme run by the Health Foundation.


Manor Community is an SME care provider based in East Bristol, providing a range of services and diverse support, primarily to adults with Learning Disabilities, Autism, or both. We also support people with mental health or other complex needs, and older people too!

We are incredibly excited to be a part of The Health Foundation’s Strengthening Social Care Analytics programme, bringing our care and community experience to a project that will properly test how well advanced analytics and machine learning tools can understand, and be used to benefit, people with ‘the worst outcomes’.

We have been providing care across our community for years now, and we have always been very vocal about what matters to us and the amazing people around us (accessing or working in care). Lately we have created our own non-profit social care network (Coproduce Care CIC) and started projects like this one to make sure that social care gets the recognition and inclusive opportunities it deserves.

The types of analytics and tools we are testing are already being used by large institutions and organisations to deliver or monitor a whole range of services across the country, and they are not going anywhere! Machine learning and advanced data analytics are going to be a huge part of how data is used to integrate health and social care over the coming years, and are only going to become a bigger part of how the rest of our lives are run too. It is now our job to see how well they can understand the people most in need of support and a fair voice, and what else is needed so that providers, carers and the community are a real part of how these advances are designed and used.

Now, like us, I hope you have read ‘worst outcomes’ a couple of times here and winced a bit! We think this term is a bit like ‘hard to reach’: it is fairly vague and left open to anyone’s interpretation. So, in this blog we are going to talk through some of the conversations we have had to work out what improving the voice and power of this group means, as well as cover some of the larger data analytics issues that will be relevant during our pilot.

We have worked with our local council to identify who receives ‘the worst outcomes’, and we agreed it is those with complex needs related to mental capacity, namely people with Learning Disabilities, Autism or both. Other complex or communication needs related to mental health or drug and alcohol rehabilitation are definitely relevant, but we see the main focus as those who need the most dedicated support while continuing to face worsening health inequalities and over-used referrals to hospital or primary care. Basically, outside of older people, a lot of adults in need of support are given inappropriate services, or struggle to voice their personal wants and concerns in a way that leads to sustainable change across services.

You might think that this has a lot to do with how care is commissioned, or with the very important focus on older people, where the needs, and the link to hospital capacity, are clearer. However, we have seen huge changes in the last few years as a fuller picture of what caring for a community really means has come into public view.

The sector inspectorate, the CQC, has published major policies and priorities focussing on this younger group (‘Out of Sight – Who Cares?’), and we have seen the NHS and commissioning bodies move towards championing personalised and integrated care. This aims to prevent people, whatever their circumstances, from needing clinical interventions in the future by connecting them to appropriate and accessible lifestyles before they hit crises or lose all independence.

Yet we continue to see declining health measures, repeated isolation and restraint, and at times shocking examples of support for people whose circumstances relate to mental capacity. So, what IS stopping health and care from properly understanding and supporting this group?

We are running some conversations on the FCC community of practice forums specifically on this topic of who receives ‘the worst outcomes’, so it would be great to hear what you think about our definition or how it might be different in your area.

In terms of the tools and methods themselves, this group, and the reasons we see them as most important, are a massive part of why our pilot is such a crucial test of how this type of technical development will affect social care in the future.

The analytics and machine learning tools we will use aim to understand a large number of people’s answers by automatically finding themes and common links when looking through lots of data at once. Say you wanted to use them to find out what people thought about a new car: you would ask lots of people questions split into the different topics you want to know about. A few questions about the colours it is available in, a few about how expensive it is, a few about whether it is sporty enough… Once you have asked lots of people these questions, you feed all the answers into the machine, and it will tell you either how positive (or negative) the answers were on average in each category, or what people said most when talking about each topic. So, the analysis might tell you that people were 80-90% positive about your new car only being available in black, or that ‘safety’, ‘fuel’ or ‘economy’ came up repeatedly when people talked about its speed, suggesting you don’t have to worry about making it go too fast. It is a very basic version of Artificial Intelligence!
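
To make that concrete, here is a deliberately simplified sketch in Python of that kind of ‘average sentiment per topic, plus common themes’ analysis. Everything in it is invented for illustration: the survey answers, topic names and word lists are made up, and the real tools use trained language models rather than hand-made word counts.

```python
# A toy version of "average sentiment and common themes per topic".
# All answers, topics and word lists below are invented for illustration.
from collections import Counter

# Toy survey: each answer is tagged with the topic it was asked about
answers = [
    ("price",  "far too expensive for what you get"),
    ("price",  "good value and cheaper than rivals"),
    ("colour", "love that it only comes in black"),
    ("speed",  "fast but I worry about safety and fuel economy"),
    ("speed",  "quick enough, fuel economy matters more to me"),
]

# A tiny hand-made sentiment lexicon (a real system learns this from data)
POSITIVE = {"good", "love", "cheaper", "quick", "fast", "value"}
NEGATIVE = {"expensive", "worry"}

def sentiment(text):
    """Score one answer from -1 (negative) to +1 (positive)."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return max(-1, min(1, score))

# Average sentiment and most common words for each topic
for topic in ["price", "colour", "speed"]:
    texts = [t for (top, t) in answers if top == topic]
    avg = sum(sentiment(t) for t in texts) / len(texts)
    words = Counter(w for t in texts for w in t.lower().split())
    common = [w for w, _ in words.most_common(3)]
    print(f"{topic}: average sentiment {avg:+.1f}, common words {common}")
```

Run on the toy answers above, this would report that ‘fuel’ and ‘economy’ come up repeatedly under the speed topic, which is exactly the kind of theme-spotting described here, just done with word counts instead of a proper model.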

Now, imagine the common theme it picks out from what people said about the price is ‘an arm and a leg’. We could probably guess that means nobody wants to buy a car they can’t afford, but the machine may not. And now let’s imagine it is not a simple question about a new car: it is a person with communication issues being asked why the care they receive does not stop them needing hospital care regularly. You hopefully start to see why we need to know how well these systems can deal with a group that is not listened to properly, even by the humans in charge right now!
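
A tiny sketch (again with an invented word list) shows the problem: a scorer that only looks at individual words has no way to see that ‘an arm and a leg’ is a complaint about price.

```python
# Why idioms trip up word-by-word scoring: "an arm and a leg" contains
# no individually negative word, so a naive scorer calls it neutral.
# The word list is invented for illustration.
NEGATIVE = {"expensive", "overpriced", "costly"}

def naive_sentiment(text):
    """Count negative words; more negatives means a lower score."""
    words = text.lower().split()
    return -sum(w in NEGATIVE for w in words)

print(naive_sentiment("far too expensive"))          # -1: correctly negative
print(naive_sentiment("it costs an arm and a leg"))  #  0: idiom scored neutral
```

If a plain idiom can slip through, non-standard or supported communication can slip through far more easily.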

What’s more, there are reports of discrimination and personal stereotypes finding their way into these algorithms, even when dealing with people who do not need support and communicate in the way the machine expects a ‘typical’ person to. Most would argue this is because the people who designed the algorithms have inadvertently built their own assumptions into the computer code that sets the basic rules and goals for the AI. Social care not only needs to support diverse communities that see health inequalities between groups based on diagnosis, ethnicity and other social factors – it also has one of the most diverse workforces of any industry.
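
One way those assumptions can sneak in is through the ‘cleaning’ rules written before the AI ever sees the data. This is a hypothetical sketch, with invented responses and an invented rule, of how a designer’s idea of a ‘proper’ answer can silently filter out real voices:

```python
# A designer's hidden assumption baked into a data-cleaning step.
# Suppose the pipeline drops "low quality" answers before analysis --
# everything here is invented for illustration.
responses = [
    "The day centre closing has made my anxiety much worse.",
    "no bus no centre",               # short and unpunctuated, but meaningful
    "bad week. staff changed again",  # also meaningful
]

def looks_low_quality(text):
    # The hidden assumption: "real" answers are long and end in a full stop
    return len(text.split()) < 5 or not text.rstrip().endswith(".")

kept = [r for r in responses if not looks_low_quality(r)]
print(kept)  # only the first answer survives; two genuine voices are dropped
```

No one set out to discriminate here, but people who communicate briefly or differently simply vanish from the results before any analysis happens.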

If machine learning and advanced analytics are going to be used to improve social care, benefit the people who carry out care, and change the way care works with the diverse communities we need to start supporting well before they reach hospital or primary care… then we need to pilot how well they can understand the groups with the ‘worst outcomes’, and what skills, support, training or tech providers will need. At a time when social care faces even more potentially fatal workforce and capacity crises, this could become another hoop for care to jump through just to be allowed to take pressure off the NHS – or it could be a genuine innovation in how fair, preventative and connected services are in the future.

You can find out more about our project here.

And I hope to see you sharing your thoughts on the practicality and ethics of these systems on the Community of Practice Forums.