New strategy will aim to make UK a global AI centre
The AI Council, supported by The Alan Turing Institute, is seeking views to help shape a National AI Strategy to be published later this year.
They want to hear from anyone involved in developing and using AI technologies.
Results will help inform the new strategy, which aims to make the UK a global centre for the development, commercialisation and adoption of responsible AI.
The AI strategy will focus on:
Future Care Capital is among those taking part in the survey and its Head of Policy and Research, Dr Peter Bloomfield, welcomed the initiative.
He said that while data was becoming increasingly available, the development of tools and products needed better leadership and a more focused approach to avoid bottlenecks to innovation in AI.
“AI for prediction or automation versus using data more broadly to inform decisions is poorly understood across health and care. The difference needs to be better articulated to ensure uptake.”
On the issue of investment, he said that while there was plenty of funding for AI research, it was often the perceived “high impact” research that grabbed headlines and was funded as a priority.
“This is not always the research that needs to be conducted or that provides the best outcome.
“Funding for validity testing, and to better understand the effectiveness and robustness of systems, seems to be neglected, yet is some of the most needed work”.
Future Care Capital’s response to the survey acknowledged that the Government is not able to directly compete with some of the big tech players. But it can provide incentives for those companies to develop larger R&D bases in the UK and share talent with other UK organisations.
Dr Bloomfield added: “Many of the best data scientists I have worked with also like to develop their own startup or work in a very early stage startup. Further developing this niche and improving adoption of their solutions will be helpful long-term.”
He said Future Care Capital was calling for a better appreciation of the full range of AI talent. “AI talent is not just Machine Learning skills and developing the best models.
“There is the whole business of AI and being able to effectively communicate in a responsible way what the system can do and why it is beneficial. Data scientists should not be doing all of it. But they need to be consistently involved with the wider business of AI.”
Future Care Capital is also concerned about insufficient provision of training and development in AI skills for the current UK workforce.
The response noted: “Most of the courses and training available are very academic and are not tailored for people in the current workforce. Unless you want to retrain as a data scientist or engineer, there’s very little available.”
It also highlighted a disconnect between wanting to develop AI and wanting to use it.
“The NHS and NHS Trusts are a great example of this. Centres of Excellence and Trusts associated with large academic departments can develop skills, build models, test and adopt them in clinical pathways. But if you are a different sort of trust without academic ties you may well have completely different priorities for your funding and see AI skills and training as an irrelevance.”
Future Care Capital’s response singled out the media as a major cause of public mistrust of AI, pointing to too much focus on ‘horror stories’ and often poor-quality reporting of them.
Dr Bloomfield added: “An AI model in isolation is not good or bad – it is human intent and misuse which is the issue and this is not being communicated well enough”.
The survey is available here. It closes on June 20.
A separate National AI Strategy for Health and Social Care is also being developed by NHSX. More information is available here.