Providers of digital technology for mental health care should adopt an ethical framework, report argues

Report finds lack of clarity and accountability in digital mental health technologies market

3rd November 2022

There is a “culture of distrust” surrounding the use of digital mental health technologies (DMHTs), a new report has found.

The report, Trustworthy Assurance of Digital Mental Healthcare, says that a growing number of organisations are “turning to digital technologies to increase their capacity and try to meet the growing need for mental health services.” It argues, however, that “clearer assurance for how ethical principles have been considered and implemented in the design, development, and deployment of DMHTs is necessary to help build a more trustworthy and responsible ecosystem.”

In order to meet that need, the authors, based at the Alan Turing Institute, have created a proposal for a methodology called Trustworthy Assurance. To support the development of Trustworthy Assurance, they carried out stakeholder engagement events with students, university administrators, regulators and policy-makers, developers, researchers, and users of DMHTs.

The engagement events revealed that the current digital mental health landscape is characterised by “significant uncertainty, a lack of transparency or accountability, and a rising demand that outpaces trusted services and resources.” Concerns expressed by developers included a lack of clear guidance through which to present evidence of trustworthy innovation and a “lack of integration of ethics within existing workflows.”

Policy makers’ concerns focused on a lack of clarity and on “the lack of integration or harmonisation between existing examples of legislation and standards in this space.”

Users of digital tools were concerned about “the lack of clear and meaningful consent procedures and the insufficiency of data privacy policies” as well as “the perceived erosion of in-person care by digital technologies and services”.

Identify opportunities to involve users in digital mental health projects

The report makes five main recommendations:

  • Organisations involved in the design, development, and deployment of DMHTs should adopt and use the Trustworthy Assurance methodology to demonstrate how they have embedded core ethical principles.
  • Standards can be co-developed within and among organisations by sharing best practices related to trustworthy assurance.
  • Common capacities should be developed across the digital mental healthcare landscape, such as initiatives aimed at improving data and digital literacy, in order to foster responsible innovation through shared best practice.
  • Research should be undertaken to identify how organisations and product managers could ease the time burden on developers through embedding and integrating the trustworthy assurance methodology into key stages of the project lifecycle.
  • Organisations involved in the design, development or deployment of DMHTs should identify opportunities and processes to support the participation of affected users within the project lifecycle.

In a foreword to the report, Dr Cath Biddle, the head of digital at Mind, noted the rapid growth in the market for digital mental health technology and said: “While we recognise that digital mental health services can increase choice and reach, they are not a panacea and cannot replace the localised, personal touch that is core to our service.”

She added: “In a fast-moving and increasingly crowded marketplace, there is an urgent need to help those purchasing or procuring digital mental health services, for themselves or on behalf of others, to be confident that the products they are buying are not only safe and clinically effective, but also promote key ethical values, such as data protection, health equity, and sustainability.”

FCC Insight

The market for digital mental health tools has expanded rapidly in the past three years. Many mental health care tools are not regulated, however, because they don’t meet the definition of a “medical device”. For users, the range of options available, with little in the way of quality control, can make it very hard to know which tools are safe and effective. The framework proposed in this report from the Alan Turing Institute does an important job of addressing concerns about scrutiny and transparency, and we hope it will be widely adopted.