Ethical AI for a Brighter Future in Mental Health from Kooth Global Leadership

At Kooth, we’ve spent the last two decades building digital mental health services with, and for, young people. Our aim is to provide early and responsive mental health support and to shift healthcare systems toward a prevention-first approach, reducing the growing demand and unsustainable costs of the mental health crisis.

The insights we’ve gained from supporting over 1 million youth have shown us that there is no one-size-fits-all approach to providing person-centric support. 

In response, we’ve pioneered and proven new digital therapeutic models around content as therapy, peer-supported communities, and single-session therapy, enabling us to meet youth where they are and provide support that matches their wants and needs. This model allows us to stand alongside youth as an ever-present toolkit and digital service supporting good mental health.

As the 2024 Ipsos Health Service Global Research shows, mental health has become the biggest health issue across 31 countries. Access to treatment and shortages of mental health professionals are seen as major systemic challenges to addressing this global crisis, with Gen Z the most impacted and the most concerned.

As we look to a future where AI becomes embedded and ubiquitous in all digital services, the key question is: “What role will AI play in delivering and scaling mental health care services globally?”

We believe the answer lies in a model where the future is ‘human-supported and AI-assisted’. 

The Future Is Human-Supported, AI-Assisted

One of the most frequent questions young people ask our practitioners in chat sessions is, “Are you a real person?”

The humanistic benefits of having someone to open up to and of human-to-human connection can’t be overstated. This is something that psychotherapists know and measure as the ‘therapeutic alliance’ between a service user and a practitioner.

This alliance helps create a foundation of trust and a joint sense of mission, purpose, and hope in working together. 

There’s an alternative to the growing shortage of licensed clinicians hiding in plain sight: an abundance of individuals with lived experience, alongside professionals who bring expertise in supporting youth from careers in teaching, social work, and other adjacent fields.

With the right support and training to become mental wellbeing coaches, paired with oversight and supervision from experienced licensed clinicians, we can build a workforce that represents the diversity of the population we seek to serve and scale up services that deliver humanistic support.

We’ve developed and applied this model at scale. In California, for example, it has enabled us to help build a workforce providing mental health coaching support that can scale up to deliver a population-wide service.

Within this human-supported model, we can apply AI to unlock benefits to grow access, ensure productivity and quality, and create a more responsive healthcare system. 

For Service Users, a Personalized Experience 

We have the opportunity to apply AI to enhance and deliver a personalized experience for every individual that is responsive to their unique and changing wants and needs. 

For example, offering coping strategies, tools and resources, or topics for conversation with practitioners, based on a service user’s mental health profile and stated needs. Or, gently nudging users toward healthier behaviors, self-care routines, or timely interactions to ensure they stay on track with their mental health goals.

For Practitioners, Tools to Deliver Safe, High-Quality Care at Scale

For mental health practitioners, we see AI as a technology that supports the delivery of high-quality care with high productivity. By augmenting practitioners’ workflows, AI frees professionals to focus on what they do best: providing care for service users.

For example, AI could identify service users at risk by detecting signals in user-generated content such as daily journals, enabling practitioners to intervene and provide the right support in the moment.

Or, it could help practitioners prepare for a session with an individual by evaluating the results from clinical measures, together with the individual’s previous engagement with the service, to identify key areas to focus on in the session.

It could even enable quality at scale by assisting in the audit and continuous improvement of practitioner practice. By applying AI to session transcripts, we can ensure that we deliver good practice at scale. Today, we use a proprietary model called iRESPOND to ensure that our practitioners are providing mental health support that is:

  1. Integrative: one size does not fit all; support needs to account for a range of theoretical frameworks to cater to the varied populations it serves.

  2. Responsive: support needs to be provided in a timely way regardless of entry or touch point for youth.

  3. Evidence-informed: support needs to take into account the evidence base for a range of presenting issues regardless of threshold levels to maximize effectiveness.

  4. Safe: safety must be central to all support provided. This means that moderation, clinical audit, and ongoing training must be embedded within the service. In addition to safety planning tools within the platform, escalation pathways need to be clear and robust.

  5. Person-centric: support needs to be matched to youth wants and needs (as opposed to a deficit-based set of diagnostic criteria) to ensure engagement: a bottom-up approach with youth voice central to service development.

  6. Outcomes-driven: support needs to take into account intended outcomes, with fit-for-purpose measures that align with a strengths-based and person-centered approach. For example, SWAN-OM was developed by Kooth to measure the impact of single-session interventions.

  7. Non-judgemental: access to support needs to be equitable for all eligible youth regardless of cultural background, gender, belief systems, and other differences, and this needs to be reflected within our online community as well as through 1:1 interactions.

  8. Data-informed: all support needs to be intelligently informed by available data from a variety of sources; this includes external data regarding emerging issues as well as internal data from the platform regarding how youth are engaging with us and ‘what works’.

For Healthcare, Predictive and Responsive Systems 

Healthcare systems, payers, and providers face immense pressure to deliver effective care at scale, manage costs, and be responsive to the needs of the populations they support. Data- and AI-driven systems that predict future demand are well established in supply-chain and eCommerce environments, yet healthcare operates at a far slower clock speed in responding to the shifting needs of the population. A recent example is the exponential growth in problems such as eating disorders, to which healthcare systems have been unable to respond rapidly.

Our ambition is to help create more responsive and data-informed healthcare systems, serving as the ‘canary in the coal mine’ that predicts changing demand and alerts healthcare systems to it.

A data-informed approach can help upstream public health services and downstream treatment services work more closely together on strategic planning and rapid response.

By supporting all stakeholders — service users, practitioners, and healthcare systems — we can not only deliver accessible, high-quality mental healthcare for all, but also shift from reactive, episodic mental health care to proactive, personalized support, building healthier, happier populations and supporting economic prosperity.

Join us in exploring how data and AI can transform mental health care for a brighter, healthier future.
