(Toronto) Kids Help Phone is turning to artificial intelligence (AI) to help meet a “huge need” as more young people seek mental health help and support.
“Young people are changing fast and technology is changing faster,” said Michael Cole, Senior Vice President of Technology, Innovation, Data and Chief Information Officer at Kids Help Phone.
The helpline has partnered with the Vector Institute in Toronto, which markets itself as a consultant helping organizations, businesses and governments develop and adopt “responsible” AI programs.
The AI is expected to recognize key words and speech patterns from young people contacting Kids Help Phone, helping busy counselors focus on what callers need and tailor their support accordingly.
But the organization says it is well aware that the term “artificial intelligence” might alarm people who imagine a computer or chatbot, rather than a human, on the phone.
This is not how its AI program will work, assured Katherine Hay, president and CEO of the organization. “It’s always human to human,” she said. “It doesn’t replace a human-to-human approach.”
Instead, the information gathered by the AI will be available to human counselors as they work with the young person on the other end of the call or text exchange, she said.
The national 24/7 helpline for children and young adults has seen a huge surge in demand for its services since the start of the COVID-19 pandemic. After receiving around 1.9 million calls, texts, live chats and website visits in 2019, Kids Help Phone has seen that number climb to more than 15 million since 2020, according to figures provided by the organization.
The organization already uses artificial intelligence technology to help sort texts, Hay said.
For example, if someone uses trigger words or phrases such as “I’m feeling hopeless, I think I want to die,” the system will put that conversation at the top of the queue to speak with a counselor, Ms. Hay explained.
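The triage described above can be sketched as a priority queue keyed on whether a trigger phrase appears. This is a minimal illustration, not Kids Help Phone’s actual system; the trigger phrases and two-level priority scheme here are assumptions for the example.

```python
import heapq

# Hypothetical trigger phrases; the organization's real list is not public.
TRIGGER_PHRASES = ["i'm feeling hopeless", "i want to die", "hurt myself"]

def priority(message: str) -> int:
    """Return 0 (urgent) if a trigger phrase appears, else 1 (standard)."""
    text = message.lower()
    return 0 if any(phrase in text for phrase in TRIGGER_PHRASES) else 1

# Min-heap ordered by (priority, arrival order), so flagged messages
# jump ahead of earlier, non-urgent ones.
queue = []
messages = [
    "Can I get help with a problem at school?",
    "I'm feeling hopeless, I think I want to die",
]
for order, msg in enumerate(messages):
    heapq.heappush(queue, (priority(msg), order, msg))

_, _, first = heapq.heappop(queue)
print(first)  # the flagged message reaches a counselor first
```

Even though the urgent message arrived second, it is popped first because its priority value is lower.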
Roxana Sultan, Vector’s chief data officer and vice-president of its health division, said treating AI as a tool, not a substitute for humans, is an essential part of using the technology responsibly in health care.
“We have been very clear with all of our partners that the tools we develop are always intended to be a support for clinicians. They are never meant to replace clinician judgment, clinician engagement,” she said.
Kids Help Phone’s artificial intelligence tool will use “natural language processing” to identify “keywords or trigger words that correlate with specific problem types,” she said.
“If a young person uses a specific word in their communication that relates to a specific issue or concern, it will be flagged by this model, and the model will alert professional staff.”
For example, AI can be trained to recognize words that suggest a possible eating disorder, allowing a counselor to steer the conversation in that direction and offer specific resources and supports.
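The flagging idea in the passage above can be illustrated with a simple keyword lookup that maps phrases to issue types. The phrase lists and issue labels below are illustrative assumptions; the actual tool reportedly uses trained natural language processing models rather than plain string matching.

```python
# Hypothetical mapping from issue types to indicative phrases.
ISSUE_KEYWORDS = {
    "eating disorder": ["haven't eaten", "skipping meals", "afraid to eat"],
    "climate anxiety": ["climate change", "wildfires"],
}

def flag_issues(message: str) -> list[str]:
    """Return the issue types whose phrases appear in the message."""
    text = message.lower()
    return [
        issue
        for issue, phrases in ISSUE_KEYWORDS.items()
        if any(phrase in text for phrase in phrases)
    ]

flags = flag_issues("I've been skipping meals because I'm afraid to eat.")
print(flags)  # ['eating disorder']
```

A counselor seeing such a flag could then steer the conversation toward that topic and offer targeted resources, as the article describes.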
AI can also be trained to identify new words and trends related to situations that cause distress and anxiety, such as a pandemic, climate change, wildfires or a mass shooting.
“It’s really meant to augment the services that professional staff provide,” Sultan said. “It helps them to be more efficient and effective in how they then deal with issues that arise during the conversation.”
The key, Sultan said, is to ensure that AI tools are thoroughly tested by clinicians before launching them. Kids Help Phone and Vector plan to launch the new technology in 2024.
Another concern people may have about AI is the privacy of their personal information, she said.
“It’s really important to be clear that all data used to train the models is anonymized.”