Slowly, carefully, the care sector is adopting AI

According to anecdotal evidence from the NHSX AI Lab, social care lags behind healthcare when it comes to the development and adoption of artificial intelligence (AI) technology.

There are several reasons for this, says Ross Hodgson, CEO of healthcare recruiter Unity Plus. “One of the primary reasons is the complexity and variability of social care services,” he suggests. “Social care involves a wide range of services and support for individuals with diverse needs, including those related to disability, mental health, ageing and vulnerable populations. This diversity poses challenges for developing AI solutions that can effectively address the unique and complex needs of social care service users.”

There are also concerns that a reliance on AI will come at the expense of human contact, he adds, while scarcer resources and funding across the sector also have an impact. The recent decision by the Department of Health and Social Care to cut investment in the NHS AI Lab from a promised £250m to just £139m is unlikely to help.

Remote monitoring and admin

There are cases where AI is already helping the social care sector, both in the delivery of care and in wider efficiencies. Kyndi, a care business wholly owned by Medway Council, uses Lilli, an AI-based remote monitoring technology that can detect changes in elderly patients, whether in residential care settings or private homes. The system monitors activity such as movement and trips to the toilet, as well as unusual behaviour, such as leaving the house in the middle of the night.

“We can build up a picture within two or three days as to what typical movements are,” says Rob Kennedy, head of sales and business development at Kyndi. “We then monitor it on a 24/7 basis to see if there are any spikes or changes, and if there are, we use our control centre to contact the individual who’s got the system in place. If we can’t do that, then we’ll escalate it to next-of-kin, family, care workers or domiciliary care visitors. If needed, we’ll go down the emergency services route.” Kyndi estimates the system saves the council around £30,000 a week, largely through not having to move people into residential or dementia care settings.
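
The pattern Kennedy describes, learning a short baseline of typical activity and then working down a contact chain when something deviates, can be sketched in a few lines of Python. What follows is a hypothetical illustration only, not Lilli’s actual algorithm: the event names, the three-sigma spike threshold and the escalation order are all assumptions made for the sake of the example.

    # Hypothetical sketch only: not Lilli's actual algorithm. Event names,
    # the spike threshold and the escalation order are all assumptions.
    from statistics import mean, stdev

    BASELINE_DAYS = 3  # "two or three days" to learn typical movements
    ESCALATION_CHAIN = ["individual", "next of kin", "care worker", "emergency services"]

    def build_baseline(daily_counts):
        """Mean and spread of daily event counts (e.g. toilet trips) per event type."""
        samples = {}
        for day in daily_counts[:BASELINE_DAYS]:
            for event, count in day.items():
                samples.setdefault(event, []).append(count)
        return {e: (mean(c), stdev(c) if len(c) > 1 else 1.0) for e, c in samples.items()}

    def is_spike(baseline, event, count, sigmas=3.0):
        """Flag a daily count that deviates sharply from the learned norm."""
        avg, sd = baseline.get(event, (0.0, 1.0))
        return abs(count - avg) > sigmas * max(sd, 0.5)

    def escalate(event, count):
        """Work down the contact chain until someone responds (stubbed here)."""
        for contact in ESCALATION_CHAIN:
            print(f"Alert for '{event}' (count {count}): contacting {contact}")
            if contact == "care worker":  # stand-in for a real answered call
                return contact
        return None

    history = [
        {"toilet trips": 6, "night door openings": 0},
        {"toilet trips": 7, "night door openings": 0},
        {"toilet trips": 5, "night door openings": 1},
    ]
    baseline = build_baseline(history)
    today = {"toilet trips": 6, "night door openings": 5}
    for event, count in today.items():
        if is_spike(baseline, event, count):
            escalate(event, count)

Run as written, the sketch learns that night door openings are normally near zero, flags a count of five as a spike, and steps through the contact chain until the stubbed "care worker" answers.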

Social workers at Barnsley Council, meanwhile, are making use of Microsoft’s 365 Copilot to write up minutes from meetings or summarise case files. “It can also be used to identify risk factors and prioritise interventions,” says Carly Speechley, executive director of children’s services at Barnsley Council. “Barnsley Social Care has also used M365 Copilot to assist with the creation of training materials.” The council hopes this will enable care workers to spend more time with patients and reduce the amount it spends on agency staff.

"The system saves the council around £30,000 a week, largely through not having to move people into residential or dementia care settings"


Medical support and dementia

The development of generative AI, such as Copilot or ChatGPT, has also created the potential for more interactive uses. Dr Caroline Green, research fellow at the Institute for Ethics in AI, University of Oxford, gives the example of care workers using such tools to explore options around medical support, which could then be discussed with GPs or pharmacists. Another is the potential for people with dementia to have conversations with an AI-generated telephone system, reducing their sense of social isolation.

But there are also concerns. “With an AI tool that records someone’s home visit, our work has told us that carers wouldn’t feel comfortable to interact with people in the way they usually do,” says Green. “They feel a surveillance tool would inhibit them in building meaningful relationships.” The Institute for Ethics in AI is currently working with several organisations from across the social care space to develop a framework for the responsible use of generative AI in adult social care, and hopes to progress this into test scenarios in the near future.

Care plans

One organisation involved in this project is the National Care Association, which has concerns that AI could be used to create care plans. “That is one possible application where we would need some concerted safeguards to make sure that we’re achieving personalisation and not just getting a standard care plan that doesn’t accurately reflect an individual’s circumstances,” says Ian Turner, the association’s co-chair.

"They feel a surveillance tool would inhibit them in building meaningful relationships”

“The other example is safeguarding, where you could potentially put a situation to a generative AI system, and it comes out with suggestions that could be harmful. We need to make sure that we’ve always got a check and balance from a skilled person.”

The balance between technology and humans is always tricky, but perhaps more so in social care than in other sectors. “Many day-to-day situations within social care centre on human morals, ethics and other human decisions on what outcomes and judgments are in the best interest of another person,” points out Hodgson.

“It may be more balanced to say that social care is waiting to fully embrace AI, once AI and its abilities have been fully understood by the sector and questions around the ethical nature and legalities are concluded.”


Published on
June 25, 2024