Warning over use in UK of unregulated AI chatbots to create social care plans

Mar 10, 2024 - theguardian.com
Researchers at the University of Oxford have raised ethical concerns over the use of unregulated AI chatbots in social care. The study found that some care providers have been using generative AI chatbots, such as ChatGPT and Bard, to create care plans, posing potential risks to patient confidentiality and quality of care. Dr Caroline Green, who led the study, warned that personal data entered into these chatbots could be used to train the underlying language model and potentially be revealed to others. She also cautioned that carers could act on faulty or biased information in AI-generated care plans, potentially causing harm.

Despite these concerns, Dr Green acknowledged the potential benefits of AI in easing administrative work and enabling care plans to be revisited more frequently. Other AI technologies already in use in the health and care sector include PainChek, a phone app that uses AI-trained facial recognition to identify pain, and Oxevision, a system used by NHS mental health trusts to monitor patient activity. However, some care managers fear that using AI tools could lead to inadvertent rule-breaking and the loss of their licence. A meeting of social care organisations has been convened to discuss the responsible use of generative AI, with the aim of producing a good practice guide within six months.

Key takeaways:

  • Researchers at the University of Oxford have raised concerns about the use of unregulated AI bots in social care, citing potential risks to patient confidentiality and the possibility of carers acting on faulty or biased information.
  • Despite these concerns, the researchers also acknowledge potential benefits of AI in social care, such as reducing administrative work and allowing for more frequent revisiting of care plans.
  • Several AI-based technologies are already in use in the health and care sectors, including PainChek (identifying pain through facial recognition), Oxevision (monitoring patient activity), and Sentai (reminding patients to take medication).
  • A meeting of 30 social care organisations was convened at Reuben College to discuss responsible use of generative AI in social care, with the intention of creating a good practice guide within six months and working with the Care Quality Commission and the Department for Health and Social Care to develop enforceable guidelines.