AI is currently at the forefront of technological and business innovation; however, there is also much discussion around whether it will bring tangible benefits to the many sectors adopting it.  

Social care is one such sector where discussions around AI are gaining prominence, as local authorities look at how it could be integrated into the commissioning process.  

But there is some anxiety around the use of automation and artificial intelligence systems, particularly given the sensitivities of care, so we need to know more about how these tools could work inside social care. 

The Current State of AI 

With AI becoming prominent in data, law, finance, and even the health and social care sectors, it is helping businesses optimise for efficiency while maintaining good practice. Businesses are investing more in AI technology, from chatbots such as ChatGPT to data transformation platforms such as Kanerika; however, as with any new technology, there are concerns around how businesses will look and function once it becomes standard. 

Worries that AI will replace jobs, or that it is not yet safe enough, especially given that the UK lacks nationwide regulation of AI use, are two examples of such concerns. Recent news has also highlighted how hackers have used AI to optimise their data theft strategies, to the point that local authorities have been affected.  

Adequate staff training in AI tools can also help ensure good cybersecurity and good practice in line with social care data security legislation. Individual businesses are encouraged to implement their own AI company policies to ensure good practice, even while the UK lacks nationwide legislation. Local authorities can contact iESE CCoE to find out how they can optimise their cybersecurity practices and policies. 

Professional Insight

Brandon Toews, AI engineer at Securanova, spoke at the 2025 Cyber Cheltenham Tech Week event about how AI is much like social media, or even cars: technologies that presented many risks when first integrated into our lives but have since become a normal part of them. He went on to say that, much like cars and social media in the past, it is important for businesses and employers to approach these tools with education and awareness of their application. He continued: “AI ‘replacing workers’ is a misconception of what it does. AI is a tool, not a worker, and it is important that business use of AI reflects its uses.”  

If anything, concerns around employment under AI are really concerns around change: to ensure AI is used in keeping with sector legislation, staff need training in it. AI use in social care may even open up new employment opportunities in managing and assuring its safe use, which is especially important for protecting the vulnerable people the sector supports. 

AI in Social Care 

Digitisation in social care leans towards automating administrative processes to reduce the paperwork burden on workers and free up more time for person-focused care. 

One example is AI scribes, which are being applied in social care settings to support the workforce. These scribes record conversations (such as needs assessments between individuals and social workers or support workers) and then transcribe or summarise them to provide useful information.  

Much of the attention has focused on the use of these tools in general practice settings in health care, but for the social care sector, what matters more is that they are used correctly in home care and residential care settings, where they are changing working processes. Given successes such as reduced administrative workloads, freeing up staff time to spend with the people they care for, there is scope to further improve this technology and expand it into additional fields, such as therapies.  

AI scribing tools have a further practical application within the care sector: cutting down on administration time while capturing and formatting notes in real time. Products already exist in the marketplace that can map a conversation into pre-existing forms, and these are already being used in the wider health and care sector. 

But while a practical application can be demonstrated, limitations remain within AI technology, particularly around consistent accuracy, which can simply shift some administration time from collecting information to double- and triple-checking the AI's output. 

Current Sector Use

Examples of the current use of AI in social care have been reported in a white paper by the Institute for Ethics in AI, including: 

  • Using general-purpose generative AI such as ChatGPT, or social care-specific products, to generate activity plans, care plans, or meeting notes in domiciliary or residential care settings. The paper noted that this use is especially prevalent among care workers who do not have English as their first language, who mostly use it as a support for written tasks.  
  • Using generative AI to check the health symptoms of people who draw on care and support, to get ideas about the conditions they may have.  
  • Using generative AI to help with administrative tasks, including writing emails or letters. 
  • Using generative AI-powered chatbots for mental health support, such as conversing with a chatbot about a particular problem an individual is facing. 

Policy & Regulation Considerations 

Although the benefits AI can bring to the social care sector are clear, there is far more concern around regulation. Professionals working at the forefront of AI consistently state that these tools are useful but require regulation and business policies alongside their implementation. Lydia Wootton, Product Owner at CareCubed, said she sees potential for AI in care commissioning processes in the future, but that AI is still in its early stages. 

“In the future we could see some change that enables AI automated models to safely integrate into commissioning processes. The main concern is security, as we want to make sure that while it could make processes easier it also minimises risk.” 

Much like the rise of the internet, AI brings a great deal of excitement but also risks. Being risk-aware can help mitigate those that arise from new technologies. 

Yet although the UK government has embraced the opportunities AI brings, and many local authorities are integrating AI systems into their work, there is currently no UK AI law. There is therefore a notable lack of official guidance on how interactions between existing social care regulations and AI systems should be safely managed.  

There have been a variety of initiatives addressing the need for good-practice guidance on this topic. Yet to make the most of what AI offers those within social care, it is critical that official guidance on the responsible use of AI in these environments is developed and kept up to date. 

 

CareCubed for Streamlining Processes 

At CareCubed, we know that a streamlined commissioning process is a priority for many local authorities. Simple processes that prioritise value for money and take a needs-focused approach are critical to an effective care system. While AI tools may help with administrative processes, CareCubed is there to provide transparency and aid relationship-building between local authorities and providers.  

Find out more about how adding CareCubed to your commissioning roster will help you to optimise your processes by contacting us today.
