
Half of British adults would trust AI for legal contracts
A recent survey by The Legal Director has found that half of British adults would trust artificial intelligence to handle their legal matters, and a majority would use it for interpreting contracts or terms and conditions.
The research reveals both an openness to using AI for professional services and a gap in awareness of the potential risks. According to the survey, 50% of respondents said they would rely on AI for legal decision-making, and 56% would trust it to read or interpret contracts or terms and conditions.
Interestingly, respondents were more willing to take legal advice from AI than from their friends, and more willing to trust it with legal matters than with their health. Thirty-two per cent would consult AI over their friends for legal support, and 46% would turn to AI for health advice.
Attitudes towards AI for legal matters also differ by demographic: 55% of men said they were ready to use AI for legal guidance, compared with 47% of women, who showed more caution.
Age proved to be a significant factor in willingness to embrace AI for legal purposes. Generation Z respondents were the most receptive to using AI, whereas hesitancy increased among older age groups. Among those aged 75 and above, 61% said they would not trust AI with legal advice.
Kiley Tan, a lawyer at The Legal Director, urged caution about the suitability of AI for legal matters. He stated, "Legal services can be expensive, and there's no doubt that AI seems like a clever workaround. But when it comes to law, large language models are not fit for purpose. They're not trained on verified legal content, and they don't understand legal context. The result might appear convincing, but it could be wildly inaccurate. And in law, close isn't good enough. Most contracts aren't public documents, so AI lacks access to the depth of content needed to draft them properly. Even the best systems struggle to get it right."
The survey also explored public trust in AI with tasks that are more personal or carry greater consequences, and found that trust declined noticeably as the stakes rose. Sixty-five per cent of respondents said they would not trust AI to perform surgery on them or a loved one, and planning a wedding was met with similar scepticism, with 50% unwilling to let the technology handle such an event.
Even day-to-day responsibilities, often marketed as well suited to AI, were met with caution: nearly half of respondents said they would not delegate tasks such as paying bills or grocery shopping to an AI system.
The 18 to 29-year-old cohort displayed the most openness to AI across different scenarios, although a significant degree of reservation remains. In this group, 43% would not trust AI for legal advice, and 39% would not rely on AI to interpret contracts. Openness to AI declined in older age brackets, with over half of those aged 65 and above expressing scepticism about trusting AI with such responsibilities.
While some individuals are enthusiastic about AI, the survey shows this group is relatively small. Only 15% of respondents said they would trust AI to carry out all the tasks listed in the survey. The vast majority prefer human involvement, particularly for responsibilities that require managing risk, exercising judgment, or understanding emotion.
Sarah Clark, Chief Revenue Officer at The Legal Director, commented, "AI is brilliant for tasks like scheduling, sorting data or speeding up admin, but when it comes to complex areas like legal advice, it's crucial not to put too much trust in technology alone. You still need human knowledge and skill to navigate the nuances. It's not just about getting an answer – it's about understanding the context, the consequences, and the details. That's where the human touch really matters."