By Martin Coulter
LONDON (Reuters) – Britain’s new Labour government said it would look at how to effectively regulate artificial intelligence models, but stopped short of proposing any specific legislation.
King Charles opened the new Parliament on Wednesday by announcing new Prime Minister Keir Starmer’s legislative agenda, which includes more than 35 new bills covering everything from housing to cybersecurity.
The government said it would seek to enact appropriate legislation to impose requirements on those working to develop the “most powerful artificial intelligence models”.
The country’s former prime minister, Rishi Sunak, hosted a summit of world leaders and business executives at Bletchley Park in November to discuss the issue, as he sought to position the UK as a global leader in AI safety.
He also oversaw the launch of the world’s first AI Safety Institute, which focuses on the capabilities of “cutting-edge” AI models like those behind OpenAI’s highly successful ChatGPT chatbot.
“There will be a collective sigh of relief from AI labs that the government will not rush into regulating frontier models,” said Nathan Benaich, founding partner at AI-focused investment group Air Street Capital.
Under Sunak, the government avoided introducing targeted AI regulation, opting instead to spread responsibility for scrutinizing the technology across a range of regulators.
Starmer has promised to introduce new laws on AI, but his government has so far been cautious about how it would implement them.
“The UK’s measured and sectoral approach to AI regulation is a key competitive advantage vis-à-vis the EU and any move to change this regime should be approached with the utmost caution,” Benaich said.
But some AI experts say the rapid adoption of AI tools over the past 18 months makes the need for new laws even more urgent.
Gaia Marcus, director of the Ada Lovelace Institute, said the government should introduce the bill as soon as possible.
“These systems are already embedded in our daily lives, public services and economies, bringing benefits and opportunities but also posing a range of risks to people and societies,” she said.
(Reporting by Martin Coulter; Editing by Jane Merriman)