OpenAI and Google have criticised the UK government’s “preferred version” of changes to copyright law that would regulate the training of AI models using publicly available internet content.
A public consultation on the government’s proposals, which closed in February, drew around 11,000 submissions from companies and members of the public. OpenAI and Google made their case in written responses to the UK Parliament’s Science, Innovation and Technology Committee, which had put questions to them after both companies declined to give their views on the bill to MPs directly.
Under the government’s proposed amendments, AI companies would be able to train their models on publicly available content for commercial purposes without the permission of copyright holders, unless those holders choose to “reserve their rights” and opt out. The amendments would also impose stricter transparency requirements on AI companies’ activities.
In its submission, OpenAI said that experience in other jurisdictions, including the EU, showed that an opt-out model for rights holders poses “significant implementation challenges,” while transparency obligations could lead developers to deprioritise the UK market. “The UK has a rare opportunity to cement its place as the European AI capital by making choices that avoid political uncertainty, foster innovation, and stimulate economic growth,” the company said.
Google, for its part, said copyright holders can already use existing controls to stop web crawlers from copying their content from the internet, but argued that those who withhold content from AI training should not necessarily be entitled to compensation if that content is found in a model’s training data. “We believe that learning on the open internet should be free,” the company said, adding that “excessive transparency requirements… could hinder the development of AI and impact the UK’s competitiveness in this area.”
A UK government spokesman told Politico that no final decision has been made on the matter.