
‘An existential threat’: anger over UK government plans to allow AI companies to scrape content


Ministers are facing a major backlash over plans that would allow technology companies to use content from publishers and artists, amid claims the government risks “caving in” to the tech giants.

The BBC is among the organizations opposing a plan that would allow tech companies to train artificial intelligence models on online content by default, unless publishers and other content creators specifically opt out.

In what will be one of the first major AI policy battles, a series of meetings and roundtables are being planned to address concerns. Some in Whitehall fear that publishers have not had enough of a voice in the debate so far, but any announcement will now be put on hold until after this week’s Budget.

The government is desperate for investment from tech companies in its drive for economic growth, with ministers already announcing a total investment in UK data centers of more than £25 billion since the election. However, Google warned last month that Britain risks falling behind unless it builds more data centers and lets tech companies use copyrighted work in their AI models.

Beyond ownership issues, some publishers fear an opt-out system would be impractical because they may not know when their material is being scraped – and by which company. Smaller publishers say they face an “existential threat” if their work is used to train AI models. They argue that an ‘opt-in’ system would give them more leverage to at least agree licensing terms, similar to those already signed by bigger players for AI access to their material.

The BBC said in a statement that its content cannot be used without permission to train AI models and that no agreements have been made for companies to do so. “It is critical that publishers and media companies maintain control over how their content is used when it comes to AI,” a spokesperson told the Observer. “The onus should remain on AI developers to ask permission to use content, not on publishers to opt out.”

Justine Roberts, the founder of Mumsnet, who has filed a legal complaint against OpenAI over alleged content scraping, said the system being considered by ministers was “akin to requiring homeowners to post notices on the outside of their homes asking burglars not to rob them, otherwise the contents of their house are fair game”.

She added: “Some in government appear to have drunk the Kool-Aid and bought into the heavily pushed narrative that everything must be sacrificed to ensure the rapid development of AI, when in fact they have to reckon with big tech companies’ rapacious appetite for dominance and dollars, and what gets broken along the way.”

Owen Meredith, chief executive of the News Media Association, said an opt-out system would be “a blow to the creative industries, which have been a growth engine for the UK economy for a decade or more”.

Chris Dicker, chief executive of the Independent Publishers Alliance, said: “The use of anything ever posted online without express permission is a direct threat to privacy. An opt-out approach is not enough. The government must step in and enforce strict safeguards before it is too late, and not give in to the lobbying of big tech.”

However, some in Whitehall argue that an opt-out system is the approach adopted in the EU through the AI Act, adding that Britain could perhaps learn from how that regime is performing.

The row is a sign of the fundamental changes taking place following the advent of AI chatbots. Publishers were previously willing to give search engines access to their material because they received readers and viewers in return. But chatbot users can get all the information they need without ever seeing the original publisher’s work.


Last week, Radiohead’s Thom Yorke, Abba’s Björn Ulvaeus and actor Julianne Moore were among 10,500 signatories to a creative industries statement warning that unlicensed use of their work by AI companies posed a “major, unjust threat” to the livelihood of artists.

A government spokesperson said: “This is an area that requires thoughtful engagement, and as part of that we are committed to listening to a wide range of views to help inform our approach.

“We continue to work closely with a range of stakeholders, including holding recent roundtables with AI developers and creative industry representatives, and will set out our next steps as soon as possible.”


