
We can’t give AI everything for free

Big tech is ripping off Britain’s creators. It can’t be allowed a licence to break the law

Using AI cannot become an excuse to break the law. Image: TNW

If a company’s market value could be directly translated into Gross Domestic Product (GDP), major tech firms like Google, Meta, Amazon and Apple would be sitting round the table at the G20 summit, rather than just lobbying the ministers outside. Does this mean we have to treat with them like sovereign states, rather than regulate them under domestic law like any other businesses? The answer is clearly no, but for some reason governments seem increasingly inclined to believe they should, enticed by the prospect of some massive data storage warehouses or distribution sheds coming our way.

The current debate about the impact of AI on copyright laws in this country is a case in point. The UK creative industries are one of our great national success stories. From Shakespeare to Elton John, ideas, stories and music conjured in the imagination of their makers have conquered the world.
According to UK government figures, the creative industries account for over two million jobs and £120 billion in revenue to this country each year. As a point of comparison, this makes them nearly five times larger than the UK motor industry.

Copyright is the legal bond which underpins the value of the creative industries. If artists, musicians and filmmakers are unable to claim ownership of their works, then they cannot make a living from them. UK copyright law is very clear: it prevents people from copying, distributing, performing and adapting the work of others, and it applies automatically.

There have always been disputes over the use of copyrighted materials, and whether one artist has copied the work of another, but the law has been the ultimate guardian of those creative rights. However, in the age of AI, this can be complicated by the harvesting of data related to protected works, without the knowledge or consent of their creators.

The information gained from such activity can be used to create and train new products and services, without any compensation back to the original source. Artists and publishers fear not just loss of revenue, but that training data from their works could be used to create generic copies which can be freely distributed, undermining future investment in the creative industries.

In parliament, this important issue has been brought to a head by the crossbench peer Baroness Beeban Kidron, who has tabled amendments to the Data Bill to ensure real transparency from AI developers about the source of the data they are gathering. She has warned of the dangers of letting AI companies “steal copyrighted work by default unless the owners of that work ‘opt out’. But opting out is impossible to do without AI transparency.”

Unsurprisingly, the ‘opt-out’ model is the option favoured by the tech companies because they know it will be extremely difficult for creators to prove that data from their works has been extracted without their consent. The former Meta executive Nick Clegg recently explained that whilst he thought it was a “matter of natural justice” that artists “should be able to opt out of having their creativity, their products, what they’ve worked on indefinitely modelled”, he nevertheless thought that, “expecting the [AI] industry, technologically or otherwise, to pre-emptively ask before they even start training — I just don’t see. I’m afraid that just collides with the physics of the technology itself.”

However, whilst the great physicist Sir Isaac Newton’s third law states that “for every action, there is an equal and opposite reaction”, the government is struggling for a response, clinging to the hope that a currently unproven technical solution might come to its rescue. The secretary of state for science and technology, Peter Kyle, has called for a “workable solution on transparency” while also blocking Baroness Kidron’s amendments to enshrine that principle in the Data Bill.

At the same time, the government has been running a consultation on a text and data mining exception from copyright law for AI developers, with its favoured option being the ‘opt-out’ model for creators, which is rendered worthless without transparency.

The government, using similar language to a recent report from the Tony Blair Institute on Rebooting Copyright, calls for a balanced approach between the interests of creatives and the AI industry. But at the end of the day, creators either have legal rights over their works or they don’t.
Government ministers tend not to call for a balance to be struck between the rights of tradesmen to own their tools and the business model of people fencing stolen goods at car boot sales. The additional danger here is that governments seem to believe that big tech should be given a free pass to ignore laws on copyright, competition and user safety that other industries have to follow.

We all want to see new technology delivering exciting product innovations, but our guiding principle has to be that simply using AI can’t be an excuse to break the law. The trouble is that without transparency, we just won’t know for sure.

Damian Collins is a former minister for tech and the digital economy
