REVEALED: How the 'Make It Fair' Campaign is MISLEADING Britain About AI
Creative industry fat cats are at it AGAIN, peddling DOOM and GLOOM about the latest technology that threatens their cosy business models. This time they're targeting artificial intelligence with a campaign built on HALF-TRUTHS and SCAREMONGERING.
The so-called 'Make It Fair' campaign, launched just days ago by music executives and publishing bigwigs, claims the government's proposed copyright changes represent an 'EXISTENTIAL THREAT' to Britain's creative industries. But is this just another case of wealthy industry dinosaurs REFUSING to adapt?
THE SAME OLD STORY
We've seen this playbook before. When Napster emerged, record labels screamed bloody murder rather than embracing digital downloads. When YouTube appeared, they predicted the END OF DAYS for professional content. Each time, the creative industries have been WRONG, eventually adapting and thriving despite their apocalyptic predictions.
Now they're at it again with artificial intelligence, claiming the sky is falling because AI might be allowed to learn from books and music that already exist – just as HUMAN artists have always done!
MISLEADING CLAIMS EXPOSED
The campaign's central claim – that the government "wants to change UK laws to favour big tech platforms so they can use British creative content without permission or payment" – is DELIBERATELY MISLEADING.
What the government has ACTUALLY proposed is a balanced approach with an opt-out system, giving creators control while preventing Britain from falling behind in the AI revolution. But you won't hear THAT from the campaign's slick propaganda!
The campaign's dramatic silent "album", titled "Is This What We Want?", with track titles spelling out "The British Government Must Not Legalise Music Theft To Benefit AI Companies", is nothing short of EMOTIONAL MANIPULATION. The government isn't "legalising theft" – that's pure hyperbole designed to whip up fear.
THE INCONVENIENT TRUTH
Here's what the 'Make It Fair' campaign WON'T tell you: UK copyright law only applies IN THE UK. These proposed changes won't stop AI companies in America, China or anywhere else from training their models on whatever content they can access.
All the campaign will achieve is HANDICAPPING British AI companies while foreign competitors race ahead. Is that what we want for Global Britain?
Industry bosses like Dr. Jo Twist of the BPI claim the proposed changes would "completely undermine this growth opportunity" – but where's the EVIDENCE? It's just more baseless fear-mongering from an industry that has cried wolf countless times before.
THE REAL AGENDA
Let's be clear about what's REALLY happening here. This isn't about protecting struggling artists – it's about preserving outdated business models that benefit industry executives and shareholders.
Countries with more flexible copyright laws like America, Singapore and South Korea have seen BOOMING growth in innovation without destroying their creative industries. Meanwhile, Britain risks being left behind because of special interest groups clinging to the past.
The 'Make It Fair' campaign isn't about fairness at all – it's about FEAR of change, FEAR of competition, and FEAR of a future where gatekeepers lose their power.
As Britain stands at the crossroads of the AI revolution, we must ask ourselves: do we want to lead the world in innovation, or be held back by the same people who once thought the internet was just a passing fad?
COMMENTS

Comms Manager / Science Communicator / Musician and Composer / Speaker / Scottish AI Alliance (5 months ago):
This feels like a good time to revisit the Berne and Universal Copyright Conventions to make them applicable to the modern age. I don't believe that an opt-out model will work in practice, nor do I see why artists signing away their work should be the norm rather than the exception. I often see people argue that regulation should be drafted to encourage innovation, but what does that actually mean? What is innovation in this context? Also, the tools you mention in another comment – various plugins and pre-recorded loops – are not quite the same as what is being discussed by the Make It Fair campaign. Samples must be licensed and paid for – even when you use samples from places like Splice, the creators get paid. Same with plugins like Melodyne, powerful drum machines and so on – they're paid tools. The creators are compensated.
AI + creativity specialist with 30 years in content, marketing & emerging tech. Gen AI veteran. I help businesses scale with AI-first workflows & lean production, turning ideas into work that cuts through. (5 months ago):
I think I agree. Certainly the prior precedents, such as synthesisers and sampling in music, or Photoshop and desktop publishing in the graphic arts, were met with a similar moral panic – and they all turned out fine. But there is a part of me that worries: is it the same this time? What if it isn't, and all the economic benefits accrue to a tiny minority of tech billionaires? What then?
Delivering award-winning Conversational AI | Data & AI at Accenture (5 months ago):
How would an opt-out system work in practice? And how would "measures of transparency" be implemented for the training data used during model development, without relying on self-policing?