AI Act Hits the Real World – Creative Minds Hold Their Breath
As the new AI Act makes its grand entrance this weekend, the creative community is holding its collective breath, hoping the fresh legislation will finally rein in the freewheeling ways AI companies have been training on their copyrighted content.
What’s the Big Deal?
Picture this: an AI system churns through millions of songs, movies, and paintings to learn a few tricks and then, boom, it spits out a brand-new track or a painting that looks eerily similar to a famed artist's work. The creative community is up in arms.
The Roadblocks Still Standing
- Copyright & Consent Jumbles: The law isn’t crystal clear yet on what counts as “fair use” when it comes to training data.
- Informed Consent Snafu: Companies aren’t always sure if the content they use has the green light from rights holders.
- Licensing Labyrinth: Securing solid licenses for all that content is a maze that’s not exactly beginner-friendly.
- Enforcement Woes: Even if the rules are set, keeping tabs on every AI model worldwide is a tall order.
Why It Matters to Creators
When AI copies a slice of a copyrighted work, the original creator gets less recognition and potentially less revenue. That's not just a legal issue; it's a fairness issue. Everyone wants their creativity to be respected and rewarded.
Bottom Line: A Call for Clear Rules
While the AI Act is a good step forward, the creative circle still craves bullet‑proof protections. They’re pushing for laws that give them a voice, a say, and a fair paycheck when their art is used in the training of AI.

Artists Raising the Alarm: The AI Act Still Leaves a Few Boxes Unchecked
What the AI Act is Trying to Do
With the European Artificial Intelligence Act finally kicking off, lawmakers aimed for a sweeping safety net that could stand against a world of AI that’s growing faster than you can say “algorithm.”
Why the Creative Community Is Still Frowning
- Loopholes everywhere – musicians, writers, filmmakers, and visual artists feel they’re walking through a minefield.
- No opt‑out or pay‑out guarantees – the act doesn’t make it easy for creators to say “No thanks” or to get a paycheck when their work gets fed into AI training models.
- Opaque use of art – AI models that learn from music, books, and movies can do so without clear transparency or consent. That’s a recipe for creative infringement.
Voices from the Front Lines
Marc du Moulin, the Secretary‑General of the European Composer and Songwriter Alliance (ECSA), sums it up: “The work of our members should not be used without transparency, consent, and remuneration. We see that the implementation of the AI Act does not give us that.”
What Happens If the Act Doesn’t Get Fixed?
Without a solid framework, artists risk having their creative output turned into training data for generative AI without ever seeing a royalty check. The art world might become a playground for AI, but at what cost to the creators who built the playground in the first place?
In Plain English
Simply put, the current AI Act is a promising start, but it still misses a few targets. Artists demand clearer rules that protect their art, let them opt out when they want to, and ensure they’re paid when their work is used to train robots. Until those gaps are closed, the creative community will keep pushing for a more concrete solution.
‘Putting the cart before the horse’
EU’s AI Act: A New Playbook for Safer, Fairer, and More “Vanilla” Tech
The European Commission drafted the AI Act to keep artificial intelligence safe, transparent, traceable, non-discriminatory, and environmentally friendly. Think of it as the EU's way of putting reins on the runaway pace of tech growth.
Risk Levels: From "Perfectly Fine" to "Wait, That's a Bad Idea!"
- Minimal risk – Most chatbots and image generators fall here. But even in the low-risk lane, you still need to publish a summary of the copyrighted data used to train your AI.
- Limited risk – A step up, needing more stringent safeguards.
- High risk – Technologies that could influence people’s decisions or safety (think election bots).
- Unacceptable risk – Already banned. Examples: manipulative AIs or those that do social scoring—ranking people by their behavior or economic status.
Why the "Minimal" Category Isn't a Free Pass
Even if your AI is deemed minimal risk, you're not completely off the hook. The EU still wants you to maintain a "copyright policy": a promise that you'll respect creators' rights and offer a channel for complaints. Training on public-domain material is fine; for copyrighted work, creators must be able to opt out.
The Copyright Conundrum
Under EU law, companies can harvest text and data for AI training unless a creator has "reserved their rights". But how can an artist actually say, "I'm not giving my art to the AI crowd"? Du Moulin says the process is murky:
“This whole conversation is putting the cart before the horse. You don’t know how to opt out, but your work is already being used.”
So if you’re an artist concerned about your works ending up in a GPT‑trained dataset, you might feel a bit “legally blindsided.”
Voluntary Code of Practice: A Gentle Nudge Toward Self-Governance
The EU introduced the AI Code of Practice for General-Purpose AI (GPAI), a voluntary but fairly demanding agreement. Its key points:
- Commit to a well‑drafted copyright policy.
- Install safeguards to avoid rights violations.
- Set up a dedicated area to receive and process complaints.
Who’s Signing Up?
So far, the signatories include tech giants like Amazon, Google, Microsoft, and even OpenAI. It turns out that even the biggest players find it cheaper to cooperate than to fight the regulator every time a new AI comes online.
How the AI Act’s New Transparency Rules Leave Artists Feeling Left Behind
Under the AI Act, there's now a "who-scraped-my-work" level of transparency that promises to give artists a better idea of where their art ends up. But for artists actually hoping for a slice of royalties, the reality falls short.
What the Act Brings to the Table
"With these new transparency requirements, artists can finally see whose turn it was to shout 'mine!'," du Moulin says. He concedes, however, that the Act is a future-only safeguard. In plain English: whatever was scraped before the law took effect is still free for anyone to feed into an AI model.
Artists’ Struggle to Get Paid
- Adriana Moscono, general manager at GESAC, tried to secure licenses by sending letters and emails to the big names in AI. The outcome? A polite shrug, or worse, complete silence.
- "There was no answer," Moscono told Euronews Next. "The big guns not only shut the door on copyright recognition but also ignored our pleas for fair compensation."
In short, many artists feel they have handed AI companies a free lunch and been left with the bill.
Response from the European Commission
- Thomas Regnier, spokesperson for the Commission, weighed in. He noted that AI firms must respect rights holders when they pull data for training. If an infringement happened, it can be settled privately.
- Regnier added, “The AI Act does not override existing EU copyright laws.”
Why the Rulebook Isn’t Enough
Despite the transparency push, artists remain at the mercy of AI companies that claim the licensing market is still at "ground zero". By avoiding licence negotiations, those companies leave creators facing a grim reality: many AI models have a history of using art without asking.
Looking Ahead
As the AI Act rolls out, artists remain hopeful that the future can bring retrievable rights and actual payments. But for now, the transparency tools might be a fancy badge rather than a tangible benefit.
Mandate licence negotiations, groups ask
When Robots Take the Stage: A Copyright Showdown
Picture this: music-law heavyweights du Moulin and Moscono knocking on the Commission's door, demanding a clearer play-by-play on how artists can opt out of AI training and get the copyright protection they deserve.
“We’re Not Getting a Ticket to the AI‑Act Fair Anyway!”
"The code of practice, the template and the guidelines, they don't give us even a map," Moscono told reporters, sounding as if the AI dance floor is walled off with no exit in sight. "They don't guarantee a proper application of the AI Act," she added.
Collective Licenses: A New Strategy?
- Option A: The Commission could obligate AI firms to negotiate blanket licenses with a coalition of artists.
- Option B: They might require “collective” agreements—imagine a concert where every guitarist signs on at once.
GEMA Takes On OpenAI and Suno AI
While the music-rights giant GEMA's two lawsuits against OpenAI (the company behind ChatGPT) and Suno AI (the app that conjures tunes out of thin air) technically fall outside the AI Act, du Moulin insists the verdicts will set a precedent for how hard copyright law can push back on AI companies.
The EU's Compliance Countdown
New AI companies face a ticking clock: they must comply with the AI Act's rules by 2026. Providers already active in the EU get an extra year, until 2027.
But What About the Text & Data Mining Clause?
The Commission and the EU's highest court, the European Court of Justice, have already hinted that they will take a fresh look at the 2019 copyright law's text and data mining exemption. That review could either loosen or tighten the rules for AI companies built on data mining.
Take-away: artists, the Commission, and AI companies are locked in a tug-of-war, and the courts may well decide who wins.