Nick Clegg says asking artists for use permission would ‘kill’ the AI industry
Maybe the question is how do you sanction other malign actors who intend to steal the data. We know China and others do not give a shit about (especially western) IP rights. Not sure if that really justifies us ignoring IP rights.
Well then maybe the AI industry deserves to die.
This is true almost every time someone says "but without <obviously unethical thing>, these businesses couldn't survive!" Same deal with all the spyware that's part of our daily lives now. If it's not possible for you to make a smart TV without spying on me, then cool, don't make smart TVs.
If your business model crumbles under the weight of ethics, then fuck your business model and fuck you.
There's a big difference between generative image AI and AI for, let's say, the medical industry, DeepMind etc.
And yes, you can ban the first without the other.
Going for AI as a whole makes no sense, and this politician also makes it seem like it's the same.
Saying "AI" is like just saying "the internet" when you want to ban a specific site.
There is a very interesting dynamic occurring, where things that didn’t used to be called AI have been rebranded as such, largely so companies can claim they’re “using AI” to make shareholders happy.
I think it behooves all of us to stop referring to things blanketly as AI and instead name specific technologies and companies as the problem.
If your business model only works when you don't follow any moral or statutory laws... it shouldn't exist!
Unfortunately, capitalism doesn't work like that...
In other news, asking Nick Clegg before emptying out his home would kill the robbery industry.
Nick Clegg says asking artists for use permission would ‘kill’ the AI industry
I fail to see any downside to this.
I bet door-to-door salespeople would make way more money if they could just break into your homes, leave their junk on your table, and steal your credit card, and yet we don't let them do that.
I'm starting to think we need to reframe this a little. Stop referring to "artists". It's not just lone, artistic types that are getting screwed here, it's literally everyone who has content that's been exposed to the Internet. Artists, programmers, scientists, lawyers, individuals, companies... everyone. Stop framing this as "AI companies versus artists" and start talking about it as "AI companies versus intellectual property right holders", because that's what this is. The AI companies are choosing to ignore IP law because it benefits them. If anyone, in any other context, tried to use this as a legal defense they would be laughed out of the courtroom.
Speaking at an event promoting his new book, Clegg said the creative community should have the right to opt out of having their work used to train AI models.
No, it should be the opposite. The creative community should have to opt in. AI can run off the uploaded pieces. Everything else is theft.
But he claimed it wasn’t feasible to ask for consent before ingesting their work first.
What the fuck...?! Send a fucking email. If you don't get an answer, then it's a "No". Learn to take no for an answer.
Perhaps the government should collect money from the AI companies — they could call it something simple, like "taxes" — and distribute the money to anyone who has ever written something that made its way to the internet (since we can reasonably assume that everything posted online has now been sucked into the slop machines).
I think the primary goal of LLMs is to use them on social media to influence public opinion.
Notice that all companies that have social media are heavily invested in it. Also, the recent fiasco with Grok talking about South African apartheid without being asked shows that such functionality is being added.
I think the talk about it replacing white collar jobs is a distraction. Maybe it can replace some, but the "daydreaming" (such a nice word for bullshit) makes the technology not very useful in that direction.
What a fucking shocking idea, right? My mind is blown, and I'm sure Mr. Clegg would be ecstatic when we tell him about it! /s
Greedy dumb mfkers.
But why won't anyone think of the AI shareholders...
Oh no wouldn’t that be a shame. /s
I’m sorry but if your industry requires that you commit a bunch of crimes to make money, it’s not a legitimate industry, it’s a criminal industry. We’ve had these for a long time, and generally they’re frowned upon, because the crimes are usually drugs, guns, murder, sex trafficking, or theft. When the crime is intellectual property theft, apparently we forget to care. Then again, same with wage theft.
"you would basically kill the AI industry in this country overnight.”
Cutting through to the heart of the issue here, economic FOMO. "If we don't steal this data, someone else will".
And adhering to the law would kill my thriving "pay me a dollar and I allow you to club a billionaire to death"-business. So what?
Or maybe the solution is in dissolving the socio-economic class hierarchy, which can only exist as an epistemic paperclip maximizer. Rather than also kneecapping useful technology.
I feel much of the critique and repulsion comes from people without much knowledge of either art/art history, or AI.
Nor even the problems and history of socio-economic policies.
Monkeys just want to be angry and throw poop at the things they don't understand. No conversation, no nuance, and no understanding of how such behaviours roll out the red carpet for continued 'elite' abuses that shape our every aspect of life.
The revulsion is justified, but misdirected. Stop blaming technology for the problems of the system, and start going after the system that is the problem.
IMO tech bros' main goal for this technology is to use it to manipulate public opinion on social media. It is perfect for it and the "daydreaming" (bullshitting) is perfect.
Notice that all the social media companies are involved in it: Twitter was "sold" to xAI, there was the recent incident with Grok about South African apartheid, the proposed 10-year ban on state regulation, etc.
They talk about it increasing productivity (and are hoping it could be used for that too), but if people knew it was meant for disinformation, they would be even more against skipping copyright for it.
From a copyright perspective, you don't need to ask for permission to train an AI. It's no different than taking a bunch of books you bought second-hand and throwing them into a blender. Since you're not distributing anything when you do that you're not violating anyone's copyright.
When the AI produces something though, that's when it can run afoul of copyright. But only if it matches an existing copyrighted work close enough that a judge would say it's a derivative work.
You can't copyright a style (writing, art, etc) but you can violate a copyright if you copy say, a mouse in the style of Mickey Mouse. So then the question—from a legal perspective—becomes: Do we treat AI like a Xerox copier or do we treat it like an artist?
If we treat it like an artist the company that owns the AI will be responsible for copyright infringement whenever someone makes a derivative work by way of a prompt.
If we treat it like a copier the person that wrote the prompt would be responsible (if they then distribute whatever was generated).
no different than taking a bunch of books you bought second-hand and throwing them into a blender.
They didn't buy the books. They took them without permission.
A realistic take on the situation.
I fully agree, despite how much people hate AI, training itself isn't infringement based on how copyright laws are written.
I think we need to treat it as the copier situation, the person who is distributing the copyright infringing material is at fault, not the tool used to create it.
I agree with both of you but it's a bit more nuanced than that: what if someone not familiar with the original IPs asks for a 'space wizard' or an 'Italian plumber cartoon', it outputs Obi Wan or Mario, and they use it in their work? Who's getting sued by Disney or Nintendo?
I could fairly easily ask a human artist to draw me something that would infringe on a copyright for a character they had never even seen before. It would technically be against the law, but given that no other parties know about it, it's unlikely to ever get caught. The legal problems arise if I use that art in a visible fashion such that the copyright holder would find out, and then it would be me getting sued, not the artist.
Oh no, we can't make slop without stealing? The horror!
Aah, cut the crap, Clegg.
Under the current IP and copyright laws across the world, your argument has as much merit as "let robbers steal; asking others permission before snatching their stuff would 'kill' the black market".
And, if the idiotic laws are to be revoked, they should be revoked for everyone. This would allow you to train your bloody models on those artists - but it would also allow everyone to grab things from your "industry" and you should not be able to do anything against them.
Pick a choice. One of those two. Enough of this bloody Bob Dylan defence* - either everyone is a thief, or everyone is a king; your industry is not such special snowflake, and you are not a higher caste above the rest of us dammit.
* Steal a little and they throw you in jail / Steal a lot and they make you king
If your industry depends on theft to survive maybe your industry shouldn't exist? Just a thought
It wouldn't kill it. It would just force them to use public domain and CC content like the rest of us
I can't make my Minnie Mouse with hunky Milhouse crossover sex art without getting a C&D and a psychology recommendation, but sure, these rich assholes can steal the works of others. Somehow their business is more important than the rights of the people that they would definitely sue if you showed more than 3s of copyrighted work that they owned...
I'd urge people watch and read Alex Avila’s essay on how the bourgeoisie are controlling the internet through anti AI campaigning.
It's pretty damning, and it's all about control. Larceny is wrong indeed, but AI corporations are NOT being charged with intellectual property “theft,” which is the crux of this counter capitalist campaign.
If the rich wanted AI regulated, they would have made the politicians they own pass those policies.
Which they have… see the video, it's full of citations and collusion.
I sat through the whole 3 hour video and at no point does he say anything like you described.
It only talks about how AI is bad, ranging from the loss of labor in a system where we must perform labor for survival, to how ofloading our creativity onto a machine kills the very spirit of humanity.
What a waste of time.
I'm blocking you now.
Sounds okay to me. Fuck the AI industry.
Asking banks for permission would kill the robbery industry.
So it's only piracy when poor people do it?
Poor people aren't "running an industry"
Napster was…but it wasn’t owned by rich people and didn’t benefit rich people so it was targeted.
Deleted by author
Kill the artist industry. Oh yeah, rich cunts in England think art isn't a real career
So what's the downside?
It wouldn't actually kill AI, it would just kill any open source solution and create a monopoly for the handful of companies that either have all the data or can afford to pay for it.
There's like 5 publishing houses, 3 record companies and a couple of websites that "own" almost all training material.
Lol lmao
That's only if they want to try to profit off of it.
Fair use still allows you to train your own AI.
And the tech would be better if nobody profited off of AI
A copyleft license would be the best scenario, but I don't think that is what we are getting, sadly. That being said, you need the big foundational models. The average individual cannot train their own AI from scratch.
You're saying that like it's a bad thing?
Then it deserves to die.
Isn't it true that historically, when universities trained these things, they used their own data, not "randos on the internet" or "anything that wasn't nailed down" (randomly scraped copyrighted content)?
GenAI could be ethical, it just, isn't, because corporations are assholes.
Don't tempt me with a good time
Then it should die.
AI has brought nothing of value to the table. Just another grift.
If anything, it's delegitimized the actually useful ML projects in medicine and data processing.
If it kills the AI industry, that's fine. If they can't survive in our current system with IP laws, they shouldn't survive.
Nick Clegg can go fuck himself!
The solution presents itself!
scribbles notes Oh good!
Dunno who this guy is, but he's either a techbro or doesn't know what he's talking about. It would kill LLMs and the art bots that work by stealing data, but that's not how all AI is trained. Make actual AI and not theft-bots and you won't have that problem.
k
Promise?
Ok
You mean how laws stop the theft industry, and the murder industry... Like they should
That is how I feel about banks and money
Deleted by author
REALLY????
Took a minute to figure out that this wasn't https://en.m.wikipedia.org/wiki/Nick_Clegg 🙀
Edit: Okay, turns out it is Nick "the useless" Clegg. I thought it was some namesake, since the photo in the article looked different to what I remember... Wow, what a bastard...
...but it is that Nick Clegg
What what?! 😦 Looks like i lived under a rock 🤯
It is, unfortunately.
Oh wow 😯😳 okay i lived under a rock, did not see that happening 😞