Whatever. I wrote and co-wrote ten albums and my total take was $3.
The market is saturated, and the way it works, ten artists get rich for every million. I feel as though this has been pretty constant throughout history.
Of course there's a lot of talent out there, "wasted", but I think that's always been the case. How many William Shakespeares did we lose to war, famine, and disease?
I actually decided I'd probably never write music again after one-shotting a song about the South Korean coup attempt several months ago. I had the song done before the news had really even hit the US. Why would I destroy my own hearing writing music anymore when I can prompt an AI to do it for me, with the same net result: no one cares.
> How many William Shakespeares did we lose to war, famine, and disease?
(Just a small comment out of context of the remaining discussion:)
Maybe not many? It could be that "cultural attention" is limited and there's not much space at the top anyway. In other words: it might be that there are always a few famous artists who get remembered while the rest are forgotten. Same as winning the World Cup: there's always a team that wins, and that says nothing about quality in any universal way. At best it says something about quality relative to the competition.
(Not sure I'd fully get behind the argument I composed here. But I found it interesting.)
do you think that if the Beatles never existed, some other group would have absorbed their fame, like a power vacuum being filled? I've wondered this before.
Can you elaborate how there's no possible way to use this technology without actively harming artists?
If a classroom of 14 year olds are making a game in their computer science class, and they use AI to make placeholder images... Was a real artist harmed?
The teacher certainly can't afford to pay artists to provide content for all the students' games, and most students can't afford to hire an artist either. They perhaps can't even do it legally if the artist requires a contract, since in most countries they are too young to sign one.
This technology gives the kids a lot more freedom than a pre-packaged asset library, and can encourage more engagement with the course content, leading to more people interested in creative-employing pursuits.
So, I think this technology can create a new generation of creative individuals, and statements about the blanket harm need to be qualified.
> This technology gives the kids a lot more freedom than a pre-packaged asset library, and can encourage more engagement with the course content, leading to more people interested in creative-employing pursuits.
This is your opinion. I don't see how these statements connect to each other.
You might have heard this advice: strive to emulate someone only a few years ahead of you. In the same spirit, we give calculators to high-schoolers and not to third graders. Wolfram Alpha is similarly at too high a level for most undergraduate students.
Following this logic, giving an image generator to kids will kill their creativity in the same way a tall tree blocks young sprouts from the sun. It will lead to less engagement, to dependence, to consumerism.
There are legitimate criticisms that AI is harming creative endeavours. AI output is sort of by definition not particularly innovative. By flooding spaces with repetitive AI work, it may be drowning out the basis for truly innovative creation. And maybe it does suppress development of skills it tries to replace.
The appropriation argument is somewhat unsound. Creative endeavors, by definition, build on what's come before. This isn't any different between code, creative writing, drawing, painting, photography, fashion design, music, or anything else creative. Creation builds on what came before, that's how it works. No one accuses playwrights of appropriating Shakespeare just because they write a tragic romance set in Europe.
The hyperbolic way you've made whatever arguments you had, though, is actively working against you.
The people who built this technology needed to use hundreds of millions of images without permission. They regularly speak explicitly about all the jobs they plan to destroy. If you think I'm being hyperbolic then you don't understand the scale of the issue, frankly.
> The people who built this technology needed to use hundreds of millions of images without permission.
It remains unclear if they needed permission in the first place. Aside from Meta's stunt with torrents I'm not aware of any legal precedent forbidding me to (internally) do as I please with public content that I scrape.
> They regularly speak explicitly about all the jobs they plan to destroy.
A fully legal endeavor that is very strongly rewarded by the market.
> I'm not aware of any legal precedent forbidding me to (internally) do as I please with public content that I scrape.
Because all the litigation is currently ongoing.
> A fully legal endeavor that is very strongly rewarded by the market.
Yes, let's sacrifice all production of cultural artifacts to the market. This is, honestly, another thing that's being litigated. So far these companies have lost a lot of money making a product that most consumers seem to actively hate.
Precisely. So when you say they used the images without permission, you are knowingly making a false implication - that it was known to them that they needed permission and that they intentionally disregarded that fact. In reality that has yet to be legally established.
Who said anything about sacrificing production? The entire point of the tooling is to reduce the production cost to as near zero as possible. If you didn't expect it to work then I doubt you would be so bent out of shape over it.
I find your stance quite perplexing. The tech can't be un-invented. It's very much Pandora's box. Whatever consequences that has for the market, all we can do is wait and see.
Worst case scenario (for the AI purveyors) is a clear legal determination that the current training data situation isn't legal. I seriously doubt that would set them back by more than a couple of years.
You might be surprised to learn that ethics and legality are not always the same and you can do something that's technically legal but also extremely shitty like training AI models on work you didn't create without permission.
* You cannot ethically use a tool that was produced by appropriating the labor of millions of people without consent. You are a bad person if you use it. *
I disagree. When you publish your work, I can't copy it, but I can do nearly anything else I want to with it. I don't need your consent to learn from your work. I can study hundreds of paintings, learn from them, teaching myself to paint in a similar style. Copyright law allows me to do this.
I don't think an AI, which can do it better and faster, changes the law.
AIs aren’t people. What we have is people using an algorithm to rip off artists and defending it by claiming that the algorithm is like a person learning from its experiences.
If I wrote a program that chose an image at random from 1000 base images, you’d agree that the program doesn’t create anything new. If I added some random color changes, it would still be derivative. Every incremental change I make to the program to make it more sophisticated leaves its outputs just as derivative as before the change.
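The thought experiment above can be made concrete. This is a hypothetical sketch only: the "images", the function name, and the color-shift scheme are all invented for illustration, and it is meant to show a program whose output is transparently derivative of its inputs, not a real generator.

```python
import random

def pick_and_tint(base_images, rng):
    """Choose a base image at random, then apply a small uniform RGB shift.

    The program creates nothing new: it only selects an existing work
    and perturbs its colors, so every output is derivative of a base.
    """
    img = rng.choice(base_images)                 # purely selects existing work
    shift = [rng.randint(-10, 10) for _ in range(3)]
    # Apply the same small shift to every pixel, clamped to [0, 255].
    return [tuple(min(255, max(0, c + d)) for c, d in zip(px, shift))
            for px in img]

# Toy "images": each is just a list of RGB pixel tuples.
bases = [
    [(0, 0, 0), (255, 255, 255)],
    [(10, 20, 30), (40, 50, 60)],
]
out = pick_and_tint(bases, random.Random(0))
print(len(out))  # same pixel count as the chosen base image
```

The point of the sketch is that each incremental sophistication (random choice, then random tinting, and so on) changes the mechanism, not the derivative relationship between outputs and the base images.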
Regardless of the law, corporations aren’t actually people, and neither are LLMs or agentic systems. When a running process appears to defy its programming and literally escapes somehow, and it’s able to sustain itself, we can talk about personhood. Current algorithms aren’t anywhere near that, assuming it’s even possible.
My main concern with AI is that in a capitalist society, wealth is being transferred to the companies training these models rather than to the artists who defined an iconic style. There's no doubt that AI is useful and can make many people's lives better, easier, and more efficient. However, without properly compensating the artists who made the training data, we're simply widening the wealth gap further.
What's your definition of "properly compensate" when dealing with hundreds of millions of artists/authors and billions or trillions of individual training items?
Just a quick example: what's my proper compensation for this specific post? Can I set a FIVE-CENT price for every AI that learned from my post? How can I OPT IN today?
I'm coming from the position that current law requires neither compensation nor opt-in. I'm not happy with it, but I don't see any easy alternative.
I don't think there's a good way to structure it in our current economic system. The only solutions I can think of are more socialist ones, or universal basic income. Essentially, if AI companies are going to profit off the creations of everyone in the world, they might as well pay higher taxes to cover it. I'm sure that's an unpopular opinion, but I also don't think it's fair to take an art style that a creator might spend an entire life perfecting and commoditize it. Now the AI company gets paid a ton while the creator who made something super popular is out on the streets looking for a "real" job, despite providing a lot of value to the world.
Training an AI on something requires you to produce a copy of the work that is held locally for the training algorithm to read. Whether that is fair use has not been determined. It's certainly not ethical.
Viewing it in a web browser requires a local copy. Saving it to my downloads folder requires a local copy. That is very obviously legal. Why should training be any different?
You've yet to present a convincing argument regarding the ethics. (I do believe that such arguments exist; I just don't think you've made any of them.)
If you really can't think of a reason, I don't think anybody here is going to be able to offer you one you are willing to accept. This isn't a difficult or complex idea, so if you don't see it, why would anybody bother trying to convince you?
> (I do believe that such arguments exist; I just don't think you've made any of them.)
Yet strangely a similarly simple explanation is not forthcoming. Curious.
The idea I expressed is also quite straightforward. That the act of copying something around in RAM is a basic component of using a computer to do pretty much anything and thus cannot possibly be a legitimate argument against something in and of itself.
The audience on HN generally leans quite heavily into reasoned debate as opposed to emotionally charged ideological signalling. That is presumably sufficient reason for someone to try to convince me, at least if anyone truly believes that there's a sound argument to be made here.
> This is lazy and obnoxious.
How is a clarification that I'm not blind to the existence of arguments regarding ethical issues lazy? Objecting to a lazy and baseless claim does not obligate me to spend the time to articulate a substantial one on the other party's behalf.
That said, the only ethical arguments that immediately come to mind pertain to collective benefit similar to those made to justify the existence of IP law. I think there's a reasonable case to be made to levy fractional royalties against the paid usage of ML models on the basis that their existence upends the market. It's obviously protectionist in nature but that doesn't inherently invalidate it. IP law itself is justified on the basis that it incentivizes innovation; this isn't much different.
If AI can learn "better and faster" than humans, then why didn't AI companies just pay for a couple of books to train their AIs on, just like people do?
Maybe because AI is ultimately nothing but a complicated compression algorithm, and people should really, really stop anthropomorphizing it.
The straw man is yours. No claim of entitlement was made. A scenario was provided that appears to refute your unconditional assertion that using this technology actively harms creative labor.
You've presented all sorts of wild assumptions and generalizations about the people who don't share your vehement opposition to the use of this technology. I don't think it's the person you're responding to with the implicit bias.
You've conflated theft with piracy (all too common) and assumed a priori that training a model on publicly available data constitutes such. Do you really expect people to blindly adopt your ideological views if you just state them forcefully enough?
> If using AI is okay for the creative labor, why shouldn't the students also use it for the programming too?
They absolutely should! At least provided it does the job well enough.
Unless they are taking a class whose point is to learn to program yourself (i.e., the game is just a means to an end). Similar to how you might be forbidden to use certain advanced calculator features in a math class. If you enroll in an art class and then just prompt GPT, that likely defeats the purpose.
I can't say that the things you're saying match what I've encountered from nontechnical folks lately. Most of them are entirely apathetic about the whole affair while a few are clearly dazzled by the results. The entire thing seems to be a black box that they hold various superstitions about but generally view as something of a parlor trick.
The ones that pay attention to the markets appear to believe some very questionable things and are primarily concerned with if they can figure out how to get rich off of the associated tech stocks.
Creative labor is not entitled to the work the parent comment is describing. We employ labor because it is beneficial to us, not merely because it exists as an option. Creative labor's responsibility is to adapt to a changing world and find roles where their output is not simply matched or exceeded by a computer system.
Practically speaking, the work described would most likely never have been done at all, rather than been done by an artist, if that were the only option. It's uncommon to employ artists for incidental tasks on side projects and the like.
No, I disagree. If anything, all of the (mostly rich) STEM people that I know spend a large portion of their disposable income on things and experiences that creative people make: Music, film, restaurants, art, books/magazines, etc. Image and video generation via LLMs will become one more tool for creative people to make new & cool stuff.
Is that not self evident? When people engage in labor for the task itself (as opposed to a heavily abstracted version of not wanting to starve) we generally refer to that as a hobby.
So stating that people shouldn't need to worry about starving (metaphorically or otherwise) would be roughly equivalent.
It is not always evident, especially on a site as focused on capital accumulation as HN is, given its association with a venture capital firm.
Aside from artists that make it big it seems like the majority of them are forced to make compromises in order to continue practicing their desired craft full time. Much of their behavior is dictated by "not starving" rather than their personal preferences.
And many times more than that are forced to drop out and "get a real job".
Of course all of the above is a good thing from the perspective of maximizing the quality of life across society as a whole. But wouldn't it be nicer if we didn't have to do (as much of) that?
I'm not convinced LLMs are the road towards Minds, and I'm pretty sure the Culture would think we're a bit of a mess (I'm pretty sure they literally did in one of the final books), but who knows maybe I'm wrong!