Is it not possible that how something is made also elicits emotions and thoughts?
Sure, but I don’t think it should be the line between garbage and good. It can add value and elevate the overall piece, but that isn’t what the person is implying.
There is probably some really fine paper napkin art out there, and having it on a paper napkin most likely adds to it overall, but that’s different from saying all paper napkin pieces have more value than all generated images.
Some of us value authenticity. Plagiarism-powered hallucination engines have exactly none of that. The disturbed individual (or individuals) that painted the bathroom of my primary school with feces created something more artful than any AI slop could ever be.
Imagine arguing that flavor is what is important in a dish and not the type of knife used to cut the vegetables, and having someone respond that he’d rather drink piss.
It’s more like arguing that a soulless robot should make your food using stolen recipes: not only are the recipes stolen, but the robot cannot taste or understand flavor. All it understands is the words of the recipes, and sometimes not even that; it then needs to make new recipes without being able to taste them. Your food will taste as bland and soulless as the robot that cannot taste it, and even if it does taste good, you’ll know it’s basically just a worse version built on stolen recipes.
I mean I eat food made by a robot basically every day and it’s pretty good.
I hate to be the one to break it to you, but a huge amount of the food that is eaten in the world is made by “robots”. It ain’t the Keebler Elves in those factories baking your vanilla sandwich cookies, that’s for sure.
Go watch any video on mass produced food and you’ll see that it is made by machines. Drinks are mixed, bottled and packed without any human intervention. You would have a hard time trying to find a dish that you eat that was not prepared in some part by soulless, tasteless machines.
Those robots were still configured by humans to produce a product the humans designed. The automatically produced food is still human food.
AI is also still configured by humans, since they are the ones choosing which training data is used. So automatically generated art is still human art.
The training data needed is so enormous that it is not cherry-picked by humans. Furthermore, transforming random data until it fits the given description closely enough, according to a “neural network” that was statistically curve-fitted to the training data, is not in any way human.
what if the knife were made out of the skulls of infants
Then that is a fucked up knife, but doesn’t change anything about the dish.
The argument is: the dish requires the use of the fucked-up knife.
If AI art only used ethically-sourced data, there’d be a lot less objection to it.
I can say, for sure, that this isn’t true.
People still catch the exact same flak for using generative fill in Photoshop, despite Adobe training their models on artwork with the explicit permission (and compensation) of the artists involved in making the training data.
People treat every model like it has personally eaten every drawing and macaroni painting that they’ve ever done.
Why does AI art have “no authenticity”?
Because if you use words that only have subjective definitions, then you can arbitrarily move your definition around if people come up with counter-examples.
It’s a way of creating an argument that means nothing and also can’t be argued against on Internet forums where there are no rules (unlike, say, a debate stage or courtroom where you have to rationally prove your points).