I’m not really interested in game development, but I needed another side hustle, and I saw a video on YouTube that told me how to publish AI generated games onto Steam with minimal human input. I can’t wait until I get my first Ferrari, so I won’t be a pathetic man in the eyes of Andrew Tate and most women.
Not necessarily, it’s just a tool like anything else. It just depends on how much it’s been used and how effectively. I haven’t been on Steam in a long time to see this label pop up in the wild, but I suspect it’ll need more nuance to be effective.
“I used AI to write the whole story and do all the voice acting” is gonna suck.
“I used AI to help with the scripting because I suck at coding” might be the greatest game ever, just from an individual who doesn’t have those skills themselves.
There isn’t anything you can use AI for that a person wouldn’t do better.
It’s just going to be worse than a person putting in effort every time.
Seems like a very personal take ngl
“I used AI to write the whole story because I enjoy programming more” doesn’t sound great either no?
It’s not a “buggy game vs bad story” comparison either. They both sound bad because they both mean “developers do NOT want to make games.”
If they’re constantly looking to cut corners, the outcome will still be worse in comparison.
Also, even if the most responsible person just “used it as a tool,” they would still have to learn all that scripting to fix bugs or make a sequel, and because it’s someone else’s code they would struggle more.
You forgot the:
It’s worrying how you think the writing and voice acting are more important than the code in a video game.
I’m not sure I’ll be able to explain myself clearly enough for you to understand.
I understand what you mean. I’m saying you’re misunderstanding how AI messing up the code is much more important than it messing up a plotline or giving a character six fingers. AI currently isn’t good enough to write flawless code, and you can’t just use AI to code a game without any prior coding experience; you’d have to vet every process (there’s a quick sketch of the kind of bug I mean below). There’s no chance in hell you’ll make the best game ever, as your characters will be going through walls, your objects will be floating, or countless other glitches will occur, let alone the negative effects bad code can have on the hardware that’s trying to run it.
Also, all the gameplay mechanics will be more generic and bland than a modern Ubisoft game.
LLMs can’t exactly help code “unique” ideas you came up with.
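To make the “vet every process” point concrete, here’s a tiny hypothetical sketch (the names and numbers are made up, it’s not from any real game): the movement check below only tests where the player ends up, never the path travelled, so one fast frame carries the character straight through a wall.

```python
# Hypothetical illustration only: a plausible-looking movement script with a
# classic tunnelling bug. Nothing here is from a real project.

def overlaps(a_min, a_max, b_min, b_max):
    # Standard 1-D interval overlap test (this part is fine).
    return a_min < b_max and b_min < a_max

def step(player_x, velocity, wall_min, wall_max, dt=1.0):
    new_x = player_x + velocity * dt
    # Bug: only the final position is checked. With a big enough velocity the
    # player lands on the far side of the wall before the check ever sees it,
    # so the wall is simply skipped ("characters going through walls").
    if overlaps(new_x, new_x + 1.0, wall_min, wall_max):
        return player_x  # collision: stay put this frame
    return new_x

print(step(player_x=0.0, velocity=10.0, wall_min=3.0, wall_max=4.0))  # prints 10.0
```

The script runs and looks fine at walking speed; it only falls apart once something moves fast, which is exactly the kind of thing someone with no coding background won’t catch in five minutes of playtesting.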
I mean yeah if no parts of a game work then it will suck. Good point.
There’s a difference between what’s more important and what’s more feasible for AI to do well.
That’s not the point. In the commenter’s example, they mentioned a person who doesn’t know how to code, and no matter how you cut it, AI right now wouldn’t be able to code well enough to ensure no bugs occur. You would still need to check it all yourself, lest some massive issue occur, not just in the game but with the hardware that’s trying to run it. That’s way more important to deal with than a storyline being off.
AI-generated content is what ought to be disclosed, and even then it’s not necessarily a bad thing, though I can see how it might often be. But AI in general encompasses a broad range of tools, which is bound to get broader and more ubiquitous with time.
Which is also why the term “AI” is fucking worthless and should get marketers fucking hanged. Seriously, if they had to actually explain what it fucking did, this would not be nearly as much of an annoying problem as it is.
So I’m gonna execute the code of someone who doesn’t know the first thing about coding on my computer? Great!
I’d rather have AI art and human code.
I don’t get why you have to go to such extremes here.
AI is an extremely broad spectrum of tools. Some of them, yes, use stolen graphics to generate derivative graphics. Some of them attempt writing code.
But others let you create things that would normally require hundreds of thousands of dollars while still retaining the necessary creative input from the author.
If you are against such tools as the one used in the linked video, I think you should also stand very much against Photoshop allowing people to paint without using actual pigments and oil.
Weak comparisons help no one; Photoshop is nothing like LLMs.
All of the big commercial LLMs (without exception, AFAIK) have been trained on a large corpus of data that has been obtained by various sketchy and illegitimate means (some legitimate as well).
That’s the major difference between the two.
If you are using a model that has only been trained on legally obtained data, disregard this point.
I’m not even against competent tool use of LLMs, but please use better arguments.
Then people need to specify that they’re against generative LLMs, like chatbots or slop generators, not “all AI”.
There was just a thread on Twitter where a company showcased an amazing tool for animators: you prepare, for example, your walking/sitting/standing animations, but then, instead of motion-capturing or manually setting the scene up, you just define two keyframes, the starting and the ending position of the character… and their AI picks the appropriate animations, blends between them, and animates the character walking from one position to the other.
It’s a phenomenal tool for creatives, but because the term “AI” appeared, the company got shat on by random people.
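Roughly the shape of the idea, for anyone who can’t picture it (this is a toy sketch I’m making up, not the actual tool, and all the names are invented): you give it a start and an end keyframe, it picks a clip from the library you prepared, and it blends the character’s root position between the two poses.

```python
# Toy sketch of the workflow described above. Invented names; not the real tool.
from dataclasses import dataclass

@dataclass
class Keyframe:
    position: tuple[float, float]  # root position (x, y)
    pose: str                      # e.g. "standing", "sitting"

# Animations the artist prepared by hand; the tool only chooses and blends them.
CLIP_LIBRARY = {
    ("standing", "standing"): "walk_cycle",
    ("standing", "sitting"):  "walk_then_sit",
    ("sitting",  "standing"): "stand_up_then_walk",
}

def pick_clip(start: Keyframe, end: Keyframe) -> str:
    # The "AI picks the appropriate animations" step, reduced here to a lookup table.
    return CLIP_LIBRARY.get((start.pose, end.pose), "walk_cycle")

def animate(start: Keyframe, end: Keyframe, steps: int = 5) -> None:
    clip = pick_clip(start, end)
    for i in range(steps + 1):
        t = i / steps  # blend factor from 0.0 (start keyframe) to 1.0 (end keyframe)
        x = start.position[0] + t * (end.position[0] - start.position[0])
        y = start.position[1] + t * (end.position[1] - start.position[1])
        print(f"t={t:.1f}  clip={clip}  root=({x:.1f}, {y:.1f})")

animate(Keyframe((0.0, 0.0), "standing"), Keyframe((4.0, 2.0), "sitting"))
```

The point being: every asset in the library is still the animator’s work, and the “AI” part is just choosing and interpolating between them, which is a very different thing from generating animations from scraped data.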
All of the big commercial LLMs (without exception, AFAIK) have been trained on a large corpus of data that has been obtained by various sketchy and illegitimate means
No. All generative graphical slop AIs and generic chatbot LLMs have been trained on a large corpus of data that has been obtained by various sketchy and illegitimate means.
THAT’S the major difference.
And yet, the guy I was responding to wrote:
So I’m gonna execute the code of someone who doesn’t know the first thing about coding on my computer? Great!
I’d rather have AI art and human code.
So he basically says something that directly contradicts what you’re saying: he prefers the generative slop machines to tools that actually help developers or artists.
Art cannot destroy my system. Bad code which might ask for elevated access can…