• Angry_Autist (he/him)@lemmy.world
    2 days ago

    Not really. LLMs can’t keep a plot straight for more than a few paragraphs, and they constantly context-switch.

    Add on top of that that we’re currently at about 90% of what’s possible with LLM technology, meaning the brightest minds in AI have realized there is no singularity curve shooting off to infinity, just a bottleneck in cross-indexing tokens: every token has to be related to every other token in the context, so the number of connections grows quadratically with context length.

    Every added token means more overhead, and while our hardware grows linearly, the cost of extending context grows superlinearly. That’s an unsustainable curve capped by the total processing power available globally, and we’d need to roughly triple that before the next generation of LLMs could handle enough context to make a real difference in quality.
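    To make the scaling claim concrete (an illustrative sketch, not any particular model’s numbers): full self-attention compares every token in the context with every other token, so the comparison count grows with the square of the context length.

    ```python
    # Illustrative sketch: in full self-attention, each of n tokens
    # attends to all n tokens, giving n * n pairwise comparisons.
    def attention_pairs(n_tokens: int) -> int:
        """Pairwise token comparisons in full self-attention."""
        return n_tokens * n_tokens

    for n in (1_000, 10_000, 100_000):
        print(f"{n:>7} tokens -> {attention_pairs(n):,} comparisons")
    # 100x more context -> 10,000x more comparisons
    ```

    So a 100x longer context costs 10,000x the comparisons, which is the bottleneck being described.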

    And on top of all that, I’ve got 30 years left to live at most, and I’m pretty confident we won’t see writers lose out to LLMs in that time.