• SuddenDownpour@sh.itjust.works
    11 months ago

    A developer tells this anecdote from a PS2-era project. They showed a fairly late build of the game to their publisher, just a few weeks before it had to be ready to begin the distribution process, with the FPS counter displayed in a corner of the screen. It never quite dropped below 30 FPS, but it fluctuated significantly. The people from the publishing company said: “Everything about the game looks fine except the FPS. 30 FPS is unacceptable, and we cannot publish it if you can’t reach a consistent 60 FPS.”

    You don’t need to know much about development to understand that making such a demand weeks before the launch of a medium-sized project is asking the impossible, and the dev team knew it too. In the end, they changed the function that computed the real FPS so that it subtracted only a fraction of the difference between 60 and the real FPS. The next time the publisher looked at the game, the counter always showed a value between 58 and 60, even though they never had the time to actually optimize the game. The publisher didn’t notice the deception, and the game was a commercial success.
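    The anecdote doesn’t include the actual code (which would have been C/C++ on a PS2), but the trick it describes can be sketched in a few lines; the function name and the divisor of 15 are illustrative assumptions chosen so that a real 30 FPS maps to a displayed 58:

```python
def fake_fps(real_fps: float) -> float:
    """Report 60 minus a fraction of the shortfall from 60, so the
    on-screen counter reads 58-60 whenever the real rate is 30+.
    (Hypothetical sketch of the trick described in the anecdote.)"""
    shortfall = 60.0 - real_fps       # how far below the 60 FPS target we are
    return 60.0 - shortfall / 15.0    # shrink the shortfall: real 30 -> shown 58

# fake_fps(30.0) -> 58.0, fake_fps(45.0) -> 59.0, fake_fps(60.0) -> 60.0
```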

  • Underwaterbob@lemm.ee
    11 months ago

    Meh. 60 is enough for me. I didn’t notice 144 being that much better than 60.
    30 can fuck right off though.

    • SgtAStrawberry@lemmy.world
      11 months ago

      I can go down to 30, probably a bit lower, as long as it is consistent — that is the most important part.

      It can also have a bit to do with me powering through Watch Dogs at 1 frame per second in some parts. You never notice how good 25-30 is until your frame rate starts camping in the single digits.

  • Vespair@lemm.ee
    11 months ago

    All you FPS kids are just doing the new version of “eww that game has 2d graphics; polygons or bust!!” from the PlayStation era.

    Yes, progress is cool and good, but no it’s not the end-all be-all and no not every game has to have bleeding edge FPS to be good.

    Like, we’ve literally already done this shit, guys; can’t we just learn from the past?

  • gmtom@lemmy.world
    11 months ago

    I get it’s a meme, but I usually play at 144 fps, and when I go back to 60 fps I literally don’t notice a difference. Even down to like 40-45 I barely see much difference. 30 is noticeable and a bit shit, but my eyes get used to it after like 30 minutes, so it’s not a big deal.

    • yggdar@lemmy.world
      11 months ago

      Our eyes and brains don’t perceive still images or movement in the same way as a computer. There is no simple analogy between our perception and computer graphics.

      I’ve read that some things can be perceived at 1000 fps. IIRC, it was a single white frame shown for 1 ms between black frames. Of course, most things you won’t be able to perceive at that speed, but perception certainly isn’t as simple as a hard 30 fps cutoff!

    • TheSlad@sh.itjust.works
      11 months ago

      Also, most monitors only go up to 60 Hz, and even if you have a fancy monitor that goes higher, your OS probably doesn’t bother to go above 60 anyway. Even if the game itself says the fps is higher, it just doesn’t know that your PC/monitor isn’t actually bothering to render all the frames…

      • fiah@discuss.tchncs.de
        11 months ago

        My man, just because you’ve never seen the refresh rate option in the monitor settings doesn’t mean it hasn’t been there since basically forever.

      • pivot_root@lemmy.world
        11 months ago

        This is blatantly false.

        Windows will run at whatever refresh rate the EDID reports the display as being capable of. It won’t do it by default, but it’s a simple change in the Settings app.

        Macs support higher than 60 Hz displays these days, with some of the laptops even having a built-in one. They call it by some stupid marketing name, but it’s a 120 Hz display.

        Linux may require more tinkering with modelines, and things are complicated by whether you’re running X or Wayland, but it’s supported as well.
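        On an X11 setup, checking and changing the refresh rate usually doesn’t need manual modelines at all; a minimal sketch with xrandr (the output name `HDMI-1` is an example — substitute whatever `xrandr --query` reports on your machine):

```shell
# List connected outputs and the modes/refresh rates each one supports.
xrandr --query

# Request 1920x1080 at 144 Hz on an example output; only works if that
# mode/rate combination appeared in the query output above.
xrandr --output HDMI-1 --mode 1920x1080 --rate 144
```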

  • Fosheze@lemmy.world
    11 months ago

    People always go on and on about frame rate, but I’d take 4K at 60 fps over 1080p at 144 fps any day. I never really noticed a difference above 60 fps, but the resolution makes a massive difference.