• seikoshadow@kbin.social

    Honestly, I recently purchased a new RTX 4070 Ti and it’s absolutely dominated everything I’ve chucked at it. But it was very expensive for what it was, and I can completely understand why people have issues with the latest generations.

    • dwindling7373@feddit.it

      What do you even mean by “dominated everything”? It saved you significant time in your workflow? It made games run extrasupermegagood?

      1080p 60fps is plenty, prove me wrong.

      • setInner234@feddit.de

        Hmmm, your tone is a bit edgy, but perhaps it was unintentional. The difference between 120fps and 60fps is pretty huge to me. I once had a 4k monitor (on 1440p now), and played on my other 1080p one instead, just for more FPS. Isn’t it a question of preference? Some people prioritise image quality over FPS, some do the opposite. Either is fine, no need to ‘prove anyone wrong’…

        • dwindling7373@feddit.it

          Everything is a matter of preference. I just think the industry manufactured a desire for diminishing improvements (fps, resolution) to drive sales, while whether a game is enjoyed mostly comes down to it being a good game.

          Sure, you can tell 120 fps from 60. Will you notice it while playing? Unlikely.

          • setInner234@feddit.de

            You are completely right that the games industry is a joke, and it’s a tale as old as time that hardware manufacturers love unoptimised shit, so that they can sell more expensive crap.

            As for 60fps, or should I call it saturating a 60hz display: I’ve noticed that some games are fine at 60hz while others feel terrible until the higher 90s, and around that level I’m usually fine.

            I used to play a lot of Quake 3 back in the day, and going from 60 to 120 was like two different worlds entirely.

            I think some people pick up on it and others don’t. I used to work in an office where all the monitors were connected via 4k over HDMI 1 and therefore they were all 30hz. Out of a team of 50, only one graphic designer complained about the laggy monitors and everyone else was moving their mice around saying they couldn’t tell.

            To me it was torture. I don’t know where the truth of the matter lies. I think console manufacturers long tried to convince everyone that the eye can’t perceive over 30hz, which is insane.

            Maybe now that’s shifted to trying to make everyone believe they need 240hz, but obviously you’re getting diminishing returns at that level. I’ve never seen more than 144 myself, and even in my own testing I find anything above 100 imperceptible, so I know where my personal limit lies.
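
            (Just to put back-of-the-envelope numbers on the diminishing returns: at 60fps each frame lasts ~16.7ms, at 120fps ~8.3ms, and at 240fps ~4.2ms, so the first doubling shaves roughly 8ms off every frame while the second only buys another ~4ms.)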

            For others, maybe younger people, those limits could be higher, who knows. You often hear of people saying they can see monitors and LEDs flicker. I rarely can.

            Back in the 90s I used to play games at 20fps at 640x480, so perspectives can also shift rather dramatically lol

            Lastly, I can only reaffirm that I’d much rather have well-optimised, well-designed games with beautiful art direction than the latest SSAO implementation. Beautiful games from 10 years ago are still beautiful games, whereas path tracing can’t fix your hot pile of AAA garbage…