Do PC gamers feel 12GB of VRAM is simply not enough for the money in 2024?

  • Evilcoleslaw@lemmy.world · 5 months ago

    They’re beating AMD at ray tracing, upsampling (DLSS vs FSR), VR, and especially streaming (NVENC). For the latter, look at the newly announced beta partnership with Twitch and OBS, which will bring higher-quality transcoding and easier setup only for Nvidia for now, and soon AV1 encoding only for Nvidia (at first, anyway).

    The raw performance is mostly there for AMD with the exception of RT, and FSR has gotten better. But Nvidia is doing Nvidia shit and using the software ecosystem to entrench themselves despite the insane pricing.

    • mihies@kbin.social · 5 months ago

      And they beat AMD in efficiency! I’m (not) surprised that people ignore this important aspect, which matters for noise, heat, and power usage.

      • MonkderZweite@feddit.ch · 5 months ago

        Tom’s Hardware did a test, and the RX 6800 is the leader there. The next card, the RTX 3070, is 4.3% worse. Are their newer cards more efficient than AMD’s newer cards?
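
        (For context, “efficiency” in rankings like that usually means frames per second per watt of board power. Below is a minimal Python sketch of how such a comparison works; the card names and numbers are hypothetical placeholders, not Tom’s Hardware’s measurements.)

            # Performance-per-watt comparison, the metric behind efficiency
            # rankings like the one mentioned above. Card names and numbers
            # are hypothetical placeholders, not measured data.
            def fps_per_watt(avg_fps: float, avg_power_w: float) -> float:
                return avg_fps / avg_power_w

            cards = {
                "Card A": fps_per_watt(100.0, 220.0),  # placeholder values
                "Card B": fps_per_watt(96.0, 230.0),   # placeholder values
            }

            best = max(cards.values())
            for name, eff in sorted(cards.items(), key=lambda kv: kv[1], reverse=True):
                gap = (1 - eff / best) * 100  # "% worse" than the most efficient card
                print(f"{name}: {eff:.3f} FPS/W ({gap:.1f}% behind the leader)")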

        • pycorax@lemmy.world · 5 months ago

          They seem to be, but honestly, this generation hasn’t been very impressive for either team green or team red. I got a 6950 XT last year, and seeing all these new releases has only proven that I made a good investment.

          • Daveyborn@lemmy.world · 5 months ago

            Nothing compelling enough for me to hop off of a Titan Xp yet. (Bought the Titan because, thanks to scalpers, it was cheaper than a 1070 at the time.)

        • Crashumbc@lemmy.world · 5 months ago

          30 series, maybe.

          40 series power usage? Nvidia destroys AMD.

          The 4070 uses WAY less than a 3070… It’s 200 W (220 W for the Super), which is barely more than my 1070’s 170 W.

    • umbrella@lemmy.ml · 5 months ago

      Streaming performance is really good on AMD cards, IME. Upscaling is honestly close and getting closer.

      I don’t think better RT performance is worth the big premium or the annoyances Nvidia cards bring. Doubly so on Linux.

      • Evilcoleslaw@lemmy.world · 5 months ago

        True enough. I was thinking more of the gaming use case. But even beyond AI, for general compute workloads they’re beating the pants off AMD with CUDA as well.