• laenurd@lemmy.lemist.de · 9 months ago

    “Retard” who bought Nvidia here.

    I know it’s 4chan banter and I generally agree with anon’s points, but here goes:

    • ROCm wasn’t a thing when I bought. You need(ed) Nvidia for machine learning and other GPGPU stuff
    • I have yet to hear from anyone with an 8GB card who maxes out that memory on current-gen games at 1080p
    • apart from frame generation, you DO get DLSS 3 features on 3000 series cards
    • PrivateNoob@sopuli.xyz · 9 months ago

      “Based” who bought AMD here.

      ROCm is still in its infancy. ROCm literally isn’t supported on my 6700 XT, so I had to go back to Google Colab to work on my AI thesis project.

      • averyminya@beehaw.org · 9 months ago

        And this is the issue. You pretty much have to buy a GPU for your specific needs with what’s currently available.

        Talking shit about people who bought a graphics card 3 years ago, almost to the day, is pretty senseless. Compare a 3080 to a 5700 XT: one had support then, and the other is only just getting workarounds for certain use cases.

        OTOH, if that same NVIDIA buyer was a Linux user 3 years ago, they benefited then, but the AMD card is likely better off today: AMD has shipped lots of Linux driver updates since, while NVIDIA has probably pushed 3 changes, lol.

        If you want to do AI easily, you’re just going to want to get NVIDIA. It sucks, but that’s just the case unless you want to jump through hoops for what can otherwise be a simple installation. I helped my friend get Stable Diffusion running on his 7900 XTX, but compared to my installation it was a huge PITA, and it still ran slower in it/s than my 3080.

        Granted, we were both using Windows, and neither of us is a Linux aficionado.

        And if you want gaming and pretty much anything else, AMD is a fine equivalent, as long as the player is okay with less efficient ray tracing and no DLSS. I was interested in those features, so I got the 3080 a few years ago. My friend wasn’t as interested and went for the 7900 XT.

        I can understand disliking aspects of a company, but I don’t understand the intensity from anyone going all in on one or the other. AMD is doing great things, and it’s a bonus that it’s pretty ethical as far as we know. NVIDIA is also doing great things pushing the software side of hardware; they’ve just had a stint of bad leadership and made some dumb decisions. NVIDIA today doesn’t look the same as it did 10 years ago, and 10 years from now it will likely look different again.

      • gaiussabinus@lemmy.world · 9 months ago

        ROCm support makes me angry, but Nvidia fumbled their drivers too. There is no good option, so pick your poison. I run ROCm right now with a workaround on my 6900 XT to get the card detected, and I have also gone from 10 it/s to 4 or even 2 with updates. Shit sucks. (A sketch of the usual detection workaround follows.)
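
        For anyone curious, the workaround people usually cite for RDNA2 detection issues is the HSA_OVERRIDE_GFX_VERSION environment variable, which makes the ROCm runtime treat the card as a supported gfx1030 part. This is an assumption on my part, since the comment doesn’t name the exact workaround used. It has to be set before the runtime loads, so most people export it in the shell; a minimal in-process sketch looks like this:

        ```cpp
        #include <cstdlib>
        #include <cstdio>
        #include <hip/hip_runtime.h>

        int main() {
            // Commonly cited workaround (an assumption, not necessarily the
            // one used above): report the card as gfx1030 so the ROCm
            // runtime accepts it. Equivalent to running
            // `export HSA_OVERRIDE_GFX_VERSION=10.3.0` in the shell.
            // Must be set before the first HIP/HSA call in the process.
            setenv("HSA_OVERRIDE_GFX_VERSION", "10.3.0", 1);

            int count = 0;
            hipGetDeviceCount(&count);  // the card should now be detected
            printf("HIP devices visible: %d\n", count);
            return 0;
        }
        ```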

    • AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world · 9 months ago (edited)

      ROCM wasn’t a thing when I bought. You need(ed) NVidia for machine learning and other GPGPU stuff

      Same for me. I had to buy an Alienware laptop with an NVIDIA GPU during my PhD for some GPGPU coding, as CUDA was pretty much the only choice back then and OpenCL was a joke in terms of performance and wasn’t getting much love from GPU manufacturers. But right now, I know for sure I won’t ever buy an NVIDIA GPU again. ROCm works wonderfully well even on an APU (in my case, a Radeon 680M integrated GPU), and it’s also future-proof, since you’re almost writing CUDA code: if you ever switch to an NVIDIA GPU again, you mostly just have to replace “hip” with “cuda” in your code, plus some magic constants (warp length in particular). A sketch of what that looks like is below.
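
      A minimal sketch of that porting story (a generic vector add of my own, just for illustration; it isn’t from any specific project): every hip* call below has a same-shaped cuda* counterpart, so moving between vendors is mostly a rename.

      ```cpp
      #include <cstdio>
      #include <hip/hip_runtime.h>  // on NVIDIA: <cuda_runtime.h>

      // Kernel syntax is identical in HIP and CUDA.
      __global__ void vecAdd(const float* a, const float* b, float* c, int n) {
          int i = blockIdx.x * blockDim.x + threadIdx.x;
          if (i < n) c[i] = a[i] + b[i];
      }

      int main() {
          const int n = 1 << 20;
          const size_t bytes = n * sizeof(float);

          float *a, *b, *c;
          hipMallocManaged(&a, bytes);  // -> cudaMallocManaged
          hipMallocManaged(&b, bytes);
          hipMallocManaged(&c, bytes);
          for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

          // The "magic constants" caveat: wavefronts are 64 lanes on most
          // AMD GPUs while NVIDIA warps are 32, so query warpSize rather
          // than hard-coding it in lane-level code.
          const int threads = 256;
          const int blocks = (n + threads - 1) / threads;
          vecAdd<<<blocks, threads>>>(a, b, c, n);  // same launch syntax
          hipDeviceSynchronize();                   // -> cudaDeviceSynchronize

          printf("c[0] = %.1f (expect 3.0)\n", c[0]);
          hipFree(a); hipFree(b); hipFree(c);       // -> cudaFree
          return 0;
      }
      ```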

    • RealFknNito@lemmy.world · 9 months ago

      Don’t put it in quotes. You know what you did.

      8GB of VRAM is fine for right now. Obviously nobody is complaining at 1080p, but not a lot of people are buying top-of-the-line GPUs for standard 1080p. I run 1440p, and I know a good chunk of people who are hitting 240 FPS limits and are now upgrading resolutions. 8GB of VRAM is like 16GB of RAM: yes, most modern titles will work flawlessly, but that isn’t going to be the case in the very near future, especially if you plan on upgrading resolution.

      DLSS isn’t something to scoff at, I’ll give green credit there, but relying on it to make up for the shortcomings seems like cope. They made a stupid call in order to slash prices and you’re going to feel it later even with DLSS.

      If you need Nvidia for niche applications to do work and make money, get that bread king, but if you’re just a gaming enthusiast there’s no reason to subject yourself to a company that fucking hates you.

  • CheeseNoodle@lemmy.world · 9 months ago

    “Not get dlss3.0 features on your 30 series”
    But DLSS 3.0 features do work on the 30 series, and the 20 series. The only thing locked out is frame generation.

  • boletus@sh.itjust.works · 9 months ago

    Buy whatever card you need for your use case. Both are fine.

    For me, as a gamer, I think DLSS is good shit, and nothing really beats it rn. I also like using RTX in single-player games; I only expect 60-90 fps from games anyway.

    I’m a game developer, and I benefit from using an Nvidia card: greater access to current standard APIs and graphics features, hardware acceleration for light baking, the option to use tensor cores to learn how to write for them, and generally better compatibility with dev tools.

    Nvidia cards also tend to keep their value more, at least down under.

  • RealFknNito@lemmy.world · 9 months ago

    I get the people who buy Nvidia because of the niche use case of AI or they have to use specific software for their work. I also get the people who are unable or unwilling to learn enough about computers to give a shit who makes their GPU. Those people get a pass. For now.

    However, anyone who willingly chooses to support the PhysX Gsync Hairworks Teraflop proprietary clusterfuck palooza does not. It makes me physically ill any time I see them flop their limp dick on the table and exclaim they paid for their dogshit tools to be in yet another game, so they can claim their overpriced bullshit is worth it because it technically works 6% better on that game you want. You don’t give a shit why it does, but it does. The walled garden quickly becomes a prison.

    So yeah I buy AMD, viva la revolution, fuck anticonsumer bullshit.

    • Polar@lemmy.ca · 9 months ago

      I want the best bang for my buck. I don’t want old-ass shitty FSR or AMD’s ray tracing that keeps falling behind. Also, RTX Audio is crazy. And Nvidia Shadowplay, which has been around forever, is so handy.

      When AMD catches up, I’ll switch. I’m not brand loyal. I just want to play games and use the latest technology.

      • Dudewitbow@lemmy.ml · 9 months ago (edited)

        Best bang for buck and caring about exclusive features are at odds. The brand with more features always costs more, and it always tries to set the market in its favor to prevent competitors from gaining ground, whether or not it has the better product at a given price segment.

        E.g. anyone who bought a 3050 for 30% more than a 6600, when the latter was, by default, 15%+ faster on average, in a performance tier where ray tracing is usually not viable. (Back-of-envelope below.)
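
        Back-of-envelope on those figures, taking the 30% and 15% at face value: price the 6600 at 1.00 with performance 1.15, and the 3050 at 1.30 with performance 1.00. Performance per dollar is then 1.15 for the 6600 versus 1.00/1.30 ≈ 0.77 for the 3050, so the 3050 delivers roughly two-thirds (0.77/1.15 ≈ 0.67) of the value.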

      • RealFknNito@lemmy.world · 9 months ago

        Why care about ray tracing? I get that it looks pretty for cinematic shots, but for actual games it tanks performance so hard I’d literally never turn it on. Plus, with UE5 shipping with Lumen, it doesn’t even seem like it can compare, but I digress.

        I’m not brand loyal either, I just hate anti-consumer shit. Apple does it too. But suggesting Nvidia is giving you more bang for your buck… no sir or madam, that’s absolutely not the case. AMD is, and has always been, the price-to-performance solution.

        Nvidia will edge it out in top-end performance, sure, but you’re overpaying by a lot. Some people have that kind of money, but I sure as shit don’t. AMD has caught up, has made solid competitors to the proprietary technologies, and you can even use them without an AMD card.

        AMD has an in-house ‘moment capture’ feature like Shadowplay, but I’m unfamiliar with RTX audio. It won’t be a perfect switch, obviously, but if you do any poking around on value metrics for cards, you’ll see I’m not bullshitting you.

        • Polar@lemmy.ca · 9 months ago

          Why care about Ray tracing? I get that it looks pretty for cinematic shots but for actual games it tanks performance so hard I’d literally never turn it on.

          I don’t know what games you’re playing, but when I turn ray tracing on, I go from 400fps to 120fps. Not sure why I’d ever want to disable ray tracing to get 400fps, when 120fps is plenty?

          But suggesting Nvidia is giving you more bang for your buck… No sir or madam, that’s absolutely not the case.

          AMD may be slightly cheaper, but you get worse ray tracing, FSR instead of DLSS, no RTX audio, and no good software like Shadowplay…

          All of those features are worth a bit more money. So yes, you get the best bang for your buck. Not sure why I’d want to “save” a bit of money if I am going to lose out on a ton of software features and a generational leap in ray tracing.

          • RealFknNito@lemmy.world · 9 months ago (edited)

            So you don’t realize how a drop like 400 to 120 might impact you if you start at, say, 150 instead? The same proportional hit would put you around 45 fps. It’s still a massive performance hit. If ray tracing is an essential feature to you, have at it; I just personally never cared so much about lighting that it was the sole or even main reason for getting a card.

            My brother, you’re paying for features, yes, including ones you may not want or use. Nvidia is charging you for Shadowplay in the cost of the card; AMD just includes a nameless feature that does the exact same thing in the settings. Everything you’re using has an AMD counterpart or a FOSS alternative. You are not limited by the hardware; you just like the convenience of Nvidia streamlining it, but you should at least be aware and honest about that.

            You’re willing to pay extra for them to make you a nice dinner, but most people cook their own food because they can’t afford to eat out every night. Yeah, your burger isn’t going to be as good as theirs, but you’re getting a solid alternative for less. AMD gives you the best value, Nvidia gives you cutting edge, and if AMD keeps making power moves like Threadripper, that might not always be the case.

            • Polar@lemmy.ca · 9 months ago

              Shut the fuck up about FOSS. Lemmy users are insufferable.

              Enjoy your generation behind AMD card with less features for a few bucks less. I’ll keep my Nvidia card that’s generations ahead for a bit more money. Thanks.

              • RealFknNito@lemmy.world · 9 months ago

                Lmao, keep being a slave to proprietary dogshit (it’s not generations ahead) and enjoy overpaying for less. I can only try to help with advice; can’t teach fools.

                • Polar@lemmy.ca · 9 months ago (edited)

                  AMD ray tracing and FSR are literally last gen…

                  Enjoy lying to yourself to feel better.

                  I don’t understand what you mean by overpaying for less. I literally listed the stuff Nvidia has that AMD doesn’t. Sorry you can’t read. Can’t teach the illiterate.