• laenurd@lemmy.lemist.de · 9 months ago

    “Retard” who bought Nvidia here.

    I know it’s 4chan banter and I generally agree with anon’s points, but here goes:

    • ROCm wasn’t a thing when I bought. You need(ed) NVIDIA for machine learning and other GPGPU stuff
    • I have yet to hear from anyone with an 8GB card who maxes out that memory on current-gen games at 1080p
    • apart from frame generation, you DO get DLSS 3 features on 3000 series cards
    • PrivateNoob@sopuli.xyz · 9 months ago

      “Based” who bought AMD here.

      ROCm is still in its infancy. ROCm literally isn’t supported for my 6700 XT, so I had to go back to Google Colab to work on my AI thesis project.

      • averyminya@beehaw.org · 9 months ago

        And this is the issue. With what’s currently available, you pretty much have to buy a GPU for your specific needs.

        Talking shit about people who bought a graphics card 3 years ago, almost to the day, is pretty senseless. Compare a 3080 to a 5700 XT: one has had support all along, and the other is only just getting workarounds for certain use cases.

        OTOH, if that same buyer were a Linux user 3 years ago, AMD would have benefited them then, and likely even more so today: AMD has gotten lots of Linux driver updates, while NVIDIA has probably pushed 3 changes, lol.

        If you want to do AI easily, you’re just going to want NVIDIA. It sucks, but that’s the case unless you want to jump through hoops for what can otherwise be a simple installation. I helped my friend get Stable Diffusion running on his 7900 XTX, but compared to my installation it was a huge PITA and still ran at a slower it/s than my 3080.

        Granted, we were both using Windows, and neither of us is a Linux aficionado.

        And if you want gaming and pretty much anything else, AMD is a fine equivalent, as long as the player is okay with less efficient ray tracing and no DLSS. I was interested in those features, so I got the 3080 a few years ago. My friend wasn’t as interested and went for the 7900 XT.

        I can understand disliking aspects of a company, but I don’t understand the intensity from anyone going all in on one or the other. AMD is doing great things, and it’s a bonus that it’s pretty ethical as far as we know. NVIDIA is also doing great things pushing the software side of hardware; they’ve just had a stint of bad leadership and made some dumb decisions. NVIDIA doesn’t look the same today as it did 10 years ago, and 10 years from now it will likely be different again.

      • gaiussabinus@lemmy.world · 9 months ago

        ROCm support makes me angry, but NVIDIA fumbled their drivers too. There is no good option, so pick your poison. I run ROCm right now with a workaround on my 6900 XT just to get the card detected. And I have also gone from 10 it/s down to 4, or even 2, with updates. Shit sucks.

    • AlmightySnoo 🐢🇮🇱🇺🇦@lemmy.world · 9 months ago

      ROCM wasn’t a thing when I bought. You need(ed) NVidia for machine learning and other GPGPU stuff

      Same for me: during my PhD I had to buy an Alienware laptop with an NVIDIA GPU for some GPGPU coding, since CUDA was pretty much the only choice back then, and OpenCL was a joke in terms of performance and wasn’t getting much love from GPU manufacturers. But right now, I know for sure I won’t ever buy an NVIDIA GPU again. ROCm works wonderfully well even on an APU (in my case, a Radeon 680M integrated GPU), and it’s also future-proof, since you’re almost writing CUDA code: if you ever switch back to an NVIDIA GPU, you’ll mostly just have to replace “hip” with “cuda” in your code, plus some magic constants (warp length in particular).
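      To illustrate that one-to-one mapping, here is a minimal HIP sketch (an illustrative saxpy kernel, not code from the thread). Renaming the `hip` prefixes to `cuda` (hipMalloc → cudaMalloc, hip_runtime.h → cuda_runtime.h, and so on) yields valid CUDA; the kernel body and `<<<...>>>` launch syntax are identical:

      ```cuda
      // Minimal HIP example; compiles with hipcc on ROCm.
      // The main "magic constant" to watch when porting is the warp size:
      // 64 on most AMD GPUs (a "wavefront"), 32 on NVIDIA.
      #include <hip/hip_runtime.h>   // CUDA equivalent: <cuda_runtime.h>
      #include <cstdio>

      __global__ void saxpy(int n, float a, const float* x, float* y) {
          // Same built-in index variables as CUDA.
          int i = blockIdx.x * blockDim.x + threadIdx.x;
          if (i < n) y[i] = a * x[i] + y[i];
      }

      int main() {
          const int n = 1024;
          float hx[n], hy[n];
          for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

          float *dx, *dy;
          hipMalloc(&dx, n * sizeof(float));   // cudaMalloc
          hipMalloc(&dy, n * sizeof(float));
          hipMemcpy(dx, hx, n * sizeof(float), hipMemcpyHostToDevice);  // cudaMemcpy
          hipMemcpy(dy, hy, n * sizeof(float), hipMemcpyHostToDevice);

          // Identical launch syntax on both platforms.
          saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);

          hipMemcpy(hy, dy, n * sizeof(float), hipMemcpyDeviceToHost);
          printf("y[0] = %f\n", hy[0]);  // 3*1 + 2 = 5
          hipFree(dx);                   // cudaFree
          hipFree(dy);
          return 0;
      }
      ```

      AMD’s hipify tools automate exactly this kind of textual translation in the other direction, which is why the porting cost is so low.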

    • RealFknNito@lemmy.world · 9 months ago

      Don’t put it in quotes. You know what you did.

      8GB of VRAM is fine for right now. Obviously nobody is complaining at 1080p, but not a lot of people are buying top-of-the-line GPUs for standard 1080p. I run 1440p, and I know a good chunk of people who are hitting 240 FPS limits and are now upgrading resolutions. 8GB of VRAM is like 16GB of RAM: yes, most modern titles will work flawlessly, but that isn’t going to be the case in the very near future, especially if you plan on upgrading resolution.

      DLSS isn’t something to scoff at, I’ll give green credit there, but relying on it to make up for the shortcomings seems like cope. They made a stupid call in order to slash prices and you’re going to feel it later even with DLSS.

      If you need NVIDIA for niche applications to do work and make money, get that bread, king. But if you’re just a gaming enthusiast, there’s no reason to subject yourself to a company that fucking hates you.