• Zarxrax@lemmy.world · 9 points · 6 months ago

    28GB? Are they really so stingy that they can’t pull the trigger on 32GB for their top-of-the-line card? The 3090 had 24GB, the 4090 still had 24GB, and now they add a measly 4GB more? This tells me they are probably going to keep VRAM mostly the same on the lower cards as well.

    • Alphane Moon@lemmy.ml (OP) · 5 points · 6 months ago

      I believe this is tied to the memory bus size (which is tied to the overall architecture design).

      I believe a 448-bit memory bus doesn’t actually allow 32GB: GDDR chips each sit on a 32-bit channel, so 448 bits means 14 chips, which works out to 28GB with 2GB modules; I think the next step up would be 56GB, not 32GB.
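
      A rough back-of-the-envelope sketch of that arithmetic (my own illustration; the function name and chip densities are just assumptions for demonstration, not anything from NVIDIA specs):

      ```python
      # Each GDDR chip sits on a 32-bit channel, so bus width / 32 gives the
      # number of chips, and the per-chip density fixes the total capacity.
      def vram_options(bus_width_bits: int, densities_gb=(2, 4)) -> dict:
          """Possible VRAM totals (GB) with one chip per 32-bit channel."""
          chips = bus_width_bits // 32
          return {d: chips * d for d in densities_gb}

      for bus in (384, 448, 512):
          print(f"{bus}-bit bus -> {vram_options(bus)}")
      # 448-bit bus -> 14 chips: {2: 28, 4: 56}
      # so 28GB and 56GB are the natural configurations for that bus, not 32GB.
      ```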

  • DavidGarcia · 4 points · 6 months ago

    I wish they would release cheapish cards with huge cheap VRAM for consumer AI. But I guess even the cheapest memory is too expensive for that.