• mustyOrange@beehaw.org · 21 points · 1 year ago

    No shit. When 1080s from 6 years ago still work fine, there’s clearly some stagnation. They need to cut prices if they want people to actually buy their shit.

    Intel needs to come through with Battlemage and fuck up team red and team green

    • Pigeon@beehaw.org · 10 points · 1 year ago

      I think it helps that AAA graphics got so realistic that improvements feel more incremental relative to older games, that indie games proved much simpler, cheaper graphics are viable and often even preferred, and that devs started going for stylized art over realism more often. It probably also helps that the Steam Deck is a thing now, and the Switch allows third-party games, so that hardware can be a target to consider too.

      Anyway yeah. I’m still running a 1070, and at absolute worst I might have to reduce some graphics settings in the latest or most poorly optimized games, and we’re long past the days where moderate or even minimal graphics settings looked awful. Games are still beautiful on lower settings.

      A better GPU at this point would net me better FPS in some titles, but those games make up a relatively tiny proportion of what I play, and even then I still get a perfectly playable framerate as is.

      So, yeah, not paying those prices for a tiny upgrade, and not when I remember prices pre-covid and pre-crypto miners. I can afford to wait out their greed.

      • Katana314@lemmy.world · 3 points · 1 year ago

        I keep explaining to people how the world actually kind of benefits from the Graphical Plateau; but so many insist to me “You will want more pixels. Have you seen raytracing?”

        The Steam Deck mostly sets an upper bound for how much hardware a game should demand for the next few years, and it’s probably lower than some developers wanted it to be.

        The silliest thing about raytracing in particular is it was planned to be a developer convenience. So in an RTX-only future, we were all going to upgrade to much more powerful GPUs, only to run games that look about as good as what we already have.

        • YuzuDrink@beehaw.org · 1 point · 1 year ago

          I absolutely love raytracing… and on my 3080 it just doesn’t look good enough yet to justify turning it on for most games. Maybe they just haven’t implemented it well yet, but the reduced framerate in most games just isn’t worth it, and I’ve hated effects like screen-space reflections more or less since they came out.

          I think by the time we have a 50X0 or a 60X0, raytracing will finally be fast enough to look good AND perform well. But for now it’s mostly just a gimmick I turn on to appreciate, and then turn back off so I can actually play the game smoothly.

          • Pigeon@beehaw.org · 1 point · 1 year ago

            It might be that they’ll put more time and effort into getting it looking right once more people can run it at all, too. I’m not sure what percentage of PC gamers have sufficiently new/powerful GPUs to run it, but I’d suspect it’s still small, and I’d think there’s only so much time and effort that devs will want to put into something that most people won’t see at all, when they could spend those resources on other aspects of the game (including other aspects of graphics) instead.

            The one thing I would really like now is better audio. Both stuff like better 3D positional audio (e.g. Deathloop if you turn that setting on - although the setting kept turning itself off for me, which was maddening) and more varied and complex sound effects and music. It can make a huge difference, even when people don’t consciously notice.

    • patchymoose@beehaw.org · 2 points · 1 year ago

      Whatever happened with Intel’s discrete GPUs? I got whiplash trying to follow the news. At one point I thought the news was that they were discontinuing them altogether. But are they proceeding now?

      • mustyOrange@beehaw.org · 2 points · 1 year ago

        Honestly, pretty damn well. If they keep with it, I see good things for them.

        Imo, the A770 is a lower mid end hero. They’ve really improved their driver support, and I think Battlemage is going to be great.

  • UprisingVoltage@feddit.it · 16 points · 1 year ago

    Personally I’m starting to buy second-hand hardware and I recommend it. Less pricey, more eco-friendly, and less money in the pockets of greedy corps

    • bootyberrypancakes@lemmy.ml · 4 points · 1 year ago

      Almost all of my computer hardware is second-hand! My plex server was free from a guy on reddit and I built a second gaming PC for the TV for maybe $100.

  • Carlos Solís@social.azkware.net · 16 points · 1 year ago

    Considering that both Nvidia and AMD have been constantly pushing the prices of baseline GPUs well beyond the gold standard of the 1060, even long after the Big Crypto Spike of 2020? Yeah, barely anyone would bother spending a small fortune on a GPU

    • Communist@beehaw.org · 8 points · 1 year ago

      Not only that, but the used market is skyrocketing, which is just gonna push these numbers even lower.

      • Onihikage@beehaw.org · 7 points · 1 year ago

        There had to have been people in marketing that knew this would happen and were overruled by bean-counting executives. The top card of each generation outdoes the top of the previous gen, but for a couple of generations it’s been increasing in price in almost lock-step with the performance increase. Often the newer card will have worse VRAM than the previous generation’s equal-performing card because you’re looking at an older top-spec card vs a newer midrange, and the midrange cards always have less VRAM. With AAA games now starting to really want more VRAM in order to have better visuals, the older cards wind up actually being the better option long-term.

    • gumpy@beehaw.org · 2 points · 1 year ago

      barely anyone would bother spending a small fortune on a GPU

      well, except datacenters. can’t get enough of them and the datacenter card prices would make you cry.

    • GalaxyGamer@sh.itjust.works · 2 points · 1 year ago

      Then there is also buying used, which tends to be much cheaper than buying new from either of them, so it makes sense that their sales are falling off.

  • ghashul@feddit.dk · 15 points · 1 year ago

    I’m still running a 1060 6gb card. I’ll keep it for as long as I can, and then I’ll likely upgrade to something that isn’t the newest generation at the time.

    • Jediotty@beehaw.org · 4 points · 1 year ago

      I’ll probably use my 1070 till it dies, and after that if I’m able to fix it :)

    • snoopfrog@beehaw.org · 2 points · 1 year ago

      Ditto. My 1660 super and 10th gen i5 run Diablo 4 and Lightroom smoothly. No need to upgrade until that’s not the case. It’ll be 3 years young in November.

    • I'm back on my BS 🤪@lemmy.world · 1 point · 1 year ago

      Same. I’m on a 1060ti 6GB doing fine. My machine is at a point where it is maxed out, so I would need to build a whole new one. Still, I’m using Linux exclusively and lag a few years behind the latest games to save money, so I’m in no hurry. I’ll get there when the prices want to come down to my level.

  • jack@lemmy.world · 15 points · 1 year ago

    Makes sense considering how bloody expensive they are in a time of economic shittery.

    I’d love to upgrade, but when I think about it logically there’s absolutely no point. My rig works fine for what I play right now, and the value just isn’t there in a new GPU for me.

    • averagedrunk@lemmy.ml · 3 points · 1 year ago

      They got used to miners buying everything the second it came out and the global supply chain shitting all over everyone. So they price them super high.

  • ArtVandelay@lemmy.world · 14 points · 1 year ago

    I bought a 4070Ti for $1k and I deeply regret it. Not because I can’t afford it, but because I let my want of gaming at 120 fps overpower my ethics of enabling a company to get away with these prices. It’s definitely a regret I have.

    • Behohippy@lemmy.world · 4 points · 1 year ago

      I paid $1100 for a 3070 during the pandemic with a newegg bundle deal (trash stuff they couldn’t sell). I already had a 2070 and it was a complete waste of money.

    • Reeek@beehaw.org · 2 points · 1 year ago

      I feel that way too. My 2080 is still good so the itch isn’t as strong, but when I play something on my 4K TV and the FPS dips below 60, the itch returns. I truly don’t want to buy anything from Nvidia or AMD for a good while, so here’s hoping Intel keeps at it and doesn’t get stupid expensive as well

  • Strawberry@beehaw.org · 13 points · 1 year ago

    They’re just ridiculously overpriced. Yes yes yes, I know, I know, PC gaming good, but you’re literally spending the cost of a PS5 or more on a graphics card alone; it’s just not competitive in the gaming sphere. I know I’m not looking to upgrade any time soon

  • Triage8420@lemmy.ml · 13 points · 1 year ago

    Still rocking my 1070ti. I mostly play overwatch 2 and Minecraft so it works ok for me now. Also I’m broke and can’t afford the upgrade.

    • Derrek@lemmy.ml · 3 points · 1 year ago

      I had a 1070ti since 2018 and it has run everything I have purchased just fine.

      I thought about checking out this ray tracing stuff the kids are into, but is there a card under $300 that anyone recommends? It also would need to be mini itx as I have a tiny living room gaming PC.

      • Poke@beehaw.org · 2 points · 1 year ago

        Sorry but I’m not sure you’re going to get any good ray tracing experience for less than $300.

        AMD probably has the best general use GPU in that price range.

        Intel probably has the best (with a big asterisk due to driver and directx issues) gaming GPU in that price range.

        It’s just hard to recommend buying a GPU right now imo.

    • sailsperson@beehaw.org · 1 point · 1 year ago

      1080 here. I’m really happy with the decision I made years back. Some games are terribly unoptimized, but that won’t make me cash out for a new piece of hardware.

      And anything that’s actually worth upgrading from my GPU is going to be even bigger and block the front panel pins on my new motherboard I was gifted last year. Yep.

    • Captain_Wtv@lemmy.ml · 1 point · 1 year ago

      Some AMD stuff is kind of cheap rn. You can get the 6700XT For around 340 USD. And it’s good performance wise I think. Granted you are on something where you don’t need it imo. I was on something a lot less powerful which is why I made the plunge.

  • Teali0@kbin.social · 12 points · 1 year ago

    I’m not surprised; I was killing some time at Micro Center yesterday and couldn’t believe how high the prices were. Shelves were pretty much completely full. I found better deals over in the laptops for GPUs. At least with that purchase you get a whole computer.

    • simple@kbin.social · 7 points · 1 year ago

      I found better deals over in the laptops for GPUs.

      Even that isn’t a good deal. Laptop GPUs are a lot weaker than their desktop counterparts, the 4070 (laptop edition) is LITERALLY a 4060 desktop edition. They’re misleading consumers into buying something worse than they’re expecting, and the prices are still outrageous for the new generation.

      • CynAq@kbin.social · 3 points · 1 year ago

        It’s really shitty they are doing this. When desktops are providing way better price to performance ratio, they are trying to create the illusion that you can still get comparable performance with a slight increase in price, when in reality you are sacrificing a substantial amount of performance for mobility.

        If they made this part clear, I don’t think there would be any appreciable decrease in laptop sales either.

        Do you think anyone opts to buy a laptop when they have no absolute need for mobility?

        • Manticore@beehaw.org · 1 point · 1 year ago

          Yeah, they do. A laptop is plug-and-play. It has all the things a computer needs already built into it. If you just want a thinking-rock to do your emails for you, a laptop is relatively painless. Then if something happens to it, you take it to a third party to fix it up for you.

          Story time:

          I’m a desktop person, but my mother was always a laptop person. She was looking to replace her laptop on life support; I suggested to use desktop because she didn’t actually need the mobility, it was cheaper for the same power potential, it could be upgraded piecemeal and for cheaper, and she could get a larger screen (for her struggling vision).

          But desktops are so bulky; it takes up so much space on her desk. I also didn’t anticipate her needing several different peripherals that her laptop had built in (microphone, speakers, webcam, more USB ports). Yes, it will be cheaper for her in the long run even with those unexpected costs, and those peripherals will work with any desktop tower, but she’s so frustrated with it that she already wants to give up on it.

          I’m also stuck being her go-to person when she wants to complain. How the desktop is just so horrible she’s still trying to use her laptop anyway, how she may as well get rid of the desktop because it’s caused nothing but trouble and nobody will help her etc… she used to be complaining that the laptop was so horrible she’d have to buy a new computer, too. (tbh there’s no scenario where her life isn’t eternal suffering.)

          If she really does just give up on the desktop (very in the realm of possibility for her), I’m not going through this again. She can have her overpriced, low-output laptop. The bonus is that as a desktop person, I can deny any expectation of fixing it for her.

  • zzz711@lemmy.ml · 11 points · 1 year ago

    Not surprising since the last gen was impossible to find due to crypto and the current gen is overpriced.

    • bubberstarteletscam@feddit.dk · 2 points · 1 year ago

      The only reasons I’d buy right now is

      1. I was suddenly rich and
      2. I wanted to have a private GPT

      Gaming is fine I guess on what I already have running.

    • Jordan Jenkins@lemmy.wizjenkins.com · 1 point · 1 year ago

      Not to mention them being HUGE physically and sucking down a ton of power. And the whole fire thing that hit the news that made a lot of people decide to wait.

  • zaktmt@kbin.social · 10 points · 1 year ago

    This doesn’t shock me. NVIDIA really saw the pricing for the resale market for GPUs and said “we can charge this too”

    No, you really can’t. Also, a lot more people anecdotally aren’t super aggressive with upgrades anymore. People are getting so much use out of their electronics that there really is no need to upgrade for most people unless something breaks.

  • GhostMagician@beehaw.org · 10 points · 1 year ago

    They don’t understand that the average consumer looking to buy a desktop GPU is not the same as a crypto miner looking to buy GPUs. Once crypto miners exited the market, so did the main reason an unusual number of units were being sold at high prices in the first place, since to them it was a business expense.

    • vegemash@lemmy.one · 1 point · 1 year ago

      I think Nvidia at least has their eyes on the ML market. They just don’t care about even the mid range. The decision to not put a decent amount of VRAM on these cards seems like a deliberate move to prevent them from running many ML workloads.

  • setInner234@feddit.de · 10 points · 1 year ago

    My hopes are on Intel as viable 3rd party. Looks like AMD and Nvidia have agreed they can fleece customers for piss poor perf/price improvements.

    • TheTrueLinuxDev@beehaw.org · 5 points · 1 year ago

      With the absurd capability of the A770, yeah, it beats out on price, so if Intel releases a 32 GB VRAM GPU in their next round of offerings, it’ll crush both AMD and Nvidia if the price lands somewhere between $350 and $450.

      (For those wondering, the A770 offers about half the FLOP performance of an RTX 4090 and has 16 GB of VRAM for the price of $350. That’s insane.)

      If Intel is really trying, they could opt to skip 32 GB of VRAM and go straight for 64 GB and crush both AMD and Nvidia as competition, because at that point they could eat both the consumer and business GPU market share.

    • citrixscu@beehaw.org · 1 point · 1 year ago

      Yes, and it is quite disappointing. I am hopeful that Intel does well with Battlemage as a potential option.

  • honk@feddit.de · 9 points · 1 year ago

    The managers didn’t realize that the high demand from the crypto explosion and COVID-19 lockdowns is over. They didn’t see it coming, probably still high on coke stroking their …fat bellies. Large parts of the world are seriously affected by the shitty economic situation caused by a major war and the long-term effects of the COVID crisis. People are struggling while the food trading industry is mercilessly making record profits.

    Of course people are not going to invest 50% of a monthly income on a gpu that is only marginally better and their trusty old 1060 is still good enough. The industry is completely out of touch with consumer needs and what they can afford and are willing to spend.

    • Scrubbles@poptalk.scrubbles.tech · 7 points · 1 year ago

      All the tech companies are so stupid for thinking COVID numbers were real. Like restaurants dipping to near zilch, the inverse was also temporary. No, DoorDash, your model only worked during COVID when we couldn’t go to restaurants. I’m not ordering a nice sit-down meal in anymore; I’m going to go get my fast food. Don’t act all shocked that the money I spent then wasn’t permanent.

      Same with Amazon, GPUs, streaming services, all the things that flew high during COVID. How did you not see the dip coming?

      • MerylasFalguard@kbin.social · 12 points · 1 year ago

        Because capitalism. Everything is set up on the literally-impossible goal of continuous, unending growth. Lots of shitty decisions get made on the assumption that you can always be continuing upward, even when you literally can’t anymore. Have one quarter where things dip a bit and it could be the end of you as the investors jump ship over it.

        • Echolot@sh.itjust.works · 8 points · 1 year ago

          I’m not sure that capitalism per se is the problem here; it’s more that the entire way modern companies, especially tech companies, are funded is just stupid.

          Present a “new” concept that nobody has done before. Rake in cash by showing investors your user numbers. Try to actually start making money by squeezing your users. Fail. Maybe because your concept wasn’t working economically from the beginning, and that’s why nobody else has done it before.

  • dawnerd@lemm.ee · 9 points · 1 year ago

    Well maybe if they weren’t so power hungry and expensive people would upgrade.