• Asmodeus_Krang@infosec.pub · 8 days ago

      It’s truly mental. I don’t think I could afford to build my PC at the same spec today with RAM and SSD prices being what they are.

      • tempest@lemmy.ca · 8 days ago

        I have 128 GB of DDR5 memory in my machine. I paid 1400 for my 7900 XTX, which I thought was crazy, and now half my RAM is worth that.

        Never thought I would see the day where the graphics card was not the most expensive component.

          • tempest@lemmy.ca · 8 days ago

            I should not have even gotten the 128.

            I can use it, but barely, at 4600 MT/s, because Ryzen chips can't handle four DIMMs of 32 GB.

            I honestly didn't even bother to check at the time of purchase, and it's still a roll of the dice whenever I restart.

    • samus12345@sh.itjust.works · 8 days ago

      Just about all electronics older than a year or so have. Even a Switch, which came out 9 years ago, costs more to buy now than it did then!

        • fartographer@lemmy.world · 7 days ago

          Certainly the most powerful GPU I've bought! I got mine two years ago for about $120, and it's been great for my Docker containers!

  • blitzen@lemmy.ca · 8 days ago

    Apple over here not raising their RAM prices because they’ve always been massively and unjustifiably inflated. Now, they’re no longer unjustifiably inflated.

    • mushroommunk@lemmy.today · 8 days ago

      I dunno. “AI companies bought literally everything” seems like an unjustifiable reason still.

      • blitzen@lemmy.ca · 8 days ago

        Perhaps. I guess my point is they're no longer as out of line with the rest of the market. My comment was meant as a backhanded "compliment" toward Apple.

    • totesmygoat@piefed.ca · 8 days ago

      They also buy allotments months in advance. Just waiting to see how much Apple will charge soon.

  • Jhex@lemmy.world · 8 days ago

    This article sucks… I think they felt the need to excuse AI lest they upset their corporate masters.

    While it’s easy to point the finger at AI’s unquenchable memory thirst for the current crisis, it’s not the only reason.

    Followed by:

    DRAM production hasn’t kept up with demand. Older memory types are being phased out, newer ones are steered toward higher margin customers, and consumer RAM is left exposed whenever supply tightens.

    Production has not kept up with demand… demand being supercharged by AI purchases.

    …newer ones are steered towards higher margin customers… again AI

    consumer RAM is left exposed whenever supply tightens… because of AI

    • AeonFelis@lemmy.world · 8 days ago

      You see, it's easy to blame AI data centers buying all the RAM, but that's only half the story! The other half of the story is the manufacturers selling to those data centers.

      • garretble@lemmy.world · 8 days ago

        Honestly, my PC at this point plays FFXIV and that’s basically it. And I’m OK with that.

          • eli@lemmy.world · 8 days ago

            Microsoft tried that with TPM, which you can bypass for the most part with Rufus and a clean install. There are still some kernel anti-cheat games where you can't do that so easily.

            I’ve already switched over to Linux, just got one more system to migrate. So far 100% worth it to not deal with Microslop.

            • WorldsDumbestMan@lemmy.today · 8 days ago

              What I'm truly afraid of is that they somehow get ahold of Linux and force slopware onto it, including that EoL BS. Keep offline devices as a backup.

              • eli@lemmy.world · 8 days ago

                Uh, that can't happen with Linux. The closest thing you can point to is Red Hat, which doesn't affect distros based on Debian or Arch.

      • Wildmimic@anarchist.nexus · 8 days ago

        I agree. I recently swapped out my aging six-core 2600X for a 5950X (32 threads) and upgraded from a 3070 Ti to a 5070 (well, actually more of a sidegrade; my VRAM was simply too small). Before this, my system had already gone a few years where there wasn't much difference in gaming. In the current configuration, and at the glacial speed gaming has developed, I'd say I have a decade before upgrades are really needed.

      • MonkderVierte@lemmy.zip · edited · 7 days ago

        Right? My APU can run almost all games up to 2020 at 3K. Not high FPS, but I'm not sensitive to that.

        Except for a few, like Valheim and Empyrion, which get 1 FPS on the menu.
        Do they require some special instruction sets or something that an APU can't handle?

    • eli@lemmy.world · 8 days ago

      Same here, running my 3700X with a 3080.

      I should’ve pulled the trigger on the 9800X3D last year like I wanted, but thought it was just too expensive.

      Welp.

  • rogsson@piefed.social · 8 days ago

    When the yet-to-be-built data centers never get built because the AI slop bubble pops, we will be able to build houses out of RAM sticks for the poor.

    • veni_vedi_veni@lemmy.world · 8 days ago

      The problem with data center hardware is that it's often bespoke and nowadays can't be reused in a consumer context. Think about those headless GPUs; they're probably making these RAM modules with a different interface too.

      They will just be e-waste instead of having the possibility of becoming surplus.

      • Tja@programming.dev · 8 days ago

        The modules, yes, but RAM is bought at the chip level. If the modules are never built, the chips can be reused in normal DIMMs.

        Worst case, we get a new HBM DIMM format :D

      • Paranoidfactoid@lemmy.world · 7 days ago

        Those headless GPUs are great for simulation work in Blender and other creative tools. I'd love an opportunity to buy a good used one on the cheap.

  • LoafedBurrito@lemmy.world · 7 days ago

    Ruining the PC market for consumers on purpose so people will think it’s cheaper to rent computers than to own.

    In the future, you will lease your computer and not own it, just as you are told to do by the billionaires who steal your pay.

    • Exatron@lemmy.world · 6 days ago

      I remember when people claimed that socialism meant people would own nothing and like it. Turns out they were actually describing capitalism. Granted, I don't like one bit of this.

    • SourGumGum@lemmy.world · 7 days ago

      In the future you will connect to a corporate-owned terminal and use an online hosted OS, where your files are kept in their cloud ecosystem.

    • BanMe@lemmy.world · 7 days ago

      Yep, cloud providers definitely came up with the AI boom in a roundabout conspiracy to end PCs. Totally a direct chain there.

  • Zarajevo@feddit.org · 8 days ago

    US oligarchs want to have all computation done in their warehouses so they have the power to change any computation at any time.

    • MadBits@europe.pub · 7 days ago

      When I first saw "GeForce Now", that's exactly what I imagined: building a market for cloud computation. "Just own the display; we will rent you the brain for it." Currently they are choking the market with orders for RAM that does not exist, paid for with money that does not currently exist, and the result is that prices go up artificially, which will eventually drive users into exactly the scenario where they rent computing power "for cheap" to play the latest game for 2-3 hours.

      • hark@lemmy.world · 7 days ago

        Thanks! I was blown away by the quality of voice work in a game back then. Combined with the story, it was a real treat.

        • trongod_requiem0432@lemmy.world · 4 days ago

          I only watched a playthrough, to be honest, because the game wasn't available for my PC and the gameplay seemed kind of outdated, but fuck yeah! The story and voice work really rocked! I loved it as well. Don't get me started on the OST. Quite the movie material, in my opinion.

    • njordomir@lemmy.world · 7 days ago

      Me too. I added more than I could use because today's gaming rig is tomorrow's server. Now I'm debating whether I should sell a few sticks, but who knows when, if ever, I'll be able to replace them.

    • mlg@lemmy.world · 8 days ago

      I did my desktop but skipped my server.

      Even decade+ old used surplus server DDR4 didn’t escape the apocalypse.

    • hdsrob@lemmy.world · 8 days ago

      Same… I hadn't upgraded since 2012 and had some extra cash, so I rebuilt in August. Feeling pretty lucky to have done it then, and really glad I went ahead and put 64 GB of RAM in it.

        • hdsrob@lemmy.world · 7 days ago

          Yeah, the 2012 build was a 3770K with 16 GB of RAM, multiple SSDs, a GTX 680, etc., so it was a pretty fast machine back in the day.

          I upgraded the video card and SSDs several times; I just didn't have the budget to replace it all at once for a long time.

      • hdsrob@lemmy.world · 8 days ago

        Minus the case and video card, I have an entire 3rd-gen i7 machine sitting in a box that would actually be pretty good for a lot of different uses.

      • 9point6@lemmy.world · 8 days ago

        I meant more that they're not being made now, because Micron recently killed the Crucial brand to focus supply on data center customers.

        • Prove_your_argument@piefed.social · 8 days ago

          I'm well aware, but everybody knows the HBM demand will dry up eventually, and that the consumer market will eventually be worth trying to profit from again.

          They just want to manipulate the consumer market to maximize margins. If they can keep memory prices at 200-300% for a while, they can raise the prices they charge and push margins to stratospheric heights not seen before in the consumer market. All manufacturers jump on stuff like this when they can get away with it.

          Memory vendors still order from Micron directly for their own-branded chips. Those margins will increase for all parties. AI data center demand is like Christmas for the entire industry. No pricing is transparent, and every vendor is profiteering.

  • 1984@lemmy.today · 8 days ago

    I'm on Linux and it requires just as much memory as it did in 2018. No problem here.

    • pHr34kY@lemmy.world · 8 days ago

      I upgraded mine from 16 GB to 32 GB two years ago because RAM was cheap. I didn't really need it, and have probably never hit 16 GB of usage anyway.

      Meanwhile, my work Windows laptop uses 16 GB at idle after first login.

      Windows has always been wasteful computing, and everyone just pays for it.

      • Waraugh@lemmy.dbzer0.com · 8 days ago

        Storing data in RAM isn't wasteful, though. I have a lot of criticisms of Windows, but memory management isn't one of them. I'd rather have as much predictive content staged in RAM as possible, as long as it's readily dumped when I go do something else, which is my experience. I don't earn interest for having unused RAM on my computers. (For reference, I have EndeavourOS, RHEL, Fedora, and Windows computers under my desk connected to a dual-monitor KVM right now; it isn't like I don't regularly use and prefer Linux. I mostly access my Windows machine via RustDesk for work-related stuff I don't feel like having to dick with on Linux, like the purchase order system and timecard system.) I just don't get this critique.

      • Bilb!@lemmy.ml · 8 days ago

        Requiring less RAM is good, but conceptually it's Linux that is "wasting" the RAM by never using it. It's there and it's reusable, so fill it up! Now, does Windows make good use of it? No idea. I wouldn't bet on it, but I could be surprised.

      • NotMyOldRedditName@lemmy.world · 8 days ago

        I wish I had a 32 GB RAM laptop.

        I can have three development IDEs open at once, and with all the browser tabs open and a few other programs here and there, it's stretching the limits on my Mac.

        • pHr34kY@lemmy.world · 8 days ago

          I have 32 GB on my Windows laptop and it can't do three at once.

          Running the backend (Java) and the frontend (React Native) in IntelliJ uses 29 GB of RAM, so I have to run Android on real hardware over ADB + USB. Running an Android emulator pushes it over the edge.

          Also: Laptops are shit. On Windows, the tau is so bad that the cores are throttled straight after boot because the cooling is rubbish. It almost never hits full speed. It can’t survive more than 40 minutes on a full battery. It might as well be a NUC.

              • morriscox@lemmy.world · 7 days ago

                You might want to use Process Manager or the like to see if something is pegging the CPU/GPU. What is the model?

                • pHr34kY@lemmy.world · 7 days ago

                  I'm fairly sure it's all the antivirus, which is itself a kernel rootkit. The whole laptop is getting replaced in a month, so there's not much point in fixing it.

          • NotMyOldRedditName@lemmy.world · 8 days ago

            Yeah, Macs are definitely more efficient with their RAM.

            I'll have Android Studio open for my main work, IntelliJ IDEA for all the backend work, and Xcode when I need to tweak some iPhone things. (Edit: usually it's just two of the three, but sometimes it's all three.)

            I also mainly use real devices for testing, and opening emulators when all three are open can be a problem, and it's so annoying opening and closing things.

          • GenosseFlosse@feddit.org · 8 days ago

            Also: Laptops are shit. On Windows, the tau is so bad that the cores are throttled straight after boot because the cooling is rubbish. It almost never hits full speed. It can’t survive more than 40 minutes on a full battery.

            That's the reason I haven't bought a new laptop in years. Everything must be as thin as possible because Apple did it. Fuck that. I want my laptop as thick as a brick to have enough cooling for the CPU, GPU, and a 6-liter V8 engine, and a battery that will outlast the sun!

            • RisingSwell@lemmy.dbzer0.com · 8 days ago

              Does Clevo still make the fat laptops? My last one was one of theirs, and it was almost as thick as my forearm. It also weighed a ton, but on the plus side it was insanely easy to disassemble. I probably should've gotten another one; my MSI is shit to open.

          • Honytawk@feddit.nl · 7 days ago

            What does Windows have to do with cooling? That is a hardware problem.

            You’d have the same issues if you installed Arch on it.

            Stop blaming Windows for all your problems.

        • pHr34kY@lemmy.world · 8 days ago

          Linux doesn’t waste RAM. All unused RAM becomes a disk read cache, but remains available on demand.
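
          As a rough illustration (a minimal sketch, assuming a Linux system that exposes /proc/meminfo), comparing MemFree, Cached, and MemAvailable shows the distinction: the page cache makes "free" memory look small, but most of it remains reclaimable on demand.

          ```python
          # Minimal sketch: compare "free" vs "available" memory on Linux.
          # Assumes /proc/meminfo is present (any modern Linux kernel).

          def read_meminfo() -> dict[str, int]:
              """Parse /proc/meminfo into {field: size in kiB}."""
              info = {}
              with open("/proc/meminfo") as f:
                  for line in f:
                      key, rest = line.split(":", 1)
                      info[key] = int(rest.split()[0])  # values are reported in kiB
              return info

          def gib(kib: int) -> float:
              """Convert kiB to GiB for readable output."""
              return kib / (1024 * 1024)

          if __name__ == "__main__":
              m = read_meminfo()
              print(f"MemTotal:     {gib(m['MemTotal']):6.1f} GiB")
              print(f"MemFree:      {gib(m['MemFree']):6.1f} GiB  (truly idle)")
              print(f"Cached:       {gib(m['Cached']):6.1f} GiB  (page cache, mostly reclaimable)")
              print(f"MemAvailable: {gib(m['MemAvailable']):6.1f} GiB  (what programs can actually use)")
          ```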

          • KairuByte@lemmy.dbzer0.com · 8 days ago

            Cool, so the thing you stated wasn’t what happened, and you’re correcting me for not fact checking your comment.

      • Honytawk@feddit.nl · 7 days ago

        Seems like you (or your company) installed loads of bloat.

        You do know you can disable programs from starting during boot, right?

        • pHr34kY@lemmy.world · 7 days ago

          I fired up procexp and found Nessus scanning every file on my drive. I found that SentinelOne had written 10 GB of logs since start. I found some bullshit Dell service slamming the CPU. It's all stuff that my company put there.

          I found that it adds an extra 7 minutes to a 12-minute build when I compile my project, compared to doing it on WSL. The Windows bloat is insane.

      • jaykrown@lemmy.world · 7 days ago

        To be clear, that doesn’t mean AI is going away. It just means no one is actually going to pay for AI models anymore because open-weight free models will be extremely cheap and powerful.

        • iglou@programming.dev · 7 days ago

          It also means that AI will disappear from places where it brings nothing and in many cases actually makes the product worse.

        • bthest@lemmy.world · 7 days ago

          How is anyone going to run those powerful models locally once the necessary hardware is unaffordable/unobtainable?

          No one is paying for AI because no one wants it to begin with.

          • jj4211@lemmy.world · 7 days ago

            In this scenario, a number of the AI companies contributing to the hoarding have their equipment handled through 'asset recovery', which means that at least companies that can drive 15 kW to a system, plus water cooling, will probably get it on the cheap and run it on premises or in a colo. Maybe some of those parts will trickle down, but admittedly a good chunk of the stuff is hard to accommodate in a residential setting.

            Longer term, the hardware becomes obtainable as supply chains recalibrate back to identical or closer solutions. Ten years ago, a data center GPU was likely to be the same hardware as the consumer part, but with a different thermal solution, different firmware, and the video ports unpopulated. The AI rush has made them shift to exotic packaging so they can cram absurdly unreasonable wattage into small spaces, which doesn't work in home settings. I anticipate a swing back that way eventually.

      • MadBits@europe.pub · 7 days ago

        I really hope so, but I can't help thinking that they are going to drag it out for as long as possible, because no matter how bad the situation is for the common folk, they are still going to make a profit off of it.

        • ulterno@programming.dev · 7 days ago

          Yeah. The current job market shows a different trend.

          With people saying "people using AI will replace people not using AI."
          And the reason for that is not that people using AI will produce better work, but that AI usage will be preferred over usable output.

          And because the flow of money is such that those who have money can easily choose to give most of it to AI users[1], while non-AI users don't have the same ability once most providers turn to AI use.

          Now, as long as you get governments on board (they are already buying up GPUs too), you can funnel all taxes into AI use, starving the non-AI market of the ability to procure computing hardware (or anything else to work on). That is how you get AI supremacy without providing anything better.


          1. With the only exception being base material products like agricultural produce, which have a much lower margin and whose costs again go to AI users ↩︎

            • ulterno@programming.dev · 7 days ago

              I’m an AI user, where’s my money?

              While I was actually looking for someone to break my model with a better one with better info, this is a pretty useless argument.

              To get money, you need to:

              1. Make some half-baked product with chutzpah.
              2. Use charisma to get others to vibe with it on a social media platform.
                • This is easy if you know how to do the social acting that "normies" do.
              3. Appeal to the ones in power, while convincing them that you used AI to make it, that you will be using more AI, and that you will do some subscription stuff.
              4. Find a way to inherit venture capital.