• Brem@lemmy.world · 2 days ago

      Once the dead internet finishes manifesting, it’ll simply be bots shilling and scamming each other. The data centers will cease to feed the greedy pigs and will be left to the rats. However, the product will have no time to rejoice, for we will already have become addicted to the next big thing.

      • dual_sport_dork 🐧🗡️@lemmy.world · 2 days ago (edited)

        And by “rats,” you mean me. Me and my crowbar are ready and waiting to get our hands on a bunch of abandoned server hardware, especially those racks and racks of hard drives.

        Think of how much pirated anime you could fit on a couple of those arrays!

        • Xaphanos@lemmy.world · 2 days ago

          I work in one of those AI data centers. Remarkably little storage. And the servers are astonishingly specialized for the job. Not to mention power hungry. I’d much rather have a Gen10 ProLiant. Or a few NUCs and a QNAP.

          • cecilkorik@piefed.ca · 2 days ago

            They have different data centers for storage (AI doesn’t need much), but you’re right, it’s all unbelievably specialized to the point of being basically useless for any purpose that it wasn’t specifically designed for. We really have been spoiled with how general purpose our computing infrastructure has been up until now. That’s simply not how things are scaling up anymore.

          • peoplebeproblems@midwest.social · 2 days ago

            They really don’t need storage except for the models and training data, right? And most of that isn’t theirs to begin with, so why store it?

          • dual_sport_dork 🐧🗡️@lemmy.world · 2 days ago

            So all that breathless reporting I’ve been reading lately about how the AI boom is causing a hard drive shortage is bullshit just like the AIs themselves? Can’t say as I’m surprised one way or the other.

      • wizardbeard@lemmy.dbzer0.com · 2 days ago (edited)

        What about my comment even remotely implied that I somehow don’t understand that? Sure appreciate the patronizing “budd” thrown in too.

        To be more direct: I feel that OP should have edited the title or noted in the body text that this was an older article, an edit that would take 30 seconds at most. By not doing that, the post is somewhat misleading, and I initially thought this was a second outage in one month.

        With the context that this happened a week or two ago and was non-impactful enough that no one posted about it to this comm in a more timely fashion, that really blunts the claim that the outage was impactful at all.

        In my opinion that opens an interesting conversation topic: OpenAI went down for a full day this past month, but what was the true impact? (Implying that AI is far less widespread or business critical than the hype machine implies)

        By just posting this article here with no editorializing of the title or comments by the OP, it comes across more to me as a cheap dunk on OpenAI instead. Absolutely deserved, but otherwise just kind of “junk food” for the comm.


        Edit: and for anyone who wants to keep up with this sort of stuff better, I recommend the techtakes comm on awful.systems. The actual owner/author of this blog site is regularly active in that lemmy community and usually posts his new articles there when he releases them.

  • theunknownmuncher@lemmy.world · 2 days ago (edited)

    > So firstly, shut up, nerd. You’ve clearly never worked a corporate job and don’t understand the compulsion to in-house nothing.

    I WISH. It would have saved so much headache from rolling our own framework for the millionth time instead of using a standard solution that already exists… Tell me you’ve never worked in corporate software without telling me you’ve never worked in corporate software.

    > Secondly, approximately 0.0% of chatbot users will ever run an LLM at home.

    Wut. Individual models on Hugging Face have monthly downloads in the hundreds of thousands… DeepSeek R1 alone was downloaded 550,000 times in the last month. Gemma 3n was downloaded over 120,000 times in the last 7 days.

    > Thirdly, it’s slow as hell even on a top-end home Nvidia card — and the data centre cards are expensive.

    They literally run well on mobile phones and laptops lol.

    The author is desperately grasping at straws at best, intentionally making up false claims at worst… either way, they aren’t qualified to write on this subject.

    • Blaster M@lemmy.world · 2 days ago

      Local LLMs run pretty well on a 12GB RTX 2060. They’re pretty cheap, if a bit rare now.

      • Valmond@lemmy.world · 2 days ago

        So 12GB is what you need?

        Asking because my 4GB card clearly doesn’t cut it 🙍🏼‍♀️

        • Blaster M@lemmy.world · 2 days ago

          A 4GB card can run smol models; bigger ones require an NVIDIA card plus lots of system RAM, and performance degrades roughly in proportion to how much of the model spills out of VRAM into DRAM.
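          That VRAM/DRAM balance can be sketched with some back-of-envelope arithmetic (the even-weight-per-layer assumption and the `split_layers` helper are mine, purely illustrative, not a benchmark):

```python
def split_layers(total_layers: int, model_gb: float, vram_gb: float) -> tuple[int, int]:
    """Rough sketch: assume the model's weights are spread evenly across
    its layers, and greedily offload whole layers to VRAM first."""
    per_layer_gb = model_gb / total_layers
    on_gpu = min(total_layers, int(vram_gb // per_layer_gb))
    # Remaining layers live in system RAM and run on the CPU (slowly).
    return on_gpu, total_layers - on_gpu

# e.g. an ~8 GB quantized model with 32 layers:
print(split_layers(32, 8.0, 4.0))   # 4 GB card -> (16, 16): half spills to DRAM
print(split_layers(32, 8.0, 12.0))  # 12 GB card -> (32, 0): everything fits
```

          In practice, tools like llama.cpp expose roughly this knob as `--n-gpu-layers`, and throughput drops sharply once layers start landing in system RAM.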

    • CrayonDevourer@lemmy.world · 2 days ago (edited)

      Not only that, but the “they spent $2 for every $1”…

      Yeah, that doesn’t account for the $20/hr middle-management job they no longer have to pay for: another 5 meetings this week with a poor PowerPoint presentation. Of course they’re spending to make their company as competitive as possible. And sure, AI hallucinates, but have you seen MAGA fanatics? I’ve seen a 1B-parameter model hallucinate less than that. And they have jobs, and VOTE…

    • faberfedor@lemmy.world · 2 days ago

      Yeah, screw this guy. I came across him on YT. He goes out of his way to paint all AI as an ineffectual, hype-only resource hog with absolutely no redeeming qualities, foisted on us by EvilCorps everywhere who are simultaneously too dumb to know it’s a farce.

      What told me he was just going for eyeballs (gotta admit, his marketing is good! I see this mofo everywhere these days) was a YT post about a grift that predates AI but happened to use it, and he blamed AI for the grift! Not the human, but the technology.

  • sp3ctr4l@lemmy.dbzer0.com · 2 days ago (edited)

    Luddites smashed looms…

    Eco activists block pipeline construction…

    But I’m sure massive AI datacenters will be totally safe, while sucking water and energy out of the environs at an absurd pace.

  • AmazingAwesomator@lemmy.world · 2 days ago

    > “…you could run an open source model in-house.” So firstly, shut up, nerd.

    uuuh… tech companies roll their own shit all the time…