Money wins, every time. They’re not concerned with accidentally destroying humanity with an out-of-control, dangerous AI that has decided “humans are the problem.” (I mean, that’s a little sci-fi anyway; an AGI couldn’t “infect” the entire internet as it currently exists.)

However, it’s very clear that the OpenAI board was correct about Sam Altman, given how quickly he and many employees bailed to join Microsoft directly. If he was so concerned with safeguarding AGI, why not spin up a new non-profit?

Oh, right, because that was just public-relations horseshit to get his company a head start in the AI space while fear-mongering about an unlikely doomsday scenario.


So, let’s review:

  1. The fear-mongering about AGI was always just that. How could an intelligence that requires massive amounts of CPU, RAM, and storage even conceivably leave the confines of its own computing environment? It’s not like it can “hop” onto a consumer computer with a fraction of the CPU power and somehow still compute at the same level. AI doesn’t have a “body,” and even if it did, it could only affect the world as much as a single body could. All these fears about rogue AGI are total misunderstandings of how computing works.

  2. Sam Altman went for fear-mongering to temper expectations and to make others afraid of pursuing AGI themselves. He always knew his end goal was profit, but like all good modern CEOs, he has to position himself as somehow caring about humanity when it’s clear he doesn’t give a flying fuck about anyone but himself and how much money he makes.

  3. Sam Altman talks shit about Elon Musk and how he “wants to save the world, but only if he’s the one who can save it.” I mean, he’s not wrong, but he’s also projecting a lot here. He’s exactly the fucking same: he claimed only he and his non-profit could “safeguard” AGI, and now he’s going to work for a private company, because hot damn, he never actually gave a shit about safeguarding AGI to begin with. He’s a shit-slinging hypocrite of the highest order.

  4. Last, but certainly not least: Annie Altman, Sam Altman’s younger, lesser-known sister, has long maintained that she was sexually abused by her brother. All of these rich people are Jeffrey Epstein levels of fucked up, which is probably part of why the Epstein investigation got shoved under the rug. You’d think a company like Microsoft would already know this, or vet for it. They do know, they don’t care, and they’ll only give a shit if the news ends up making a stink about it. That’s how corporations work.

So do other Lemmings agree, or have other thoughts on this?


And one final point for the right-wing cranks: Not being able to make an LLM say fucked up racist things isn’t the kind of safeguarding they were ever talking about with AGI, so please stop conflating “safeguarding AGI” with “preventing abusive racist assholes from abusing our service.” They aren’t safeguarding AGI when they prevent you from making GPT-4 spit out racial slurs or other horrible nonsense. They’re safeguarding their service from loser ass chucklefucks like you.

  • vexikron@lemmy.zip · 9 months ago

    All of these people who make being very concerned about AGI part of their public, and apparently also actual, personas are hypocrites at best and con artists at worst.

    How many of such people express vehement public opposition to granting automated military systems the ability to decide whether to fire or not fire?

    We are /just about/ to blow through that barrier, into building software systems that totally remove the human operator from that part of the equation.

    Then we end up pretty quickly with a SkyNet drone air force, and not too long after that, it’s actually conceivable we end up with something like ED-209 as well, except it’s a Boston Dynamics robot mule that can be configured either for hauling cargo or for mounting a rifle or grenade launcher or something like that.

  • Rottcodd@kbin.social · 10 months ago

    Money wins, every time.

    And right there, you answered your own (presumably rhetorical) question.

    The money people jumped on AI as soon as they scented the chance of profit, and that’s it. ALL other considerations are now secondary to a handful of psychopaths making as much money as possible.

      • Omega_Haxors@lemmy.ml · 10 months ago

        Unrelated but is your name a reference to Amy Likes Spiders? That was my favorite poem in DDLC.

        • LadyLikesSpiders@lemmy.ml · 10 months ago

          Probably subconsciously. I came up with the name long after playing the game, but I wasn’t thinking of it when I made it. I actually am just a lady who likes spiders.

          • Omega_Haxors@lemmy.ml · 10 months ago

            I love spiders, and lots of bugs really. I have zero respect for people who look down on them when they’re just so damn cute.

            Like how can anyone look at this and say anything other than “awww”: [jumping spider]

            • LadyLikesSpiders@lemmy.ml · 10 months ago

              awww

              Yeah you’re right. Look at that little cutie <3

              I use the way people treat other animals, especially ones like bugs and stuff, the ones we barely give a second thought, as a measure of character. Phobias are one thing, but at least have compassion for this other living thing.

              • Omega_Haxors@lemmy.ml · 10 months ago

                Very few will get a chance to feel what it’s like to pet a bug and have it go from fearing for its life to trusting you with its life. They genuinely have no framework for a world that treats them as disposable when you show them compassion, and it’s magical how they react.

  • Deceptichum@kbin.social · 10 months ago

    We are nowhere near developing AGI.

    It’s so far fetched that you might as well legislate for time travel and FTL drives while you’re at it.

    • Arin@kbin.social · 10 months ago

      I agree we’re far out, but not as far as you think. Advancements are insane, and AGI could be here in 5–10 years. The way the industry has been attempting it for the past decade is wrong, though; training should be more in-depth than images/videos. I think a few are starting to understand how to do more in-depth training, so even more progress will start soon.

      • brambledog@lemmy.today · 10 months ago

        I think you are being optimistic.

        If you are old enough to remember AIM chatbots, this current generation is maybe a few times more advanced, not exponentially so. From what I have seen, all the incredible advancements have been in image production.

        This leads me to believe that AGI has never been the true commercial goal, but rather an advancement of propaganda media and its creation.

        • WldFyre@lemm.ee · 10 months ago

          This leads me to believe that AGI has never been the true commercial goal, but rather an advancement of propaganda media and its creation.

          Uh what? Why wouldn’t it be because text/image generation isn’t even on the same plane of difficulty as AGI?

  • thru_dangers_untold@lemmy.ml · 10 months ago

    It’s common business practice for the first big companies in a new market/industry to create “barriers to entry.” The calls for regulation are exactly that. They don’t care about safety — just money.

    • Snot Flickerman@lemmy.blahaj.zone (OP) · 10 months ago

      The greed never ends. You’d think companies as big as Microsoft would just be like “maybe we don’t actually need to own everything” but nah. Their sheer size and wealth is enough of a “barrier to entry” as it is.

  • TheAnonymouseJoker@lemmy.ml · 10 months ago

    Everything was clear enough for people like me, when we knew every possible public service and API was being abused to scrape data for OpenAI LLMs. Shit was never “Open,” just like the USA is not a democracy. Fucking shithole country with ultracapitalist leeches and the world’s best media propaganda machinery is what it is, and there are plenty of Sam Altmans, Zuckerbergs, Steve Jobses, and Larry Pages in there, while real people like Steve Wozniak get sidelined.

    Capitalists give 0 fucks about anything other than MOAR money. That includes innovation and tech advancement for our species. Only a socialist framework could give 2 hoots about such grand goals instead of resorting to leeching off public content makers for 0 pennies. Many artists, writers, musicians, programmers, and other creative people will either stop working and stop making the public domain richer, or they will Patreon and copyright the hell out of everything they make. Either way we lose, because nobody got paid respect, and only LLM corpo leeches became rich.

  • Omega_Haxors@lemmy.ml · 10 months ago

    I think that ship sailed when ChatGPT dropped and a ton of journalists instantly lost their job. No need to speculate, it’s already happening.