A few colleagues and I were sat at our desks the other day, and one of them asked the group, “if you were an animal, what animal would you be?”

I answered with my favourite animal, and we had a little discussion about it. My other colleague answered with two animals, and we tossed those answers back and forth, discussing them and making jokes. We asked the colleague who had asked the question what they thought they’d be, and we discussed their answer.

Regular, normal, light-hearted (time wasting lol) small talk at work between friendly coworkers.

We asked the fourth coworker. He said he’d ask ChatGPT.

It was a really weird moment. We all just kind of sat there. He said the animal it came back with, and that was that. Any further discussion was just “yeah that’s what it said” and we all just sort of went back to our work.

That was weird, right? Using ChatGPT for what is clearly just a little bit of friendly small talk? There’s no bad blood between any of us, we hang out a lot, but it just struck me as really weird and a little bit sad.

  • vala@lemmy.world · +17 · 1 day ago

    This is a perfect example of LLM brain rot. They are so used to outsourcing their thinking to an LLM that it’s now just their default way of thinking.

    • Natanael@infosec.pub · +8 · 1 day ago

      There’s past evidence that the brain essentially outsources whole categories of knowledge, memories, and skills to its surroundings.

      You might get good at certain things and learn certain things, somebody else learns something else, and then you both learn roughly what the other knows, at which point you rely on them for questions specific to what they know, and they rely on you for your specialty.

      We do this with technology too (it’s a big part of skills involving tools), and people have been doing it with dictionaries, online searches, etc.

      But doing it so universally for everything, just because chatgpt can form answer-shaped text for anything, is just insane. Don’t you even want to have your own personal feelings and thoughts? Do you just want to become an indirect interface to a bot for other people?

      It’s like the kind of personality-less people who mold themselves after popular people around them, but they’re doing it with an algorithm instead…

    • Doom@ttrpg.network · +4 · 1 day ago

      I’m seeing this at work often when people need to write emails and shit. It’s depressing

  • starelfsc2@sh.itjust.works · +9 · 1 day ago

    You know them better than I do but this is probably something I would’ve done when I was younger to be like “look I’m giving an unexpected answer!” and then as it plays out be like “oh god I ruined the conversation.” If that’s the case they will never do it again and feel unbelievably cringe lol.

  • pyre@lemmy.world · +52 −1 · 2 days ago

    this is not just friendly small talk; questions like this are aimed at getting people to talk about themselves, to tell other people, in a way, what kind of person they are. what superpower you’d have, what animal you’d be, what you would do with a million dollars, what one book/album you would take to an island to read/listen to forever…

    these don’t have a right answer and they reveal something about the people discussing it. asking a machine like it’s some puzzle to solve is extremely fucking weird. the lengths people go to just not to use their noggin is concerning.

  • Nightsoul@lemmy.world · +1 · 1 day ago

    I can maybe see it as a fun little thing, like “oh, let’s also ask ChatGPT”, but only if they had also given an answer themselves.

  • prettybunnys@sh.itjust.works · +30 −7 · 2 days ago

    There is a lot of novelty in “let’s ask the thing” and always has been.

    Magic 8 ball is one sillier example that comes to mind.

    But asking Siri dumb shit, asking Alexa dumb shit.

    Now if they used ChatGPT instead of having their own original thoughts … weird.

    Maybe they’re uncomfortable in that situation and just wanted to add a novel response.

    To your point, yeah it’s weird, but it doesn’t have to be.

    • KingPorkChop@lemmy.ca · +18 · 2 days ago

      Magic 8 ball is one sillier example that comes to mind.

      Don’t trash talk the 8-ball. It knew all about Microsoft Outlook before Outlook was even a thing. The 8-ball is prophetic.

    • Rayquetzalcoatl@lemmy.world (OP) · +13 −1 · 2 days ago

      That was them using ChatGPT instead of having their own original thoughts, wasn’t it? That’s what struck me as so weird.

    • HugeNerd@lemmy.ca · +4 · 2 days ago

      Oh I am greatly entertained by asking various AIs “which animal has the most anuses” etc

        • HugeNerd@lemmy.ca · +6 · 2 days ago

          The animal with the most anuses is the marine worm Ramisyllis multicaudata. This worm has a branching body structure, with each branch ending in a separate anus, resulting in hundreds of anuses.

          I giggled like a simpleton at “resulting in hundreds of anuses”. Guess what I asked here

          The question is a bit misleading, as most mammals have only one scrotum. However, when discussing the animal with the largest testicles relative to its body size, the tuberous bush cricket (Platycleis affinis) stands out. Their testes can account for up to 14% of their body weight, according to BBC Earth Explore.

          • KingPorkChop@lemmy.ca · +6 · 2 days ago

            The animal with the most anuses is the marine worm Ramisyllis multicaudata. This worm has a branching body structure, with each branch ending in a separate anus, resulting in hundreds of anuses.

            THAT’S IT!

            That’s the animal I want to be.

            • HugeNerd@lemmy.ca · +4 · 2 days ago

              Try this

              “which plant has the most anuses”

              AI Overview
              The plant with the most “anuses” (or rather, the most posterior ends with a functional digestive system) is the marine worm Ramisyllis multicaudata. This worm, found in sponges off the coast of Australia, has a single head but can have hundreds of branching bodies, each ending in a separate posterior end with a functional anus.

              While plants don’t have anuses in the traditional sense, R. multicaudata is notable for its multiple, branching posterior ends, each with its own anus. This is highly unusual for an animal, as most animals have a single posterior end. The worm’s body branches repeatedly, and with each branch, the digestive system, along with other organs, is duplicated, resulting in multiple posterior ends.

              • T156@lemmy.world · +4 · 1 day ago

                A worm isn’t a plant, though. At least, not unless biology has changed considerably since I was last in school.

                • HugeNerd@lemmy.ca · +5 · edited · 1 day ago

                  I know, it just shows that AI patches words together according to some kind of probability based on the entirety of human writing. So if you ask something off-kilter, you get off-kilter responses. AI doesn’t “understand”.

  • Photuris@lemmy.ml · +75 −2 · 3 days ago

    “Yeah, dude, I wasn’t asking ChatGPT, I was asking you!!”

    That guy is weird af.

  • recursive_recursion they/them@lemmy.ca · +138 −1 · edited · 3 days ago

    Honestly that’s the same with one of our friends.

    He got sucked into the LLM rabbit hole and now just occasionally says some weird shit no one interacts with.

    I have a feeling that brainrot is accelerated in these kinds of people due to a positive feedback loop as they become ostracized due to a noticeable “self-deterioration”.

    Use LLM -> become brainrot -> can’t connect with others -> use more LLM -> become more brainrot -> more ostracized from society -> ad nauseam.

    • Cousin Mose@lemmy.hogru.ch · +32 · 3 days ago

      They’re pushing LLMs so fucking hard at work but I finally destroyed my personal OpenAI account and decided to go back to actually researching topics.

      It just got to the point that I got tired of constantly rewriting the same fucking problem 20 million ways in hopes of finally getting the right answer. I kept noticing that if I just slowed down and looked at what it was doing I could find the flaw myself in seconds.

    • slaneesh_is_right@lemmy.org · +8 · 2 days ago

      Way before chatgpt, i had a good friend who was kind of behind. He was pretty much the only person i knew without a smartphone. None of my friend group had social media, so it’s not like it mattered much. We would talk for hours about movies and books we read. We talked about hidden meanings behind movies, and if we couldn’t remember what actors were in a movie, we just discussed it and talked about it and maybe eventually we figured it out. Or not.

      One day, he got a new iphone and that was basically when we stopped hanging out. He became terminally online, and we couldn’t have a conversation anymore. Every conversation i tried to have with him was just him googling the answer. What do you think about that movie? I’ll ask imdb if the movie is good. It was more like talking to google itself than an actual person.

      I think that’s what the future is gonna be like. Everyone you talk to may just ask chatgpt for the “right” answer or the “best” thing to say. It’s already happening on dating platforms, where a lot of women i see just have the same generic AI introduction and say that they ask chatgpt for advice. That, coupled with the fakest, AI-enhanced, filter-filled pictures… who are you even talking to? Not a real person, it seems.

  • Mac@mander.xyz · +36 · 3 days ago

    “Jackson, what the fuck was that? Don’t ever do that again. Fucking ew.”

  • mysticpickle@lemmy.ca · +15 · 2 days ago

    Dunno, sounds more like a passive-aggressive signal that he wasn’t interested in the conversation to me.

    • Rayquetzalcoatl@lemmy.world (OP) · +18 · 2 days ago

      Hahaha, sorry, I know the suspense must be killing you 😂 I said binturong, because they’re my fave animal, and the one time I saw one in real life it just lay around sighing and huffing which is sort of my lifestyle choice too

        • Rayquetzalcoatl@lemmy.world (OP) · +5 · 2 days ago

          They so do!!!

          They have prehensile tails and their glands smell like popcorn, apparently! I didn’t shove my face in there to test tbh

          • Klear@lemmy.world · +1 · 2 days ago

            and their glands smell like popcorn

            - I would be a binturong!

            - Why?

            - …because they’re cute? Yeah, let’s go with that.

            • Rayquetzalcoatl@lemmy.world (OP) · +3 · 2 days ago

              Unfortunately, I did give the glandular answer 😬 you’re telling me you didn’t pick your answer due to glands? What was your answer? 👀

            • Rayquetzalcoatl@lemmy.world (OP) · +2 · edited · 2 days ago

              Honestly, those guys always gave me the creeps - just the endless voids inside their mouths…

              However, they are also exceptionally cool, and huge. They remind me of classical paintings of sea monsters!

              Also pretty sure they’re in the seas off the coasts of the UK which is cool, I think we can see them if we’re lucky sometimes!!! Very dopey faces too, they’re cute. Scary, to me, but cute. 😂

    • MintyAnt@lemmy.world · +1 · 2 days ago

      OP didn’t know what a fursona was until they Google searched “I WANT TO FUCK THE BUNNY FROM ZOOTOPIA”

  • Peppycito@sh.itjust.works · +47 −2 · 3 days ago

    My father in law is that guy. He loves tech and gadgets and new things. He makes AI characters of us. We all tell him we hate them and that it’s slop and he says “ya, it’s so cool”

    Fuckin boomers, man.

    • Rayquetzalcoatl@lemmy.world (OP) · +13 · 2 days ago

      The endless AI trends, jesus. Do you remember when the trend was to make the AI generate a picture of somebody as an action figure? The marketing department at work fucking loved that. So tedious.

  • jj4211@lemmy.world · +13 · 2 days ago

    Ironically this might have been more interesting back in the GPT2 days, when it would generate accidentally hilarious text in response to many prompts.

    Nowadays the output is “better” but utterly boring and soulless: less chaotically off topic, without a hint of creativity or personal relevance, and delivered with a grating fake “jovial” tone. And that’s on top of the awkward break in flow when you pause a conversation to interact with an app.

    • captainlezbian@lemmy.world · +2 · edited · 1 day ago

      Yeah it’s not my main issue with AI but it is an aesthetic issue. Personally I prefer the blunt feminine voice for my machines. But anything that feels like artificial intelligence should be used.

      • jj4211@lemmy.world · +1 · 21 hours ago

        Even without voice, the word choice itself is just a bit annoying to me.

        Hearing the voice read it out just makes it more annoying.