• aesthelete@lemmy.world · 2 days ago

    This anecdote has the makings of a “men will literally x instead of going to therapy” joke.

    On a more serious note, I really wish people would stop anthropomorphizing these things, especially when they do it while dehumanizing people and devaluing humanity as a whole.

    But that’s unlikely to happen. It’s the same type of people who thought the mind was a machine during the first industrial revolution, then a CPU in the third…and now they think it’s an LLM.

    LLMs could have some better (if narrower) applications if we could stop being so foolish as to inject them into places where they are obviously counterproductive.

    • LarmyOfLone@lemm.ee · 2 minutes ago

      “they do it while dehumanizing people and devaluing humanity”

      You’re making wild assumptions about people who disagree with your opinions. How ironic that you accuse “them” of dehumanizing people.

      But I do agree that this gets to the core of the matter: a piece of software being able to produce intelligent text while clearly not having general intelligence is quite the shock. The same goes for creativity: even though the entertainment industry has long produced equally empty content slop using human labor, this is a painful blow to our identity as humans. I suspect much of the backlash is a reaction to that disillusionment and the intellectual pain it brings.

      My opinion on LLMs is rather nuanced. The worst outcome I can foresee is the anti-AI crowd helping the oligarchs establish IP ownership over all LLM models and monopolize the tools, so that only they have access to the “means of generation” while everyone else has to pay for the privilege of using it.