• LostWanderer@fedia.io
    · 3 days ago

    ROFL What the actual fuck?! This lady is off her rocker. LLMs aren’t alive and can’t be your friend. What a weirdo!

    • GrindingGears@lemmy.ca
      · 1 day ago

      I actually think it’s quite normal. LLMs are basically the next wave of social media: they suck you in, quickly learn to feed you whatever you’re interested in, make you feel like the smartest person in the world, and basically grenade endorphins at you while spewing misinformation, half-truths, and all sorts of statistically induced algorithmic slop. It’s worse than heroin, no lie, and far more dangerous than social media.

      • LostWanderer@fedia.io
        · 22 hours ago

        By design, LLM makers did indeed want to produce this effect in other people…Getting them hooked on the LLM usage, and convert them into paying customers. As addicted people will pay in order to get a fix from their LLM hallucination engine that has a complaisant tendency. Thankfully, I was pretty unimpressed by my brief experimentation with LLMs, as I knew going in they were not it. Which was true, because the amount of lies passed off as truth within the few queries I submitted…Kept me safe, as I checked each answer carefully and found them all lacking.