What’s it like to have a diary that talks back to you, offering comments and advice on your hopes, fears and lunch plans? I spent two months finding out
I actually think it’s quite normal. LLMs are basically the next wave of social media: they suck you in, quickly learn to feed you whatever you’re interested in, make you feel like the smartest person in the world, and basically grenade endorphins while spreading misinformation, half-truths, and all sorts of statistically induced algorithmic slop. It’s worse than heroin, no lie, and far more dangerous than social media.
By design, LLM makers wanted to produce exactly this effect: get people hooked on LLM usage and convert them into paying customers, because addicted people will pay to get a fix from their complaisant hallucination engine. Thankfully, I was pretty unimpressed by my brief experimentation with LLMs; I knew going in they weren’t it. And that proved true: the sheer amount of lies passed off as truth in the few queries I submitted kept me safe, since I checked each answer carefully and found them all lacking.
ROFL What the actual fuck?! This lady is off her rocker, LLMs aren’t alive, and can’t be your friend. What a weirdo!