I’ve been playing with Gemma4, and in one instance it took me about half an hour to get it to acknowledge a statement I made; it kept denying the contents of an official Google page I had asked it to search and parse. It lied, and at one point suggested that I was the one hallucinating! If you install it and try it, put it to the test, hard.

  • cookiecoookie@lemmy.world · 9 hours ago

    Google’s AI has been way off track for a long time. Even their Gemini Pro models are almost always wrong when it comes to anything technical.

  • Chahk@beehaw.org · 13 days ago

    Here’s the thing: every LLM “answer” is a hallucination; some just happen to be closer to reality than others.