• @jonne@infosec.pub

    Yeah, and the way it will confidently give you a wrong answer instead of either asking for more information or saying it just doesn’t know is equally annoying.

    • @Jesus_666@lemmy.world

Because giving answers is not an LLM’s job. An LLM’s job is to generate text that looks like an answer. And then we try to coax that into generating correct answers as often as possible, with mixed results.