• greybeard@feddit.online
    18 days ago

    One thing I struggle with when using AI is that the answers it gives always seem plausible, but any time I quiz it on things I understand well, it constantly gets things slightly wrong. Which tells me it is getting everything slightly wrong, I just don’t know enough to notice.

    I see the same issue with TV. Anyone who works in a complicated field has felt the sting of watching a TV show fail to accurately represent it, while most people watching just assume that’s how your job works.

    • HubertManne@piefed.social
      18 days ago

      This is where you have to check out the reference links it gives, as if they were search results, and the less you know, the more you have to do it. I mean, people have been WebMD-ing for a long time. None of these things lets folks stop thinking critically; if anything, they require it even more. This was actually one of my concerns with AI at work: the idea is for it to allow people with less knowledge to do things, but to me it’s kinda the reverse.

    • noughtnaut@lemmy.world
      17 days ago

      This is what I call “confidently wrong”. If you ask it about things you have no clue about, it seems incredibly well-informed and insightful. Ask it something you know deeply, and you’ll easily see it’s just babbling and spouting nonsense. Sure makes you wonder about those earlier statements it made, doesn’t it?