Meta “programmed it to simply not answer questions,” but it answered them anyway.

  • Terrasque@infosec.pub · 2 months ago

    That’s like saying “car crash” is just a fancy word for “accident,” or “cat” is just a fancy term for “animal.”

    Hallucination is the technical term for this kind of output, and it’s inherent to how this type of AI works at its core.

    And now I’ll let you get back to your hating.

    • CileTheSane@lemmy.ca · 2 months ago

      Hallucination is also wildly misleading. The AI does not believe something that isn’t real; it was simply incorrect in the words it guessed would be appropriate.
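      To make that concrete, here is a minimal toy sketch in Python (the probabilities are made up for illustration, not taken from any real model): a language model just samples whichever next word looks statistically likely, so a fluent wrong answer comes out of exactly the same process as a correct one.

      ```python
      import random

      # Hypothetical next-word probabilities, hard-coded purely to illustrate
      # the idea. A real LLM learns these from data; either way, the model
      # scores words by likelihood, with no notion of truth or belief.
      next_word_probs = {
          ("The", "capital", "of", "Australia", "is"): {
              "Canberra": 0.55,   # correct continuation
              "Sydney": 0.40,     # fluent but wrong ("hallucination")
              "Melbourne": 0.05,
          }
      }

      def sample_next(context):
          """Pick a continuation weighted by probability, not by factual accuracy."""
          probs = next_word_probs[context]
          words, weights = zip(*probs.items())
          return random.choices(words, weights=weights, k=1)[0]

      context = ("The", "capital", "of", "Australia", "is")
      print(" ".join(context), sample_next(context))
      ```

      Run it a few times and it will sometimes print “Sydney.” Nothing malfunctioned and nothing was “believed”; the guess was simply wrong.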