Tech behemoth OpenAI has touted its artificial intelligence-powered transcription tool Whisper as having near “human level robustness and accuracy.”

But Whisper has a major flaw: It is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers and academic researchers. Those experts said some of the invented text — known in the industry as hallucinations — can include racial commentary, violent rhetoric and even imagined medical treatments.

Experts said that such fabrications are problematic because Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies and create subtitles for videos.

    • catloaf@lemm.ee · 36 points · 13 days ago

      Bold of you to assume there was any testing process involved beyond “does it run? ship it”

      • _____@lemm.ee · 4 points · 13 days ago

        I don’t understand how AI keeps getting away with delivering software that does not meet obvious specifications

    • Cenotaph@mander.xyz · 16 points · 13 days ago

      You’d think. If I were the one paying for it, I would be changing providers, but you know how that goes. I just work here, man.