• andrewth09@lemmy.world
    16 hours ago

    I don’t understand why the researcher needed to contact the FBI to report this; just drop it in BugCrowd and call it a day. It’s a ChatGPT jailbreak, not a Debian zero-day.

  • Vendetta9076@sh.itjust.works
    18 hours ago

    I don’t understand why we keep caring about this. AI “guard rails” will never be 100% effective. Beyond that, everything is a Google search away anyway. Why does this matter?

  • Kaboom@reddthat.com
    1 day ago

    So it’ll tell you how to make weapons and stuff like that? The instructions to make a pipe bomb and a shotgun are freely available, and even if they weren’t, it’s not hard to figure out.

    I don’t understand his anxiety.

    • ChicoSuave@lemmy.world
      1 day ago

      It’s for the people who can’t figure it out on their own; for them, the only thing standing between the knowledge and its misuse is a set of clear instructions.