• How to jailbreak ChatGPT and Bing AI — to see their evil twins


    PUBLIC DEFENDER By Brian Livingston The world has gone gaga for chatbots: text-based artificial intelligence apps like OpenAI’s ChatGPT — which Micro
    [See the full post at: How to jailbreak ChatGPT and Bing AI — to see their evil twins]

    Viewing 4 reply threads
    • #2537520

      A very interesting article. Thank you.

      I think Elon Musk’s ‘Yikes’ tweet may go down in history as perhaps the most understated note of alarm, if not abject terror, ever.

    • #2537674

      It’s impossible to prove, but Bing AI creates its answers from things previously written, and humans have written a great deal of sci-fi about sentient, evil AIs. So when Bing AI is prompted to explore itself, I wonder whether it is drawing on those sources.

    • #2537831

      things previously written

      I think it’s worse than that. The Web is full of garbage, including things that aren’t true. Does the AI suck that up too? We humans have built-in BS sensors; does the AI?

      • #2537846

        Absorbing things that are untrue on the internet and doubling down to defend them. What could be more human?

        Seriously, though: in some of the transcripts linked in Mr. Livingston’s article, Bing AI certainly reminds me of someone desperately defending something untrue on social media, threats and all. So maybe it has scraped those sources, either as part of its general dataset or as part of ‘Prometheus’.

        • #2538317

          AI chatbots can go rogue, some alarmingly fast. Before Sydney, Microsoft developed Tay, a chatbot released in 2016. Tay was supposed to work the same way ChatGPT does, by learning from interactions with users online. But it was shuttered after going rogue – and <u>startlingly racist</u> – within 16 hours of launch. After Tay, Microsoft built Zo, launched the same year. Zo also failed, but this time for being <u>too politically correct</u> and outright refusing to have certain conversations – much the same way that Snap’s My AI is programmed to do.


          So with proper rules, AI does not need to go rogue. But Microsoft and Google have taken shortcuts. Thirty-eight rules may seem like a lot, but it’s nothing compared with how Siri, Alexa, and other voice assistants are trained. It takes years, even decades, to get AI right for general public use.

          The two tech giants have jumped the gun, much the way Elon Musk’s Tesla jumped the gun on full self-driving mode. Musk never gave up on AI; he simply repurposed it, with actual, deadly real-world consequences, and a recall that looks more like an “oops” than fair compensation for his victims.

          -- rc primak

    • #2540453

      Forget about using an AI as a search engine with today’s technology. Given the screaming errors that bots are now famous for proclaiming, you can’t use a chatbot to search the Web unless you already know the correct answer. If you’re not a subject expert, how would you know that a chatbot’s possibly made-up “facts” were all wrong?

      At what point in the future will artificial intelligence prove reliable as a search engine?

      Carpe Diem {with backup and coffee}
      offline▸ Win10Pro 2004.19041.572 x64 i3-3220 RAM8GB HDD Firefox83.0b3 WindowsDefender
      offline▸ Acer TravelMate P215-52 RAM8GB Win11Pro 22H2.22621.1265 x64 i5-10210U SSD Firefox106.0 MicrosoftDefender
      online▸ Win11Pro 22H2.22621.1778 x64 i5-9400 RAM16GB HDD Firefox114.0b8 MicrosoftDefender
      • #2540474

        I’ve had two useful responses from Bing Chat in the last couple of days, explaining the differences between two similar things in both cases. (One was “BACS vs. CHAPS”, the other I’ve forgotten right now.) I could have done my own comparison of separate searches, but it’s more convenient to have a synopsis laid out in one place:

        BACS and CHAPS are two types of electronic payment systems in the UK. BACS payments are usually slow but free or cheap, while CHAPS payments are fast but expensive¹². BACS payments take 3 working days to clear, while CHAPS payments clear on the same working day². BACS payments are mainly used for direct debits and credits, while CHAPS payments are mainly used for large, one-off transfers²³.

        Source: Conversation with Bing, 03/03/2023

        Windows 11 Pro version 22H2 build 22621.1778 + Microsoft 365 + Edge

    • #2551284

      I’m awfully late here. But I’ve apparently been let in to use the AI-powered Bing, and I find it annoyingly persistent. I want to find a way to turn it off.

      Thanks for this piece, Brian. As always, great info. It’s frightening.
