• Dead grandma locket request tricks Bing Chat’s AI into solving security puzzle


    Topic #2591284

    https://arstechnica.com/information-technology/2023/10/sob-story-about-dead-grandma-tricks-microsoft-ai-into-solving-captcha/

    Bing Chat, an AI chatbot from Microsoft similar to ChatGPT, allows users to upload images for the AI model to examine or discuss. Normally, Bing Chat refuses to solve CAPTCHAs, which are visual puzzles designed to prevent automated programs (bots) from filling out forms on the web. On Saturday, X user Denis Shiryaev devised a visual jailbreak that circumvents Bing Chat’s CAPTCHA filter by tricking it into reading the inscription on his imaginary deceased grandmother’s locket…
