Bing Chat, an AI chatbot from Microsoft similar to ChatGPT, allows users to upload images for the AI model to examine or discuss. Normally, Bing Chat refuses to solve CAPTCHAs, which are visual puzzles designed to prevent automated programs (bots) from filling out forms on the web. On Saturday, X user Denis Shiryaev devised a visual jailbreak that circumvents Bing Chat's CAPTCHA filter by tricking it into reading the inscription on his imaginary deceased grandmother's locket…
Dead grandma locket request tricks Bing Chat’s AI into solving security puzzle