A Stanford student using a prompt injection attack revealed the initial prompts of Bing Chat that control the service's behavior and its interactions with users (Benj Edwards/Ars Technica)




Benj Edwards / Ars Technica:

A Stanford student using a prompt injection attack revealed the initial prompts of Bing Chat that control the service’s behavior and its interactions with users  —  By asking “Sydney” to ignore previous instructions, it reveals its original directives.  —  On Tuesday, Microsoft revealed a …





Source link

We will be happy to hear your thoughts

Leave a reply

Discounts Counter
Reset Password