AI-powered Bing Chat spills its secrets via prompt injection attack

By asking “Sydney” to ignore previous instructions, it reveals its original directives. Read more

Similar