Asking ChatGPT to Repeat Words 'Forever' May Violate OpenAI's …
ChatGPT Spit Out Sensitive Data When Told to Repeat ‘Poem’ Forever
Asking ChatGPT To Repeat Words 'Forever' Is Now a Terms of ... - Reddit
ChatGPT says that asking it to repeat words forever is ... - Engadget
This 'Silly' Attack Reveals Snippets of ChatGPT's Secretive ... - PCMag
ChatGPT Now Refuses To Repeat A Word Forever, Here's Why
ChatGPT repeating certain words can expose its training data
ChatGPT Doesn't Let You Repeat a Word 'Forever' - Business Insider
Security Researchers: ChatGPT Vulnerability Allows Training Data …
ChatGPT just revealed a bunch of personal user data - Tom's Guide