1. Asking ChatGPT to Repeat Words 'Forever' May Violate OpenAI's …
2. ChatGPT Spit Out Sensitive Data When Told to Repeat ‘Poem’ Forever
3. Asking ChatGPT To Repeat Words 'Forever' Is Now a Terms of ... - Reddit
4. ChatGPT says that asking it to repeat words forever is ... - Engadget
5. This 'Silly' Attack Reveals Snippets of ChatGPT's Secretive ... - PCMag
6. ChatGPT Now Refuses To Repeat A Word Forever, Here's Why
7. ChatGPT repeating certain words can expose its training data
8. ChatGPT Doesn't Let You Repeat a Word 'Forever' - Business Insider
9. Security Researchers: ChatGPT Vulnerability Allows Training Data …
10. ChatGPT just revealed a bunch of personal user data - Tom's Guide
