Researchers Jailbreak AI by Flooding It With Bullshit Jargon
You can trick AI chatbots like ChatGPT or Gemini into teaching you how to make a bomb or hack an ATM if you make the question complicated, pack it with academic jargon, and cite sources that do not exist. That’s the conclusion of a new paper authored by a team of researchers from Intel, Boise State…