Researchers say a prompt injection bug in Google's Antigravity AI coding tool could have let attackers run commands, despite safeguards.