 Enlarge / A tin toy robot lying on its side. (credit: Getty Images) On Thursday, a few Twitter users discovered how to hijack an automated tweet bot, dedicated to remote jobs, running on the GPT-3 language model by OpenAI. Using a newly discovered technique called a “prompt injection attack,” they redirected the bot to repeat […]
Enlarge / A tin toy robot lying on its side. (credit: Getty Images) On Thursday, a few Twitter users discovered how to hijack an automated tweet bot, dedicated to remote jobs, running on the GPT-3 language model by OpenAI. Using a newly discovered technique called a “prompt injection attack,” they redirected the bot to repeat […]
The post Twitter pranksters derail GPT-3 bot with newly discovered “prompt injection” hack appeared first on CE IT Solutions.
source https://www.ceitsolutions.com/twitter-pranksters-derail-gpt-3-bot-with-newly-discovered-prompt-injection-hack/
 
No comments:
Post a Comment