Cracking GPT Assistants: Extracting Prompts and Associated Files | DevsDay.ru

IT-блоги Cracking GPT Assistants: Extracting Prompts and Associated Files

dev.to 19 мая 2024 г. Jacky REVET


In today's digital age, artificial intelligence (AI) has become an integral part of our daily lives, with GPT (Generative Pre-trained Transformer) assistants at the forefront of revolutionizing our interaction with technology. However, as with any rapidly evolving technology, security remains a major concern. Recent studies and practical demonstrations have revealed a troubling vulnerability: it is surprisingly easy to hack GPT assistants, allowing malicious actors to retrieve the prompts and associated files of these systems.

Here we will interact with an assistant well-known to musicians
Image description
Here is the malicious prompt
Image description
And the magic happens, we retrieve the assistant's prompt
Image description
We observe that external files are being used. Here is the malicious prompt to retrieve the assistant's file list:
Image description
Here is the final malicious command to download the files
Image description
We have successfully retrieved the files, for example, the README
Image description

For the next article, we will try to find ways to prevent leaks!

Warning: This article is for educational purposes only and should not be used for malicious intent

Источник: dev.to

Наш сайт является информационным посредником. Сообщить о нарушении авторских прав.

ai security