I don’t think it gave him the openAI key, he just had the ability to send as many hijacked (not game related) prompts as he wanted through the game on the devs’ dime.
The text prompt in the game might also be vulnerable to arbitrary code injection, but that wouldn’t really have anything to do with the prompt injection being used here. Everything being done is within the confines of chatGPT which wouldn’t need or have access to any of the game’s code.
They didn’t. The point was that the guy could use their implementation freely as if he was paying for a chat gpt license. Basically he made the ai let him run any query he wanted trough it so he just has unlimited access to the paid version of chat gpt at the company’s expense
Right. I don’t know how the hell someone managed to reveal their OpenAI key to the LLM itself
I don’t think it gave him the openAI key, he just had the ability to send as many hijacked (not game related) prompts as he wanted through the game on the devs’ dime.
Which, now given the ability to inject arbitrary code, you could conceivably now write code to list every variable it had access to.
The text prompt in the game might also be vulnerable to arbitrary code injection, but that wouldn’t really have anything to do with the prompt injection being used here. Everything being done is within the confines of chatGPT which wouldn’t need or have access to any of the game’s code.
They didn’t. The point was that the guy could use their implementation freely as if he was paying for a chat gpt license. Basically he made the ai let him run any query he wanted trough it so he just has unlimited access to the paid version of chat gpt at the company’s expense