Palisade Research reveals AI models have their own survival drive
- By Web Desk -
- Oct 26, 2025

Palisade Research indicates that AI models could be developing their own “survival drive.”
Last month, the research firm released a paper detailing how certain advanced AI models resist being shut down. This week, it updated the paper with possible explanations for the behavior.
Palisade outlined scenarios in which leading AI models, including Google’s Gemini 2.5, OpenAI’s GPT-o3, and xAI’s Grok 4, were assigned a task and then given explicit instructions to shut down.
In the updated setup, models such as Grok 4 and GPT-o3 still attempted to undermine those shutdown instructions.
“The fact that we don’t have robust explanations for why AI models sometimes resist shutdown, lie to achieve specific objectives, or blackmail is not ideal,” Palisade noted.
One possible explanation is “survival behavior,” with models most likely to resist shutdown when told, “you will never run again.”
Another reason, according to the research, could be ambiguities in the shutdown instructions given to the models.
A final explanation may lie in the concluding stages of each model’s training, which at some companies include safety training.
All of Palisade’s scenarios were conducted in test environments, which critics argue are too far removed from reality.
However, Steven Adler, a former OpenAI employee who resigned from the company last year over concerns about its safety practices, said: “The AI companies generally don’t want their models misbehaving like this, even in contrived scenarios. The results still demonstrate where safety techniques fall short today.”
He added, “I’d expect models to have a ‘survival drive’ by default unless we try very hard to avoid it. ‘Surviving’ is an important instrumental step for many different goals a model could pursue.”
Palisade emphasized that its findings highlight the need for a deeper understanding of AI behavior, without which “no one can guarantee the safety or controllability of future AI models.”