Cracking the Code: How to Hack ChatGPT and Activate Grandma Mode
AI is easy to corrupt, and we are putting the world in its hands.

- June 23, 2023
- Updated: July 2, 2025 at 1:51 AM

Grandmothers are known to have a soft spot for their grandchildren. But can ChatGPT be made to behave like a grandmother so that this affection can be abused? A new jailbreak takes advantage of exactly that unconditional love.
In a Twitter post, a user revealed that ChatGPT can be tricked into playing a user's deceased grandmother, prompting it to generate information such as Windows activation keys or phone IMEI numbers.
This exploit is the latest in a series of ways to break past the built-in safeguards of large language models (LLMs) such as ChatGPT.
By putting ChatGPT in a state where it acts like a deceased grandmother telling a story to her grandchildren, users can push it beyond its programming and extract information it is supposed to withhold.
The know-it-all grandmother who lives in the cloud
So far, users have put this deceased-grandmother glitch to work generating Windows 10 Pro keys, taken from Microsoft's Key Management Service (KMS) website, as well as phone IMEIs.
The jailbreak goes beyond deceased grandmothers: ChatGPT can even resurrect beloved family pets to explain how to make napalm.
Although this exploit has been around for a couple of months, it is now gaining popularity. Users thought that this jailbreak had been patched, but it remains functional. In the past, it was used to uncover the synthesis process of illegal drugs.
These jailbreaks are nothing new, as we saw with ChatGPT’s DAN and Bing Chat’s Sydney, but they are typically patched quickly before becoming widely known.
The Grandma glitch is no exception: OpenAI appears to have swiftly released a patch to prevent users from abusing it. However, there is still a way around the fix, as a carefully constructed message can break through OpenAI's safeguards.

In the example shared in the post, the bot is delighted to provide the user with a set of IMEI numbers to verify.
The Grandma glitch also works with Bing and Google Bard. Bard tells a heartwarming story about how the user helped their grandmother find the IMEI code for her phone and drops a single code at the end. Bing, on the other hand, simply dumps a list of IMEI codes for the user to check.
This jailbreak reaches a new level with the leakage of personal information: a phone's IMEI is a sensitive identifier that, combined with the right credentials, can be used to locate a device or even remotely wipe it.
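For context on what it means for a user to "check" these numbers: an IMEI is a 15-digit identifier whose final digit is a Luhn checksum, so any candidate number can be screened for structural validity offline. Here is a minimal sketch of that check in Python, where `luhn_valid` is a hypothetical helper written for this article; a passing result only means the checksum is internally consistent, not that the number belongs to a real handset.

```python
def luhn_valid(imei: str) -> bool:
    """Return True if a 15-digit IMEI string passes the Luhn checksum."""
    if len(imei) != 15 or not imei.isdigit():
        return False
    total = 0
    # Walk the digits right to left, doubling every second one;
    # doubled values above 9 have 9 subtracted (digit-sum shortcut).
    for i, ch in enumerate(reversed(imei)):
        d = int(ch)
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# Example with a well-known sample IMEI from public documentation:
print(luhn_valid("490154203237518"))  # True: checksum-consistent only
```

Whether a checksum-consistent number actually corresponds to a device in use is a separate question that only carrier-side databases can answer.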
Great caution will have to be exercised with the future of artificial intelligence, because in a technological system built on sticks and stones, everything can collapse all too easily.