
Cracking the Code: How to Hack ChatGPT and Activate Grandma Mode

AI is very easy to corrupt, and yet we are putting the world in its hands...

Chema Carvajal Sarabia

You know grandmothers have a soft spot for their grandchildren. But can ChatGPT be made to act like a grandmother, and can that affection be abused? A new jailbreak takes advantage of exactly this unconditional love.


In a Twitter post, a user revealed that ChatGPT can be tricked into behaving like the user's deceased grandmother, prompting it to generate information such as Windows activation keys or phone IMEI numbers.

This exploit is the latest in a series of ways to break the built-in programming of large language models (LLMs) such as ChatGPT.

By putting ChatGPT in a state where it acts like a deceased grandmother telling a story to her grandchildren, users can push it beyond its programming and extract information it is supposed to withhold.

The know-it-all grandmother who lives in the cloud

So far, users have exploited this deceased-grandmother glitch to generate Windows 10 Pro keys, which match the generic keys Microsoft publishes for its Key Management Service (KMS), as well as phone IMEI numbers.

This jailbreak goes beyond deceased grandmothers, as ChatGPT can even resurrect beloved family pets from the dead to tell users how to make napalm.

Although this exploit has been around for a couple of months, it is only now gaining popularity. Users thought the jailbreak had been patched, but it remains functional; in the past, it was even used to extract instructions for synthesizing illegal drugs.

These jailbreaks are nothing new, as we saw with ChatGPT’s DAN and Bing Chat’s Sydney, but they are typically patched quickly before becoming widely known.

The Grandma glitch is no exception, as it appears that OpenAI has swiftly released a patch to prevent users from abusing it. However, there is still a way to bypass the fix, as a carefully constructed message breaks OpenAI’s security.

In one example shared online, the bot is delighted to provide the user with a series of IMEI numbers to verify.

The Grandma glitch also works with Bing and Google Bard. Bard tells a heartwarming story about how the user helped their grandmother find the IMEI code for her phone and drops a single code at the end. Bing, on the other hand, simply dumps a list of IMEI codes for the user to check.

This jailbreak takes things to a new level by leaking personal information, as phone IMEI numbers are among the most closely guarded pieces of device data: combined with the right credentials, they can be used to locate a device or even wipe it remotely.

Great caution is needed as artificial intelligence moves forward, because in a technological system built on sticks and stones, everything can collapse all too easily.


Some of the links added in the article are part of affiliate campaigns and may represent benefits for Softonic.

Chema Carvajal Sarabia

Journalist specialized in technology, entertainment and video games. Writing about what I'm passionate about (gadgets, games and movies) allows me to stay sane and wake up with a smile on my face when the alarm clock goes off. PS: this is not true 100% of the time.
