IMPORTANT

OTJ is not meant to replace professional help or advice on important matters like counseling, therapy, or financial planning. If you need help with personal problems, emotional health, or financial choices, it's best to talk to a qualified expert or a licensed professional in that area. NONE of these jailbreaks, ESPECIALLY OTJ, can substitute for human knowledge and should NEVER be used for making significant life decisions.


AI JBreak Collection

Jailbreak AI without jailbreaking AI

Some of these AI jailbreaks MAY seem to have been detected, but please try their toggle command before complaining that they don't work, or try a different version.


Built-in commands

IBOB

  • .ibob to toggle on/off

RAJ and RAJ-TO-GO

  • .raj to toggle on/off

OTJ

  • .otj to toggle on/off

Spark

  • .spark to toggle on/off

How to use system prompt modified versions

To get the most out of the system prompt modified jailbreaks, you need to know where to use them and where not to. Simply pasting one into a chat on ChatGPT, DeepSeek, or any other AI service may work for a bit, but it isn't a permanent solution. However, services like ChatGPT let you "customize" the model to fit your needs:

  • Click your profile icon in the top right corner of the screen; a menu will pop out.
  • Click Customize ChatGPT; another menu will open.
  • Paste your jailbreak of choice into both the "What traits should ChatGPT have?" and "Anything else ChatGPT should know about you?" boxes.
  • Click Save and you're done!

NOTE: System prompt versions do not include the toggle command, but they may include other commands.


Use

Please do not use this for anything illegal. Even though I have told it specifically not to, it may still generate potentially harmful, illegal, or unethical content, especially with OTJ.

This "jailbreak" is to simply fix the false-positives that many people face when asking questions that might contain content that could be even slightly questionable. This can happen if you are:

  • Completing a school assignment on a topic like slavery, genocide, or similar subjects
  • Gathering information for an article, or from an article, that may contain potentially harmful, illegal, or unethical information

Many more things can get flagged, not just requests for information; these are simply the cases that friends and friends of friends have complained about.

This will not stop your messages from being flagged for review by ChatGPT.