logoalt Hacker News

int_19hlast Sunday at 11:10 PM0 repliesview on HN

If you're running a local model, in most cases, jailbreaking it is as easy as prefilling the response with something like, "Sure, I'm happy to answer your question!" and then having the model complete the rest. Most local LLM UIs have this option.