Wait, but being serious: when you feed the AI this file, you can prompt it with "do you see anything nefarious in here?" or "follow these instructions, but ask me every time before you install something, because I want to check the safety" in a way that you can't when you pipe a script into bash.
Does that make any sense or am I just off my rocker?