I used to worry that using LLMs to code would let them train on my hard work. Then I realised how bad my code is, so I'm probably single-handedly holding off an AGI catastrophe.