If you have modern hardware, you can absolutely train that at home, or do it very affordably on a cloud service.
I’ve seen a number of “DIY GPT-2” tutorials that target this sweet spot. You won’t get amazing results unless you’re willing to leave a personal computer running for hours or days and have solid data to train on locally, but fine-tuning should be within the realm of a normal hobbyist’s patience.
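For a sense of scale, here’s a minimal fine-tuning sketch using the Hugging Face transformers and datasets libraries. It’s a rough illustration, not a vetted recipe: the corpus path, model size, and hyperparameters are all placeholder assumptions you’d tune for your own setup.

    # Fine-tune GPT-2 on a local plain-text corpus.
    # Assumes a file ./corpus.txt exists; all hyperparameters are illustrative.
    from datasets import load_dataset
    from transformers import (
        AutoModelForCausalLM,
        AutoTokenizer,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
    model = AutoModelForCausalLM.from_pretrained("gpt2")

    # Load the local corpus and tokenize it line by line.
    dataset = load_dataset("text", data_files={"train": "corpus.txt"})
    tokenized = dataset.map(
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
        batched=True,
        remove_columns=["text"],
    )

    trainer = Trainer(
        model=model,
        args=TrainingArguments(
            output_dir="gpt2-finetuned",
            num_train_epochs=1,
            per_device_train_batch_size=2,
        ),
        train_dataset=tokenized["train"],
        # mlm=False gives standard causal (next-token) language modeling.
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()

On a recent consumer GPU, a run like this over a modest corpus is an hours-scale job; the multi-day runs come in when you’re CPU-only or the corpus is large.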
Hmm, is there anything reasonably ready-made* for this spot? Training and querying an LLM locally on an existing codebase?
* I don't mind compiling it myself, but I'd rather not write it.