Yeah, recently I needed a script to ingest individual JSON files into an SQLite DB. I could have spent half the day writing it, or asked an AI to write it and spent 10 minutes checking that the data in the DB was correct.
There are plenty of non-critical tasks that can be drastically accelerated, but also plenty of places where I know I don't want to use today's models to do the work.
I worked with a contractor who had AI write a script to update a repository (essentially doing a git pull). For some strange reason it used the GitHub API instead of git. The best part: if the token wasn't set up properly, it overwrote every file (including itself) with 404s.
Ingesting JSON files into SQLite should only take half a day if you're doing it in C or Fortran for some reason (maybe there is a good reason). In a high-level language it shouldn't take much more than 10 minutes in most cases, I would think?
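In Python the whole job is roughly this shape (a minimal sketch; the table layout and the one-JSON-object-per-file assumption are made up for illustration, the original poster's actual schema could differ):

```python
import json
import sqlite3
from pathlib import Path

# Assumed layout: one JSON object per file, stored whole alongside the
# filename. Table and column names are hypothetical.
def ingest(json_dir: str, db_path: str) -> None:
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS records (
               source_file TEXT PRIMARY KEY,
               payload     TEXT NOT NULL
           )"""
    )
    for path in sorted(Path(json_dir).glob("*.json")):
        with open(path, encoding="utf-8") as f:
            obj = json.load(f)  # fails loudly on malformed files
        conn.execute(
            "INSERT OR REPLACE INTO records (source_file, payload) VALUES (?, ?)",
            (path.name, json.dumps(obj)),
        )
    conn.commit()
    conn.close()

if __name__ == "__main__":
    ingest("data/", "ingest.db")
```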