That’s been the trend for a while. Can you make a concrete prediction of the form “AI will not be able to do X by 2028” for a specific, well-defined X?
In 2030, an AI model that I can run on my computer, without having to trust an evil megacorporation, will not be able to write a compiler for my markup language [0] based on a corpus of examples, without seeing the original implementation, using no more than 1.5× as much code as I did.
[0] https://git.sr.ht/~xigoi/hilda