Until they can magically increase context length to a size that conveniently fits the whole codebase, we're safe.
It seems like the billions spent so far have mostly gone toward talk of LLMs replacing every office worker, rather than any action to that effect. LLMs still have major (and dangerous) limitations that make this unlikely.
Models do not need to hold the whole codebase in memory, and neither do you. You both search for what you need. Models can already memorize more than you!