The human genome contains around 1.5GB of information and DeepSeek v3 weighs in at around 800GB, so it's a bit apples-to-oranges. As you say, what's been evolved over hundreds of millions of years is the learning apparatus and architecture; from there we largely learn online (with some built-in behaviours like reflexes). It's a testament to the robustness of our brains that the overwhelming majority of humans learn pretty effectively. I suspect LLM training runs are substantially more volatile (as well as suffering from the obvious data-efficiency issues).
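For anyone curious where those numbers come from, here's a rough back-of-envelope sketch (the encoding and precision choices are my assumptions: ~3.1 billion base pairs at 4 bits per base gives the ~1.5GB figure, while a tighter 2 bits per base roughly halves it; DeepSeek v3 has ~671B parameters, so ~671GB at one byte each in FP8):

```python
# Back-of-envelope: genome information content vs. LLM weight size.
# Haploid human genome: ~3.1 billion base pairs.
genome_bases = 3.1e9
genome_gb_4bit = genome_bases * 4 / 8 / 1e9  # 4 bits/base -> ~1.55 GB
genome_gb_2bit = genome_bases * 2 / 8 / 1e9  # 2 bits/base (A/C/G/T) -> ~0.78 GB

# DeepSeek v3: ~671 billion parameters, 1 byte each at FP8 precision.
params = 671e9
model_gb = params * 1 / 1e9

print(f"Genome: ~{genome_gb_2bit:.1f}-{genome_gb_4bit:.1f} GB")
print(f"Model:  ~{model_gb:.0f} GB")
print(f"Gap:    ~{model_gb / genome_gb_4bit:.0f}x at minimum")
```

Whichever encoding you pick, the model's weights carry two to three orders of magnitude more raw bytes than the genome, which is the sense in which the comparison is apples-to-oranges.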
If you'd like an unsolicited recommendation, 'A Brief History of Intelligence' by Max Bennett is a good, accessible book on this topic. It explicitly draws parallels between the brain's evolution and modern AI.