You may be underestimating the power of trillions of parameters in a model. With that many parameters, overfitting is inevitable. Overfitting here means the model ends up reproducing the noise and errors in its training data instead of capturing (and generalizing from) the underlying trends.
In fact, given this many parameters, poisoning should be relatively easy in general, and extremely easy on niche subjects.
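To make "reproducing noise instead of trends" concrete, here's a toy sketch (my own, not from any of the linked material): fit polynomials of low and high degree to ten noisy samples of a sine wave. The degree-9 fit passes through every noisy training point (near-zero training error) while missing the clean underlying curve. All the specific numbers (sample count, noise level, degrees) are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Ten noisy samples of a simple underlying trend (one period of a sine).
x_train = np.linspace(0, 1, 10)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.size)

# Held-out points from the clean trend, to measure generalization.
x_test = np.linspace(0, 1, 100)
y_test = np.sin(2 * np.pi * x_test)

def poly_errors(degree):
    # Least-squares polynomial fit; degree 9 on 10 points interpolates
    # the training data exactly, noise and all.
    model = np.poly1d(np.polyfit(x_train, y_train, degree))
    train_mse = float(np.mean((model(x_train) - y_train) ** 2))
    test_mse = float(np.mean((model(x_test) - y_test) ** 2))
    return train_mse, test_mse

for degree in (3, 9):
    tr, te = poly_errors(degree)
    print(f"degree {degree}: train MSE {tr:.4f}, test MSE {te:.4f}")
```

The high-degree model looks better by training error alone, which is exactly why training error is a misleading measure of whether a model has learned the trend.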
>With this many parameters overfitting is inevitable.
Nope. Go look up double descent: past the interpolation threshold, test error can start falling again as models keep growing, so overfitting turns out not to be the issue with very large models that classical intuition predicts.
Your video is from a political activist, not someone with any machine-learning background. Here's a better video about overfitting: https://youtu.be/qRHdQz_P_Lo
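For anyone who hasn't seen double descent: here's a toy, self-contained sketch (my own construction, not from either video) in one of the settings where it's commonly demonstrated — minimum-norm least-squares regression on random tanh features, sweeping the model width past the point where it can fit the training set exactly. The widths, seed, and noise level are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed training set from a noisy linear teacher; we sweep the model width.
n_train, n_test, d = 20, 500, 5
X_train = rng.normal(size=(n_train, d))
X_test = rng.normal(size=(n_test, d))
w_true = rng.normal(size=d)
y_train = X_train @ w_true + 0.2 * rng.normal(size=n_train)
y_test = X_test @ w_true  # clean targets, to measure generalization

def errors(width):
    # Random tanh features (seeded per width so each call is reproducible),
    # fit with the minimum-norm least-squares solution via pinv.
    W = np.random.default_rng(width).normal(size=(d, width)) / np.sqrt(d)
    phi_tr, phi_te = np.tanh(X_train @ W), np.tanh(X_test @ W)
    beta = np.linalg.pinv(phi_tr) @ y_train
    mse = lambda phi, y: float(np.mean((phi @ beta - y) ** 2))
    return mse(phi_tr, y_train), mse(phi_te, y_test)

# Under-parameterized, at the interpolation threshold, and far beyond it.
for width in (5, 20, 200):
    tr, te = errors(width)
    print(f"width {width:4d}: train MSE {tr:.4f}, test MSE {te:.4f}")
```

Training error hits zero once width reaches the number of training points, yet test error is typically worst right at that threshold and improves again as the model keeps growing — the "second descent."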