Sadly, if you want to go beyond an exercise and actually build LLMs now, money is what you really need, not time.
Nowadays training very powerful LLMs is easy because all the tooling, source code, training datasets, and teaching agents are available.
Getting access to tens of millions of USD or more is not easy, while for big players that is just a drop in their ocean.
Totally. While the LLMs today are amazing, it is a bit sad that you can't build SOTA models on your own (vs. a few years ago, when someone with the skills and access to a dataset could build a state-of-the-art model).
You seem to be talking about a production-grade model rather than building an LLM as an exercise? And if not, why do you disagree with the article's example of building a small LLM for $100?