> I think people who kept saying there is no moat in AI are about to be shocked at how strong a moat there actually is for ChatGPT.
Game on. The systemic risk to the AI build-out arrives when memory-management techniques borrowed from gaming, paired with training techniques that keep the compressed models usable, shrink runtime memory footprints from gigabytes to megabytes, much of which fits in L2 cache. When that happens, the data center will bleed back to the edges. Demand will shift to private, small, local AI that is consultative, trained online, and adapted to the user's common use cases. The asymptote is emergent symbolic reasoning, and symbolic reasoning is serial computation that fits on a single-core CPU. Game on, industry.
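To make the gigabytes-to-megabytes claim concrete, here is a back-of-envelope sketch of weight-memory footprints at different precisions. The parameter counts, bit widths, and cache sizes are illustrative assumptions, not measurements of any specific model or chip:

```python
def footprint_mib(params: int, bits_per_weight: float) -> float:
    """Memory needed to hold `params` weights at the given precision, in MiB."""
    return params * bits_per_weight / 8 / (1024 ** 2)

# Hypothetical sizes: a 7B-parameter model at fp16 vs. small, aggressively
# quantized models of the kind that could run at the edge.
print(f"{footprint_mib(7_000_000_000, 16):.0f} MiB")  # data-center territory
print(f"{footprint_mib(50_000_000, 4):.1f} MiB")      # phone/edge territory
print(f"{footprint_mib(2_000_000, 4):.2f} MiB")       # on the order of a large L2 cache
```

The first figure lands around 13 GiB, the second around 24 MiB, and the third just under 1 MiB, which is comparable to the 1 to 4 MiB L2 caches found on current desktop CPU cores.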