That's not what "moat" means. Claude Code has a castle, but a moat is what protects the castle from invaders: things like high switching costs, proprietary formats, or network effects, none of which are present here.
Other comments mention the "flywheel" of data and revenue feeding back into training, but there's a view that at some point the baseline open-weight models become "good enough" and the money dries up.
My guess is that the industry will consolidate. The winners will absorb the losers and focus on generating revenue.
If that happens, the gap between free, open-weight models and the proprietary SOTA models will only grow.