Conventional LLMs are moving fast too. The argument is that OpenClaw isn't any more useful than conventional LLMs, and I suspect that will always remain true, because the conventional LLMs will absorb any useful capabilities it develops.
I think OpenClaw provides something unique: a standardized host environment for a persistent assistant. That's different from the chat interfaces offered by Anthropic/OpenAI/others, which give you a "while you are here" assistant experience, and very different from the model-serving layer — trained LLM weights and tools like llama.cpp for serving them up. There really is something distinct here that I think will evolve over time.