Of course not. Users love the chatbot. It's fast and easier to use than manually searching for answers or stitching together reports and graphs.
There is no latency, because the inference is done locally, on a server at the customer's site with a big GPU.
> There is no latency
Every chat bot I was ever forced to use has had built-in latency, together with an animated "…" to simulate a real user typing. It's the worst of all worlds.