Models are not local most of the time, no, but all commands execute on "the mac mini", so I wouldn't exactly call it a prop. LLMs send and receive only text; they just say what to execute. They have no h̶a̶n̶d̶s̶ claws.
But that could just as easily run on an EC2 instance, or in Azure cloud? The only magic sauce is they've set up an environment where the AI can run tools? There's no actual privacy or security on offer.