The performance numbers are impressive, but I do not get the on-board AI spin. What is it used for?
If you’re working on something sensitive, you may not want to share it with OpenAI or Anthropic.
You can run open-source models like Kimi K2 or Qwen locally. Apple recently updated Xcode 26.3 to support local models.
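For anyone curious what "running locally" looks like in practice, here's a minimal sketch of talking to a locally hosted model over Ollama's HTTP API. The host, port, and model name are assumptions (Ollama's default is `localhost:11434`); any OpenAI-compatible local server works the same way, and nothing ever leaves the machine:

```python
import json
import urllib.request

def build_chat_request(model, prompt, host="http://localhost:11434"):
    """Build an Ollama-style chat request for a locally hosted model.
    The host is localhost, so the prompt never leaves the machine."""
    payload = {
        "model": model,  # assumed: a model you've pulled locally
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        f"{host}/api/chat",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("qwen2.5", "Summarize this confidential memo: ...")
print(req.full_url)  # → http://localhost:11434/api/chat

# To actually send it (requires a local Ollama install with the model pulled):
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["message"]["content"])
```

The actual network call is left commented out since it needs a running server, but the point stands: the "API provider" is your own laptop.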
Local LLMs. Lots of people buy Macs for their unified memory, which obviates the need to buy a much more expensive GPU to get the same amount of VRAM.
Image Playground
marketing.
Private AI assistants will be a big thing. You don't want to send all the private data they have access to off to a cloud AI API provider. You shouldn't, anyway.