Hacker News

rickdeckard · yesterday at 3:45 PM · 9 replies

It might as well be a visualization of the two strategies:

- Everyone else: "We mainly build huge AI compute clusters to process large amounts of data and create value, at high cost for ramp-up and operation."

- Apple: "We mainly build small closed-down AI compute-chips we can control, sell them for-profit to individual consumers and then orchestrate data-processing on those chips, with setup and operational cost all paid by the consumer."

I can't think of any other company with comparable know-how and, most of all, comparable device sales volume that could even consider Apple's strategy.

No matter what they do, they will sell hundreds of millions of compute devices for the foreseeable future. They use this to build out AI infrastructure they control, pre-paid by the future consumers.

THIS is their unique strength.


Replies

pzo · yesterday at 4:16 PM

> We mainly build small closed-down AI compute-chips we can control, sell them for-profit to individual consumers and then orchestrate data-processing on those chips, with setup and operational cost all paid by the consumer

I wish they did, but they don't. They have been stingy on RAM for the iPhone and iPad for a decade. At this point only a small percentage of their userbase has an iPhone or iPad with 8 GB of RAM, which is barely enough to run any AI model at all, even open-source ones, and have it be of any use. Not to mention those don't compare to the big models.

They don't even offer the option to buy an iPhone with more RAM. The iPad maxes out at 16 GB of RAM, and even the mainstream MacBook Air tops out at 32 GB.

And given the current price of cheap online AI, where e.g. Perplexity hands out promos for its Pro version for less than $10 per year and all the AI providers offer good free models with rate limits generous enough for many users, I don't see Apple hardware being bought specifically for its AI compute chips, at least not by non-pro users.

If they lose on AI though, and because of that don't have good AI integrations, they will eventually lose on hardware too. E.g. Siri still doesn't support Polish, so my mum can't use it. OSS Whisper v3 Turbo has been available for ages, but Apple still supports only a few languages. Third-party keyboards can't integrate that well with audio input, and the whole experience suffers here because of platform limitations.

jjfoooo4 · yesterday at 6:36 PM

The existential hope that all the other players have is that AI will drive adoption of a form factor that replaces the phone. Because if in 5 years the dominant device is still the phone, Apple wins.

Consumer hardware chips will be plenty powerful to run “good enough” models.

If I’m an application dev, do I want to develop something on top of OpenAI, or Apple’s on-device model that I can use as much as I want for free? On-device is the future.

crazygringo · today at 2:10 AM

> They use this to build out AI infrastructure they control, pre-paid by the future consumers.

I'm not following. What infrastructure? Pre-paid how?

Apple pays for materials and chips before it sells the finished product to consumers. Nothing is pre-paid.

And what infrastructure? The inference chips on iPhones aren't part of any Apple AI infrastructure. Apple's not using them as distributed computing for LLM training or anything, or for relaying web queries to a complete stranger's device -- nor would they.

SoftTalker · yesterday at 7:14 PM

Yes, as I said in another thread a few days ago: Apple's strength is in making personal computing endpoint devices for consumers. That's what's in their DNA. They have not done well at anything else.

amelius · yesterday at 7:52 PM

Like they say: "In a goldrush, sell vendor locked shovels."

makeitdouble · yesterday at 10:56 PM

> I can't think of any other company with comparable know-how and, most of all, comparable device sales volume that could even consider Apple's strategy.

I'm not sure where you'd position Samsung, Xiaomi, Oppo, etc. They're competitive on price, with chipsets that can handle AI loads in the same ballpark, as attested by Google's features running on them.

They're not vertically integrated and don't have the same business structure, but does that matter for on-device AI?

toomuchtodo · yesterday at 4:06 PM

Sometimes doing nothing is the winning move.

wiesbadener · today at 1:19 AM

I recently tried to figure out what their offerings currently are. I've been hoping for `efficient but performant AI compute-chips` from Apple ever since they kicked out Nvidia in 2015 (for the ML Models / Exploration parts below). It will be interesting to see how good their products feel in this fast-paced environment and how much legroom (RAM + compute) will be left for non-platform offerings.

To my understanding, they market their ML stack as four layers [1]:

- Platform Intelligence: ready-made OS features (e.g., Writing Tools, Genmoji, Image Playground) that apps can adopt with minimal customization.

- ML-powered APIs: higher-level frameworks for common tasks—on-device Foundation Models (LLM), plus Vision, Natural Language, Translation, Sound Analysis, and Speech; with optional customization via Create ML.

- ML Models (Core ML): ship your own models on-device in Core ML format; convert/optimize from PyTorch/TF via coremltools, and run efficiently across CPU/GPU/Neural Engine (optionally paired with Metal/Accelerate for more control).

- Exploration/Training: Metal-backed PyTorch/JAX for experimentation, plus Apple’s MLX for training/fine-tuning on Apple Silicon using unified memory, with multi-language bindings and models commonly sourced from Hugging Face.

[1] https://developer.apple.com/videos/play/wwdc2025/360/
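
For the ML Models (Core ML) layer, the usual workflow is to trace a PyTorch model and convert it with coremltools, then ship the resulting package in an app. A minimal sketch, assuming a toy model and made-up tensor shapes purely for illustration:

```python
import torch
import coremltools as ct

# Toy PyTorch model, purely illustrative; any traceable model follows the same path.
model = torch.nn.Sequential(torch.nn.Linear(4, 2), torch.nn.ReLU())
model.eval()

example_input = torch.rand(1, 4)
traced = torch.jit.trace(model, example_input)

# Convert to an ML Program package; ComputeUnit.ALL lets Core ML schedule
# inference across CPU, GPU, and the Neural Engine on-device.
mlmodel = ct.convert(
    traced,
    convert_to="mlprogram",
    inputs=[ct.TensorType(name="input", shape=example_input.shape)],
    compute_units=ct.ComputeUnit.ALL,
)
mlmodel.save("TinyModel.mlpackage")
```

The resulting .mlpackage can then be added to an Xcode project, with Core ML deciding at runtime how to split the work across the available compute units.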

hopelite · yesterday at 8:11 PM

I agree that this is a reasonable perspective, but from my cursory understanding of the “shakeup” at Apple, I am not sure it is seen that way by the Board and Cook.

show 1 reply