I would bet significant money that, within two years, it will become Generally Obvious that Apple has the best consumer AI story among any tech company.
I can explain my reasoning in more depth, but the most critical point is this: Apple builds the only platform where developers can ship a single distributable that works on mobile and desktop with standardized, easy access to a local LLM, and a quarter of a billion people buy into this platform every year. The degree to which no one else on the planet is even close cannot be overstated.
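For concreteness, here's roughly what that standardized access looks like, sketched against Apple's Foundation Models framework (announced at WWDC 2025). The names (`SystemLanguageModel`, `LanguageModelSession`, `respond(to:)`) are from my reading of the framework docs; treat the exact signatures as assumptions, not gospel:

```swift
import FoundationModels

// Sketch: query the on-device model. The same code compiles for
// iOS and macOS; no server round trip is involved.
func summarize(_ prompt: String) async throws -> String? {
    // Apple Intelligence must be enabled for the local model to exist.
    guard SystemLanguageModel.default.availability == .available else {
        return nil
    }
    let session = LanguageModelSession(
        instructions: "You are a concise assistant."
    )
    let response = try await session.respond(to: prompt)
    return response.content
}
```

The point is less this particular API and more that the model ships in the OS itself, so a developer gets it on every current device without bundling weights or paying per-token server costs.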
What people seem to have forgotten is that the companies that previously tried to monetize data-center-based voice assistants lost massive amounts of money.
> Amazon Alexa is a “colossal failure,” on pace to lose $10 billion this year... “Alexa was getting a billion interactions a week, but most of those conversations were trivial commands to play music or ask about the weather.” Those questions aren’t monetizable.
Google expressed basically identical problems with the Google Assistant business model last month. There’s an inability to monetize the simple voice commands most consumers actually want to make, and all of Google’s attempts to monetize assistants with display ads and company partnerships haven’t worked. With the product sucking up server time and being a big money loser, Google responded just like Amazon by cutting resources to the division.
https://arstechnica.com/gadgets/2022/11/amazon-alexa-is-a-co...
Moving to much more resource-intensive models is only going to jack up those datacenter costs.
As a sibling poster has said, I don't know how much on-device AI is going to matter.
I have pretty strong views on privacy, and I've generally thrown them all out in light of using AIs, because the value I get out of them is just so huge.
If Apple had actually executed on their strategy (running models in privacy-friendly sandboxes), I feel they would've hit it out of the park. But as it stands, these are all bleeding-edge technologies and you have to have your best and brightest on them. And even with seemingly infinite money, Apple doesn't seem to have delivered yet.
I hope the "yet" is important here. But judging by the executive departures (especially the rumors that Johny Srouji may leave), the red flag is that they're bleeding talent, not that they lack money.
I don't think the throughput of a general-purpose device will make for a competitive offering, so being local is a joke. All the fun stuff is running on servers at the moment.
From there, AI integration is enough of a different paradigm that the existing Apple ecosystem is not a meaningful advantage.
Best case, Apple is among the fast followers of whoever is actually innovating, but I don't see anything interesting coming from Apple or Apple devs anytime soon.
> I would bet significant money that,
You can do that right now, on the stock market. Sometimes it's good to put your money where your mouth is; it forces you to correct your worldview.
I don't think so.
Consumers don't care about whether an LLM is local, and one that runs on your phone is always going to be vastly worse than ChatGPT.
I see zero indication that Apple is going to replace people going to chatgpt.com or using its app.
All I see Apple doing is eventually building a better new generation of Siri, not much different from Google/Alexa.
How much, at what odds, who will decide if they do, and who will hold the money?
I'd love to see a strong on-device multi-modal Siri + flexibility with Shortcuts. Beyond the "best consumer AI story", they could also create a strong offering for SMBs with FileMaker + strong foundation-model support baked in. Actually rooting for both!
Local AI sounds nice, but most of Apple's PCs and other devices don't come with enough RAM, at a decent price, for good model performance, and macOS itself is incredibly bloated.
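The back-of-envelope math behind the RAM complaint is simple. A rough sketch, counting model weights only (the KV cache and the OS footprint come on top):

```swift
// Approximate memory needed for model weights alone,
// given parameter count and quantization level.
func weightGiB(paramsBillion: Double, bitsPerWeight: Double) -> Double {
    paramsBillion * 1e9 * bitsPerWeight / 8 / 1_073_741_824
}

// A 7B-parameter model: ~13 GiB at fp16, ~3.3 GiB quantized to
// 4 bits. The 4-bit version fits on an 8 GB Mac, but with little
// headroom left for the system and other apps.
print(weightGiB(paramsBillion: 7, bitsPerWeight: 16)) // ≈ 13.0
print(weightGiB(paramsBillion: 7, bitsPerWeight: 4))  // ≈ 3.26
```

This is why on-device models tend to sit in the 1–3B-parameter range rather than the 70B+ sizes served from datacenters.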
I'd have a lot more respect for Apple's "cautious" approach to AI if they didn't keep promising and then failing to deliver Siri upgrades (while still calling out to cloud backends, despite all the talk about local LLMs), or if they hadn't shipped the absolute trash that is notification summaries.
I think at this point it's pretty clear that their AI products aren't bad because of some clever strategy; they're bad because Apple is bad at this. I agree that their platform puts them in a good place to provide a local-LLM experience to developers, but I remain skeptical that they'll be able to execute on it.
Make a Polymarket bet; you will lose. Siri has been horrible for a decade, and there's no way they fix that in two years.
I don't know. I feel like Apple shot themselves in the foot by selling 8 GB consumer laptops until around 2024 while touting their AI-inference hardware, and their phones and iPads usually had even less RAM.
On the other hand, every dev having had to optimize for low RAM will help free it up for AI on newer devices with more.
Local LLMs will never be better than cloud LLMs. They can close the gap if/when cloud LLM progress stalls.
Let's not mistake Apple's failure at cutting-edge transformer models for good strategy.