
scriptsmith yesterday at 9:22 PM (3 replies)

I've got some demos of what Chrome's new Prompt API, which uses a local model, can do: https://adsm.dev/posts/prompt-api/#what-could-you-build-with...

As OP says, it shines in constrained environments where the model is transforming user-owned data. Definitely less useful for anything more open-ended.
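For anyone who hasn't tried it, the constrained-transformation use case looks roughly like this. A minimal sketch assuming the API surface from Chrome's Prompt API explainer (`LanguageModel.create`, `session.prompt`); the names have shifted between Chrome versions, so feature-detect before calling:

```javascript
// Hedged sketch of Chrome's Prompt API for transforming user-owned data.
// Assumes the `LanguageModel` global from the explainer; check availability first.
async function summarizeLocally(text) {
  if (!('LanguageModel' in globalThis)) {
    throw new Error('Prompt API not available in this browser');
  }
  const session = await LanguageModel.create({
    initialPrompts: [
      { role: 'system', content: 'Summarize the user text in one sentence.' },
    ],
  });
  const summary = await session.prompt(text);
  session.destroy(); // free the on-device model session
  return summary;
}
```

Everything stays on-device: the text never leaves the browser, which is the whole appeal for user-owned data.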


Replies

2ndorderthought yesterday at 9:26 PM

Yeah, I don't recommend treating Chrome's Prompt API as a good example of local LLMs. It's fine and all, but it's really weak: 8B models from a year ago beat it in some ways, and a lot of the recent model releases are meaningfully better.

robot-wrangler yesterday at 11:17 PM

> I've got some demos of what the new Prompt API can do:
> Use surrounding context to rewrite your ad copy:

Yup, that's the plan. No local model for you, no webpage; just more, better, and cheaper adtech extortion/surveillance for vendors, while everyone else pays for the juice and the hardware degradation.

dakolli yesterday at 9:46 PM

So you're running an LLM, on a 1,000-watt power supply, to do data transformation that deterministic processes would be much better suited for. Wild.
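To make the comparison concrete, here's the kind of transformation in question done deterministically: a pure function, no model, no GPU. The task (normalizing phone numbers) is a hypothetical stand-in for the demos, not taken from them:

```javascript
// Deterministic data transformation: zero watts of inference, same result every time.
function normalizePhone(raw) {
  const digits = raw.replace(/\D/g, ''); // strip everything but digits
  if (digits.length !== 10) return null; // reject anything that isn't 10 digits
  return `(${digits.slice(0, 3)}) ${digits.slice(3, 6)}-${digits.slice(6)}`;
}

normalizePhone('555.867.5309'); // → '(555) 867-5309'
```

The counterpoint, of course, is that an LLM handles inputs you didn't anticipate a regex for; that trade-off is the crux of the thread.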