Hacker News

dvt · today at 2:38 AM · 7 replies

So weird/cool/interesting/cyberpunk that we have stuff like this in the year of our Lord 2026:

   ├── MEMORY.md            # Long-term knowledge (auto-loaded each session)
   ├── HEARTBEAT.md         # Autonomous task queue
   ├── SOUL.md              # Personality and behavioral guidance
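
Roughly, I'd guess the "auto-loaded each session" bit works something like the sketch below. The file names come from the tree above, but the loading order, prompt format, and function are my own invention, not the project's actual code:

    use std::fs;
    use std::path::Path;

    /// Hypothetical sketch: read the workspace markdown files at session
    /// start and prepend them to the system prompt. Not localgpt's real code.
    fn build_system_prompt(workspace: &Path) -> String {
        let mut prompt = String::new();
        // Order is an assumption: personality first, then memory, then tasks.
        for name in ["SOUL.md", "MEMORY.md", "HEARTBEAT.md"] {
            if let Ok(contents) = fs::read_to_string(workspace.join(name)) {
                prompt.push_str(&format!("## {name}\n{contents}\n\n"));
            }
        }
        prompt
    }

    fn main() {
        println!("{}", build_system_prompt(Path::new(".")));
    }
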
Say what you will, but AI really does feel like living in the future. As far as the project is concerned, pretty neat, but I'm not really sure about calling it "local-first" as it's still reliant on an `ANTHROPIC_API_KEY`.

I do think that local-first will end up being the future long-term, though. I built something similar last year (unreleased), also in Rust, except it actually ran the model locally (you can see how slow/fast it is here [1]; keep in mind I have a 3080 Ti and was running Mistral-Instruct).

I need to revisit this project and release it, but building in the context of the OS is pretty mind-blowing, so kudos to you. I think that the paradigm of how we interact with our devices will fundamentally shift in the next 5-10 years.

[1] https://www.youtube.com/watch?v=tRrKQl0kzvQ


Replies

backscratches · today at 8:08 AM

Yes, this is not local-first; the name is misleading.

halJordan · today at 3:12 AM

You absolutely do not have to use a third-party LLM. You can point it at any OpenAI/Anthropic-compatible endpoint. It can even be on localhost.
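
For example, Ollama serves an OpenAI-compatible API on localhost by default, so the request side is just this (a sketch, assuming reqwest with the `blocking` and `json` features, and whatever model you have pulled locally):

    use serde_json::json;

    fn main() -> Result<(), Box<dyn std::error::Error>> {
        // Ollama's OpenAI-compatible route; llama.cpp's server and vLLM
        // expose the same path on their own ports.
        let url = "http://localhost:11434/v1/chat/completions";
        let body = json!({
            "model": "mistral", // whatever model you've pulled
            "messages": [{ "role": "user", "content": "Say hello." }]
        });
        let resp: serde_json::Value = reqwest::blocking::Client::new()
            .post(url)
            // A local server ignores the key, but some clients insist
            // on a non-empty one.
            .bearer_auth("unused")
            .json(&body)
            .send()?
            .json()?;
        println!("{}", resp["choices"][0]["message"]["content"]);
        Ok(())
    }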

__mharrison__ · today at 7:19 AM

I'm playing with local-first openclaw and qwen3 coder next running on my LAN. Just starting out, but it looks promising.

atmanactive · today at 3:10 AM

> but I'm not really sure about calling it "local-first" as it's still reliant on an `ANTHROPIC_API_KEY`.

See here:

https://github.com/localgpt-app/localgpt/blob/main/src%2Fage...

fy20 · today at 5:49 AM

> Say what you will, but AI really does feel like living in the future.

Love it or hate it, the amount of money being put into AI really is our generation's equivalent of the Apollo program. Over the next few years, more than 100 gigawatt-scale data centres are planned to come online.

At least it's a better use for the money than the military industry.

jazzyjackson · today at 6:28 AM

IMHO it doesn't make sense, financially and resource-wise, to run local, given the five-figure upfront cost to get an LLM running slower than what I can get for 20 USD/month.

If I'm running a business and have some number of employees to make use of it, and confidentiality is worth something, sure. But am I really going to rely on anything less than the frontier models for automating critical tasks? Or roll my own on-prem IT to support it when Amazon Bedrock will do it for me?
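
Back-of-envelope, taking the low end of "five figures":

    $10,000 upfront / $20 per month = 500 months ≈ 41 years

to break even against the subscription, and that's before electricity and the hardware going stale.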

IhateAI · today at 4:57 AM

[dead]