Hacker News

ac29 · 08/08/2025 · 1 reply

> OP stated quite clearly their goal was to run models locally.

Fair, but at the point where you trust Amazon to host your "local" LLM, it's not a huge reach to just use Amazon Bedrock or something.


Replies

motorest · 08/09/2025

> Fair, but at the point where you trust Amazon to host your "local" LLM, it's not a huge reach to just use Amazon Bedrock or something.

I don't think you even bothered to look at Amazon Bedrock's pricing before making that suggestion. Bedrock charges users per input and output token, so a single chat session running through 100k tokens can cost $200. That alone is a third of OP's total infrastructure costs.
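To see why a single session's token count balloons under per-token billing, note that chat APIs typically re-send the full conversation history as input on every turn. A minimal sketch of that arithmetic, using hypothetical placeholder rates (not Bedrock's actual price sheet):

```python
# Per-token billing sketch for a hosted chat API. Each turn re-sends the
# full conversation history as input, so billed input tokens grow roughly
# quadratically with the number of turns. The rates below are hypothetical
# placeholders; check the provider's pricing page for real numbers.

IN_RATE = 0.003 / 1000   # assumed $ per input token
OUT_RATE = 0.015 / 1000  # assumed $ per output token

def session_cost(turns: int, tokens_per_turn: int) -> float:
    """Total cost of a chat where every turn re-sends prior history."""
    total = 0.0
    history = 0
    for _ in range(turns):
        history += tokens_per_turn           # new user prompt joins context
        total += history * IN_RATE           # billed for the whole prompt
        total += tokens_per_turn * OUT_RATE  # plus the model's reply
        history += tokens_per_turn           # reply joins the history too
    return total

# A 50-turn session at ~1k tokens per message: 100k tokens of "conversation"
# becomes 2.5M billed input tokens once history re-sending is counted.
print(round(session_cost(turns=50, tokens_per_turn=1000), 2))
```

The exact dollar figure depends entirely on the model's rates, but the quadratic growth in billed input tokens is why long sessions on metered APIs cost far more than their raw transcript length suggests.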

If you want to discuss options in terms of cost, the very least you should do is look at pricing.