
segmondy | last Thursday at 3:50 AM

This is about more. I can run 600B+ models at home. Today, during a discussion with my wife, we asked ChatGPT a quick question; it refused, saying it couldn't generate results based on race. I tried to prompt my way around the refusal and it absolutely would not budge. My local model, the latest Mistral-Large3-675B, gave me the answer I was looking for. What's the cost of that?


Replies

nicman23 | last Thursday at 7:06 AM

about the cost of your hardware lol