Hacker News

gbnwl · today at 5:03 AM

I didn't express this well, but my interest isn't in who holds the top spot; it's the _why_ and _how_ of the results the various labs get. This is magnified by the fact that I'm interested not only in hosted inference providers but in local models as well. What's your take on the best model to run locally for coding on 24GB of VRAM, given the last few weeks of releases? Which harness do you prefer? Which quants do you think are best? To extend your sports metaphor, it's more than following the national leagues; it's following college and even high school leagues too. And the real interest isn't even who's doing well, but WHY, at each level.
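(For context on the 24GB constraint: a rough way to reason about which quants fit is weights-dominate-memory arithmetic. The sketch below is a simplified estimator under assumed numbers; the helper name, the 4.5 bits/weight figure for a typical 4-bit GGUF quant, and the fixed overhead constant are all illustrative, not measured values. Real usage depends on context length, KV cache, and the inference backend.)

```python
# Back-of-envelope VRAM estimator for local LLM inference.
# Assumption: weight storage dominates; KV cache and runtime overhead
# are lumped into a fixed fudge factor, which real workloads will vary.

def est_vram_gb(params_b: float, bits_per_weight: float, overhead_gb: float = 2.0) -> float:
    """Estimate VRAM in GB for a model with params_b billion parameters
    stored at bits_per_weight bits each, plus a flat overhead allowance."""
    weights_gb = params_b * bits_per_weight / 8  # bits -> bytes -> GB (decimal)
    return weights_gb + overhead_gb

# A ~32B model at ~4.5 bits/weight (roughly a 4-bit GGUF quant)
# vs. a 24 GB card:
print(est_vram_gb(32, 4.5))  # 20.0 -> fits, with headroom for context
print(est_vram_gb(70, 4.5))  # 41.4 -> does not fit without offloading
```

By this arithmetic, dense models in the ~30B range at 4-bit quants are about the ceiling for a single 24GB card, which is why quant choice matters so much at that size.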


Replies

yorwba · today at 8:03 AM

The technical report discussing the why and how is here: https://huggingface.co/deepseek-ai/DeepSeek-V4-Pro/blob/main...

renticulous · today at 6:08 AM

Follow the AI newsletters. They bundle the news with op-ed commentary and summarize it better.
