
saghm · 10/11/2024

The readme seems to indicate that it expects PyTorch alongside several other Python dependencies in a requirements.txt file (which is the only place I can find any form of the word "dependency" on the page). I'm very confused by the characterization in the title here, given that it doesn't seem to be claimed at all by the project itself (which simply has the subtitle "Minimal LLM inference in Rust").

From the git history, it looks like the username of the person who posted this here is someone who has contributed to the project but isn't the primary author. If they could elaborate on what exactly they mean by saying this has "zero dependencies", that might be helpful.


Replies

littlestymaar · 10/11/2024

> The readme seems to indicate that it expects pytorch alongside several other Python dependencies in a requirements.txt file

That's only needed if you want to convert the model yourself; you don't need it if you use the already-converted weights from the author's Hugging Face page (linked in the "prepared-models" table of the README).

> From the git history, it looks like the username of the person who posted this here is someone who has contributed to the project but isn't the primary author.

Yup, that's correct; so far I've only authored the Dioxus GUI app.

> If they could elaborate on what exactly they mean by saying this has "zero dependencies", that might be helpful.

See my other response: https://news.ycombinator.com/item?id=41812665
