
bashbjorn · 01/22/2025

I'm working on a plugin[1] that runs local LLMs inside the Godot game engine. The sweet spot for model size seems to be around 2B-7B, since those run fast enough on most computers. We recommend trying it out with Gemma 2 2B, but it will work with any model that llama.cpp supports.

At those sizes, it's great for generating non-repetitive flavor text for NPCs. No more "I took an arrow to the knee".

Models around the 2B size aren't really capable enough to act as a competent adversary, but they're great for something like bargaining with a shopkeeper, or some other role where natural language lets players do a bit more immersive roleplay.
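
To give a rough idea, here's what a shopkeeper setup could look like in GDScript. This is only a sketch, not the plugin's documented API; the node and member names used here (NobodyWhoChat, system_prompt, say(), response_finished) are assumptions, so check the repo's README for the actual interface.

    extends Node
    # Rough sketch of wiring a shopkeeper NPC to a local model via the plugin.
    # Node/member names below are assumptions, not the confirmed API.

    @onready var chat := $NobodyWhoChat  # child node pointed at a llama.cpp-compatible model

    func _ready() -> void:
        chat.system_prompt = "You are a gruff shopkeeper. Haggle, but never sell below half the listed price."
        chat.response_finished.connect(_on_npc_reply)

    func _on_player_says(text: String) -> void:
        chat.say(text)  # hand the player's line to the local LLM

    func _on_npc_reply(reply: String) -> void:
        print(reply)  # swap in your dialogue UI here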

[1] https://github.com/nobodywho-ooo/nobodywho


Replies

Tepix · 01/22/2025

Cool. Are you aware of good games that use LLMs like this?