For me, practical knowledge comes from trying to figure things out. The more polished and "ELI5" the material is, the less I retain. I've played with quite a few LLM tools that promised to help me "understand anything", but I don't think they help with intuition all that much. For what it's worth, it's not an LLM-specific problem. I like YouTube content like 3Blue1Brown, but I don't think that I retained anything useful from any of it.
I don't question that LLMs are useful for answering questions about codebases, but this is closer to "turn a codebase into a curriculum", and... does that actually work?
I agree with the sentiment in many of these comments. Understanding something is work, and that work can't be offloaded to others, or even to LLMs.
Are those 9.7k real users? Maybe I am too old-fashioned, but whenever I tried tools like this, long before AI, they didn't actually help much. It was much easier to read the codebase and find the connections I needed on my own.
It reminds me of Nx graphs, which are helpful for finding circular dependencies, but beyond that don't provide much value, since I can see the same kind of structure just by looking at the codebase.
Am I doing something wrong with these tools?
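For what it's worth, the circular-dependency check mentioned above is the one part of these graph tools that's hard to do by eye. A minimal sketch of how such a check might work (a depth-first search for a back edge), with made-up module names:

```python
def find_cycle(graph):
    """Return one dependency cycle as a list of nodes, or None if acyclic.

    `graph` maps each module to the modules it depends on.
    """
    WHITE, GRAY, BLACK = 0, 1, 2  # unvisited / on current path / done
    color = {node: WHITE for node in graph}
    stack = []  # current DFS path

    def dfs(node):
        color[node] = GRAY
        stack.append(node)
        for dep in graph.get(node, ()):
            if color.get(dep, WHITE) == GRAY:
                # Back edge: `dep` is already on the current path -> cycle.
                return stack[stack.index(dep):] + [dep]
            if color.get(dep, WHITE) == WHITE:
                found = dfs(dep)
                if found:
                    return found
        stack.pop()
        color[node] = BLACK
        return None

    for node in list(graph):
        if color[node] == WHITE:
            found = dfs(node)
            if found:
                return found
    return None

# Hypothetical modules: api -> auth -> db -> api closes a loop.
deps = {"api": ["auth", "db"], "auth": ["db"], "db": ["api"]}
print(find_cycle(deps))  # -> ['api', 'auth', 'db', 'api']
```

That's roughly all the "graph" view buys you here; the rest of the structure you can usually get by reading the imports.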
The phrase going around the interwebs is "You can outsource your thinking but not your understanding." At times it can feel like a weird human<>LLM endless loop: depending on what you think you understand and what the LLM "thinks" will help you understand, it can seem like the LLM also understands. But it does not.
It's clear one can't really think about anything without first building a basic understanding of it. Worth stating that both are distinct from learning. But I would argue it's important to know what you *have* to understand now, and why that matters. An LLM can help you understand a great many things; you just need to know what you are looking for, and that is something no artificial intelligence can really *do* for you. Trial and error, building self-awareness, and talking to people are better ways to figure that out, especially for fairly open-ended problems.
Did anyone actually use this on a complex codebase and come away with any kind of intuition for it?
Like, having looked at the demo, it feels less intuitive and more complex than going through the codebase myself with tmux + codex and reading it directly. For you to understand a codebase, it should be easier to interact with it; this seems to introduce way too many steps between you and the code.
Is this like Obsidian's graph view? Looks pretty and makes cool screenshots, but has no actual value and is cumbersome to use? (Btw, this isn't meant to be a mean comment, just a question after looking at the output.)
What evidence is there that this makes any difference at all? There are a gazillion (and one) codebase understanding solutions using knowledge graphs. How do I know if it's any good compared to just using Codex or Claude Code?
Interesting approach. I built something similar, https://github.com/nilbuild/diffity, to understand unknown codebases. The difference is that it gives you an interactive walk-through with mermaid diagrams, guiding you through the feature or the part of the codebase you're looking at.
A big-ass graph with hundreds of spaghetti nodes is the kind of learning I try to avoid. It's better to just ask directly: "Where do I start?", "Teach me about...". This is over-engineered education.
Provocative title, and then seeing the 8+ dotfolders in the repo really made this seem like some kind of obscure satire at first.
I'm exhausted by these shiny vibe coded projects that overpromise and underdeliver.
Knowledge comes from doing the hard work, not from being spoon-fed information. All these fancy graphs represent a tentative mental model produced as a result of research and learning. Everyone's model is different, shaped by their own experience and focus, so trying to present one as the definitive map will more than likely not be conducive to understanding at all. Besides, it will almost certainly miss important details or hallucinate them.
HN users: stop upvoting and promoting this garbage. HN mods: please give us tools to label and filter this content.
[flagged]
I was talking to a teacher and she was explaining how everyone is reaching for AI to have everything explained to them. "I'm too dumb to understand things," is the basic assumption people are now growing up with, reaching for AI summaries all the time without trying to understand anything themselves.
Instead of trying to understand things, people are reaching for better tools to have the thinking done for them. We are losing something huge.