Currently you can't run us embedded and I'm not sure how you could sidestep the DSL :/
We're working on putting our grammar into llama.cpp so that it only outputs grammatically correct HQL. But even without that, it shouldn't be hard or expensive to do.
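The idea behind wiring a grammar into the decoder: at each step, filter the model's candidate tokens down to those that keep the partial output a valid prefix of the grammar, then pick the best remaining one. llama.cpp does this with GBNF grammars; here's a minimal sketch of the mechanism with a fake "model" (a fixed score table) and a toy regex grammar standing in for real HQL — the query shape, field names, and scores are all made up for illustration.

```python
import re

# Toy stand-in "grammar": queries of the form  COUNT WHERE <field> = <number>
# (hypothetical, NOT the real HQL grammar). fullmatch on this pattern accepts
# any valid *prefix* of such a query, which is what constrained decoding needs.
PREFIX_RE = re.compile(r"COUNT( WHERE( [a-z]\w*( =( \d+)?)?)?)?$")

def is_valid_prefix(s: str) -> bool:
    return PREFIX_RE.fullmatch(s) is not None

def constrained_decode(vocab_scores, max_steps=4):
    out = "COUNT"
    for _ in range(max_steps):
        # Rank candidate continuations by model score, keep only grammar-legal ones.
        for tok, _score in sorted(vocab_scores, key=lambda t: -t[1]):
            if is_valid_prefix(out + tok):
                out += tok
                break
        else:
            break  # no legal continuation left: stop
    return out

# Fake "model" that would prefer an ungrammatical token first;
# the grammar filter never lets " banana!!" through.
scores = [(" banana!!", 0.9), (" WHERE", 0.8), (" status", 0.7),
          (" =", 0.6), (" 200", 0.5)]
print(constrained_decode(scores))  # -> COUNT WHERE status = 200
```

A real implementation works on the tokenizer's vocabulary and masks logits before sampling rather than string-checking each token, but the invariant is the same: the output can never leave the grammar.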
I wrote a Claude wrapper that had our docs in its context window; it did a good job of writing queries most of the time.