
janalsncm · today at 5:41 PM

I noticed that this model is multilingual and understands 14 languages. For many use cases we probably only need a single language, and the other 13 just add latency. I expect a trend in the coming years of trimming the fat off these jack-of-all-trades models.

https://aclanthology.org/2025.findings-acl.87/


Replies

depr · today at 8:11 PM

STT services that have been around longer, like Azure, Google, and Amazon, generally require you to specify a language, and their quality is a lot higher than that of models that advertise themselves as LLMs (though I believe the clouds are now using the same kinds of models too).

decide1000 · today at 5:51 PM

I think this model proves it's very efficient and accurate.

raincole · today at 8:11 PM

Imagine if ChatGPT had started like this and decided to trim coding abilities from its language model because most people don't code.

popalchemist · today at 6:27 PM

It doesn't make sense to have a language-restricted transcription model, because of code-switching. People aren't machines; we don't stick to our native languages without fail. Even monolingual people slip in and out of their native language when using "borrowed" words and phrases. A single-language model will often fail on that.

keeganpoppen · today at 6:51 PM

uhhh i cast doubt on multi-language support affecting latency. model size, maybe, but what's the mechanism for making latency worse? i think of model latency as O(log(model size))… but i'm open to being wrong / that being a bad mental model / educated guess.
