Hacker News

elAhmo yesterday at 10:52 PM

I don’t know a single person using Mistral models.


Replies

consumer451 yesterday at 11:13 PM

Isn't their latest speech-to-text model SOTA? When I tested it on jargon, it was amazing.

https://news.ycombinator.com/item?id=46886735

Adrig yesterday at 11:31 PM

I used Ministral for data cleaning.

I was surprised: even though it was the cheapest option (compared against other small models, e.g. from Anthropic), it performed the best in my benchmarks.

badsectoracula today at 12:25 AM

Pretty much all of my LLM usage has been with Mistral's open source models running on my PC. I don't do full agentic coding: when I tried it with Devstral Small 2 it was a bit too slow (though if I could get 2-3 times my PC's speed from a second computer it would be a different story, and AFAIK that is doable if I were willing to spend $2-3k on it). However, I've used Mistral's models for spelling and grammar checks[0], translations[1][2], summaries[3], and trying to figure out whether common email spam-avoidance tricks are pointless in the LLM age :-P [4]. FWIW, the tool you can see in those shots is a Tcl/Tk script calling a llama.cpp-based command-line utility I threw together some time ago while experimenting with llama.cpp.

I've also used Devstral Small to make a simple raytracer[5][6] (built via the "classic" chat interface by copy/pasting code, not any agentic approach, and I did fix bits of it in the process) and a quick-and-dirty "games database" in Python+Flask+SQLite for my own use (mainly a game backlog DB :-P).

I also use it to write various small snippets and generate boilerplate (e.g. I have an enum in C and want a function that prints a name for each enum value, or I want to match a string read from a JSON file to the appropriate enum value), "translate" between languages (I recently had it convert some matrix code I had written in Pascal into C), etc.

[0] https://i.imgur.com/f4OrNI5.png

[1] https://i.imgur.com/Zac3P4t.png

[2] https://i.imgur.com/jPYYKCd.png

[3] https://i.imgur.com/WZGfCdq.png

[4] https://i.imgur.com/ytYkyQW.png

[5] https://i.imgur.com/FevOm0o.png (screenshot)

[6] https://app.filen.io/#/d/e05ae468-6741-453c-a18d-e83dcc3de92... (C code)

[7] https://i.imgur.com/BzK8JtT.png

ainch today at 1:13 AM

That's likely because they're chasing enterprise - see the deals with HSBC, ASML, AXA, BNP Paribas, etc. Given swelling anti-US sentiment and their status as a French 'national champion', Mistral are probably in a strong position for now, regardless of model performance, research quality, or consumer uptake.

brainless today at 1:46 AM

I'm building a knowledge graph over personal data (emails, files) with Ministral 3:3b. I've tried Qwen 3.5:4b as well, but mostly use Ministral.

Works really well. It extracts companies you've dealt with, people, topics, events, locations, financial transactions, bills, etc.

pelagicAustral yesterday at 10:57 PM

Me neither; they're not ready for prime time, imo. I have a yearly sub, and the product is just orders of magnitude behind Anthropic's offering. I use Code for real-world stuff and am happy with the results; Mistral is just not something I can trust right now.

Fnoord today at 12:39 AM

I use them solely.

nimchimpsky today at 12:24 AM

[dead]