From a practical standpoint, F# is a far better option than the alternatives. Simply by virtue of running on .NET it has access to a very wide selection of libraries, which makes library availability a non-issue when deciding how to solve a particular business case. It also has an alternative compiler, Fable, which targets JavaScript and allows F# to be used on the front end.
Other options have worse support and weaker tooling, and often not even a more open development process (for example, you can see and contribute to ongoing F# work on GitHub).
The tired opinion that ".NET is bad because Microsoft is bad" has zero practical relevance to actually using C#, and even less to F#, and it honestly needs to die out because it borders on mental illness. You can hate Microsoft products, as I do, and still judge a particular piece of technology, and the people who work on it, on their merits.
I was asking whether OCaml can target specific integer sizes and signedness, as F# does. I would like to write precise software that works with the specific kinds of ints that compilers have historically provided to match what microprocessors process, but in a functional language instead of C or C++.
The F# folks (including Don Syme) did a fantastic job on the early versions of the language (I used it up to version 2 or early 3), but I am tired of the corporate engine that funds that ecosystem. I now build my software on an operating system of a different pedigree. Such considerations are important to me, but thanks for sharing your preference. As for me, I hate nothing and no one, but I am as picky as a poor man can be about whom I choose to rely on for my tools.
As for your opinion on the borderlands of mental illness, I'll contact you at your Outlook address should I ever want your views on such differently-technical topics. But I was only asking whether the OCaml compiler can target specific varieties of ints, as the .NET compiler does.
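On the actual question, a minimal sketch of what stock OCaml offers, assuming only the standard library (no opam packages): the default int is 63-bit on 64-bit platforms, and exact 32/64-bit widths come from the boxed Int32, Int64 and Nativeint modules. There are no built-in 8/16-bit or unsigned types; those are usually emulated by masking a plain int, or pulled in from a third-party package such as stdint.

    (* Stock OCaml: `int` is 63-bit on 64-bit platforms (one bit is a tag);
       Int32, Int64 and Nativeint give exact widths, but they are boxed. *)
    let a : int32 = 0x7FFFFFFFl       (* 32-bit literal, suffix `l` *)
    let b : int64 = 42L               (* 64-bit literal, suffix `L` *)

    (* Arithmetic goes through the module functions and wraps on overflow. *)
    let wrapped = Int32.add a 1l      (* wraps to -2147483648l *)

    (* No built-in int8/uint8/int16/uint16/uint32/uint64 in the compiler:
       emulate by masking a plain int, or use an opam package such as
       `stdint` (assumed here, not shown). *)
    let as_uint8 n = n land 0xFF

    let () =
      Printf.printf "wrapped = %ld, as_uint8 300 = %d\n" wrapped (as_uint8 300)

So the sized and unsigned types F# gets for free from .NET exist in OCaml too, but mostly as library types rather than compiler-level primitives.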