This is a pet peeve of mine at work.
Any statistic, and I mean any, that someone throws at me, I will try to dig into. And if I'm able to, I will usually find that something is very wrong somewhere. Either the underlying data is just wrong, invalidating the whole thing, or the data is reasonably sound but the person doing the analysis is making incorrect assumptions about parts of it and then drawing incorrect conclusions.
I've frequently found, over a few decades, that numerical systems are cyclically 'corrected' until results and performance match prior expectations.
The errors usually don't stop there. Sometimes the actual results are wildly different from what a model expects, but the data treatment has been bug-hunted until it produces what was expected, and then attention fades away.
It seems to be an ever-present trait of modern business. There is no rigor, probably partly because most business professionals have never learned how to properly approach and analyze data.
Can't tell you how many times I've seen product managers making decisions based on a few hundred analytics events, trying to glean insight where there is none.
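To put a rough number on that last point, here's a minimal back-of-the-envelope sketch in Python, with made-up figures (300 events, an 8% conversion rate): the Wilson score interval on a proportion estimated from a few hundred events is wide enough to swallow most of the differences anyone might be reacting to.

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

# Hypothetical numbers: 24 conversions out of 300 analytics events (8%).
lo, hi = wilson_interval(24, 300)
print(f"observed 8.0%, 95% CI: {lo:.1%} to {hi:.1%}")  # roughly 5.4% to 11.6%
```

So the "true" rate could plausibly be anywhere from about 5% to about 12%, which means a shift from 8% to 10% in next week's dashboard is just noise, not insight.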