how do you know it's benchmaxxed?
For one, they aren't using the latest versions of many of the benchmarks — e.g., ARC-AGI 2 rather than 3.
Meta's benchmaxxing tendencies are well known. Llama 4 was mega benchmaxxed, and nothing suggests to me that Meta's culture has changed.
Friends at Meta with access to the model + personal experience at Meta.
Meta's performance review process is essentially "show good numbers or you're out." So guess what people do when they don't have good numbers? They fudge them. It happens all across the company.