Google's docs are pretty clear (https://developers.google.com/crawling/docs/robots-txt/robot...):
> Google's crawlers treat all 4xx errors, except 429, as if a valid robots.txt file didn't exist. This means that Google assumes that there are no crawl restrictions.
This is a better source than a random SEO dude with a channel full of AI-generated videos.
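The documented rule is simple enough to state as a one-line predicate. A minimal sketch (the helper name is mine, not anything from Google's docs):

```python
def assume_no_restrictions(status: int) -> bool:
    """Per Google's documented policy: any 4xx response for robots.txt,
    except 429 (Too Many Requests), is treated as if no robots.txt
    exists, i.e. no crawl restrictions apply."""
    return 400 <= status < 500 and status != 429
```

So a 404 or even a 403 means "crawl everything", while a 429 (and, per the same docs, 5xx errors) does not.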
The Google AdSense docs say that ads.txt is not mandatory, and yet I remember having no ads displayed on my website until I added one.
It's not unlikely that this is just a bug on Google's end.
It's fairly common for there to be a very long and circuitous route between cause and effect in search, so a bug like this can sometimes be difficult to identify until people start making blog posts about it.