I don't think you're right about that; how would it work at all? Are you even sure how it works? I was able to scan all 21 perfectly, in less than a second each, on iOS 18, a couple of inches from a Retina display. I have no idea how it works, but it seems pretty robust to me.
I know people can't do it on some Android phones and apps. Okay. But it still seems pretty good, and not a fundamental flaw. It's absolutely not OK to issue a blanket recommendation against trying this. Why limit anyone? Just encourage everyone to try for themselves and get more data. It's premature to make these negative recommendations based on one test. Rather than short-circuiting further curiosity and exploration, expand it!
> I know people can't do it on some Android phones and apps. Okay. But it still seems pretty good, and not a fundamental flaw.
It might not seem like one to you, since you've got a shiny iPhone running the latest iOS, but maybe fundamentally it is? They're pretty and all, but accessibility, and not being ableist, is kind of a thing?
You were scanning a screen, which is perfectly flat, with no crumpling and no smudges. You were using a high-resolution camera close to a high-resolution display, on a relatively large QR code. That's a best-case scenario, and if that's all you care about, you'll be happy with it.
For now, anyway. If you are using the same scanner implementation that the author used to check their results, then the good results you are seeing are due to the generation algorithm being tuned to work with that particular implementation. But there's no guarantee that the implementation will stay the same in some future iOS. Say Apple makes a change to better recognise bleached codes or something, and suddenly some of the drop shadows are interpreted as 1s instead of 0s. They could do that, because it wouldn't require changing anything for codes that follow the spec. But it might break that pretty, pretty code that you just put on a million milk cartons.
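To make that concrete, here's a minimal sketch (purely hypothetical, not any real decoder's code, and the numbers are made up) of the kind of fragility I mean: a decoder typically binarises each sampled module against a luminance threshold, so a mid-grey drop shadow can land on either side of that threshold depending on where a given implementation puts it.

```python
# Hypothetical illustration: how a decoder's binarisation threshold decides
# whether a decorative drop shadow counts as a dark or light module.

def module_bit(luminance: float, threshold: float) -> int:
    """Return 1 (dark module) if the sampled luminance is below the threshold, else 0."""
    return 1 if luminance < threshold else 0

shadow_luminance = 0.55  # a made-up mid-grey value for a drop shadow pixel

# Today's decoder (assumed threshold 0.5): the shadow reads as a light module.
print(module_bit(shadow_luminance, threshold=0.5))  # -> 0

# A future decoder tuned for bleached/faded codes might raise the threshold,
# and the very same shadow now reads as a dark module.
print(module_bit(shadow_luminance, threshold=0.7))  # -> 1
```

A code that follows the spec uses near-black and near-white modules, so it decodes the same either way; the decorative one only works as long as the decoder's threshold happens to fall on the right side of those greys.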