If that was a jab at my writing then yes, I am absolutely being sincere, because I am an expert on this topic. LLMs went from being okay at one-shotting a function to being so good at hacking that it's difficult to evaluate them. Prospective customers get back to us after a demo and tell us about the exploits it found on their services, ones so obscure and technical that they wouldn't have thought to look for them.