> AfterPack approaches this differently. Instead of layering reversible transforms on top of each other, AfterPack uses non-linear, irreversible transforms — closer to how a hash function works than how a traditional obfuscator works. The output is functionally equivalent to the input, but the transformation destroys semantic meaning in a way that cannot be reversed — even by AfterPack itself. There's no inverse function. No secret key that unlocks the original.
That’s probably fun when trying to analyze bugs occurring in production. :)
JS was never really obfuscated; hiding the code was never the goal of minification. Minifiers especially struggle with ES6 classes and the like, outputting code that is almost human-readable.
Proper obfuscation libraries exist, but typically at the cost of a pretty notable amount of performance that I'd wager most developers are not willing to sacrifice.
And as with even the best client-side DRM, everything can be reverse engineered: all the code has already been downloaded to the user's machine. It's one of the (IMO terrible) excuses for the SaaSification of all software.
Minification is not obfuscation and obfuscation is not security, but no amount of deobfuscation will recover the comments in the original source, which are often more insightful than the code itself.
It's a cat-and-mouse game; it provides the desired level of security for the people who use it. It isn't used (mostly, at least) to prevent people from finding vulnerabilities. It's used to deter competition, prevent clones of the application, etc.; it's makeshift "DRM". There are ways to defeat even AI-assisted analysis running in a proper browser, but I think it's not a good idea to give anyone ideas on this subject. Proper DRM is hellish enough.
Was there ever obfuscated JS code a human couldn't reverse given enough time? It's like most people's doors: it won't stop someone with a battering ram, but it will ideally slow them down enough for you to hide or get your guns. In this case, it won't even slow them down, until it does (hence: cat-and-mouse game).
JavaScript code is the essence of minified security.
Huh? Their justification for "obfuscation isn't security" is pointing out that the Claude source wasn't obfuscated, it was minified, and that it could be "deobfuscated by Claude itself", even though, again, they said the code wasn't obfuscated.
So I guess, ask Claude to deobfuscate some code that's ACTUALLY OBFUSCATED if you want to claim obfuscation provides ZERO additional security.
>We analyzed this file at AfterPack as part of a deobfuscation case study. What we found: it's minified, not obfuscated.
>Here's the difference. Minification — what every bundler (esbuild, Webpack, Rollup) does by default — shortens variable names and removes whitespace. It makes code smaller for shipping. It was never designed to hide anything.
>Here's where it gets interesting. We didn't need source maps to extract Claude Code's internals. We asked Claude — Anthropic's own model — to analyze and deobfuscate the minified cli.js file.
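To make the quoted distinction concrete, here's a toy sketch (hypothetical function names and values, not from the article) of the same logic in original, minified, and obfuscated forms:

```javascript
// Original source: clear names, readable structure.
function calculateDiscount(price, isMember) {
  const memberRate = 0.5; // hypothetical example rate
  return isMember ? price * memberRate : price;
}

// Minified (roughly what esbuild/Terser emit): shorter and denser,
// but the logic is still plainly visible.
const d = (p, m) => m ? p * .5 : p;

// Obfuscated (string-array/lookup indirection, a common obfuscator trick):
// literals and structure hide behind opaque names and table lookups.
const _0x1a = [0.5];
function _0x2b(p, m) { return m ? p * _0x1a[0] : p; }

// All three are functionally equivalent:
console.log(calculateDiscount(100, true)); // 50
console.log(d(100, true));                 // 50
console.log(_0x2b(100, true));             // 50
```

The minified form is why the article's finding isn't surprising: renaming variables loses labels, not structure, so a model (or a patient human) can reconstruct intent from the control flow alone.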
I successfully did this the other day. There was a web app I used quite a bit with an annoying performance issue (in some cases its graphics code would spin my CPU at 100% constantly, fans full-blast). I asked Claude to fetch the code and fed it a few performance traces I took through Firefox, and it cut through all those obfuscated variables like they weren't even there, easily re-interpreting what each function actually did, finding a plausible root cause and workaround (which worked).
Can you generally trust it to de-obfuscate reliably? No idea. My sample size is 1.
The _any_ part is not clear to me. Obfuscation is an arms race. Reverse engineers have always been tool-assisted. Now they just have new tools and the obfuscators need to catch up.
And read through native code as well
> No one talks about this. There's no VentureBeat headline about GitHub shipping email addresses in their JS bundles. No Hacker News thread about internal URLs exposed in Anthropic's CDN scripts
That's a huge sign none of that information is truly sensitive. What is being implied here?
> AI Makes This Urgent
No it doesn't. This is blogspam and media hype nobody is interested in. Unless the demographics have really shifted that much in the last few years, HN is one of the worst places to attempt this marketing style.
Slight historical note: it might be interesting to see how the brief period of "white-box cryptography" stands up to AI today. At the time there were a few companies with products that had trouble finding fit (for straightforward security reasons), but they were essentially commercial obfuscators that made heavy use of lookup tables, miniature virtual machines, and esolang concepts that worked mainly against human reverse engineers.
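For flavor, the lookup-table idea can be sketched in a few lines of JavaScript. This is a toy XOR "cipher", nothing like real white-box AES, but it shows the core move: the key is baked into a table at build time, so the shipped code carries only the table, never the key literal.

```javascript
// "Build step" (happens offline in a real white-box scheme):
// emit a table T where T[x] = x XOR key. The key itself is discarded.
const SECRET_KEY = 0x5a; // would never ship; shown here for illustration
const T = new Uint8Array(256);
for (let x = 0; x < 256; x++) T[x] = x ^ SECRET_KEY;

// "Shipped code": transforms bytes using only the table.
// (Real products chained many such tables and encoded the
// intermediate values so no single table revealed anything.)
function encrypt(bytes) {
  return bytes.map(b => T[b]);
}
```

Since XOR is its own inverse, applying `encrypt` twice round-trips a byte, which is also a handy sanity check on the table.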
An example was this early AES proposal: https://link.springer.com/chapter/10.1007/3-540-36492-7_17
Write your blog yourself if people are supposed to read it, not this LLM slop.
Nicholson entered the mantrap and the double doors closed behind him. He emptied his pockets and disrobed before donning the clean suit that had been provided to him by the orderlies. The camera watching him appeared satisfied that he was properly prepared and, more to the point, that the vendor was properly protected. The doors to the inner chamber opened and he proceeded into the hallway. He passed several doors until he reached the one that was labeled with the name of the vendor. He pressed the button on the doorframe. A satisfying tactile click, a spinning light illuminating around the button, a click, and then the door opened soundlessly. A single desk with a small chair and a computer terminal awaited him. He sat down and the screen turned on automatically. Finally, he was able to set about classifying his expenses from a recent trip to Tokyo. It was inconvenient, but a small price to pay to ensure that the vendor’s unique interfaces, their intellectual property, couldn’t be copied by the replication machines. Their eyes and their ears were everywhere in the outside world. Simply by seeing your software, these machines could copy its essence. The risks of operating software in the wild required that proprietary software be protected. Hidden away from eavesdroppers. Such was the world in 2037.