Claude Code updates almost every day, sometimes multiple times.
One of these days Anthropic is going to be compromised and we’re all gonna be f*cked.
I can't wait to have no dependencies.
An extreme example: when I make interactive educational apps for my daughter now, I just have Opus use plain JS and HTML; from double pendulums to fluid simulations, it works one-shot. Before, I had hundreds of dependencies.
Luckily, with MIT-licensed code I can just tell Opus to extract exactly the pieces I need, embed them, and tweak them for my use case. So far this works great for hobby projects, but hopefully in the future production software will have no dependencies.
A repository search shows 2.2K repos with the text "A Mini Shai-Hulud has Appeared", all created within the past day:
https://github.com/search?q=A%20Mini%20Shai-Hulud%20has%20Ap...
This week I was wondering whether using uv for managing Python versions is a good idea.
From their website [1]
> Python does not publish official distributable binaries. As such, uv uses distributions from the Astral python-build-standalone project. See the Python distributions documentation for more details.
It points to this GitHub repo https://github.com/astral-sh/python-build-standalone which mentions this other link https://gregoryszorc.com/docs/python-build-standalone/main/r...
If I understand correctly, the source code for building Python is not fetched directly from python.org. I'm not so sure how secure that is.
I have the same concern for asdf [2]. However, they use pyenv [3], which, I think, feels more official.
Can someone clarify this? Which tool is better/more secure for installing python: uv or asdf?
[1] https://docs.astral.sh/uv/guides/install-python/
[2] https://github.com/asdf-community/asdf-python
[3] https://github.com/pyenv/pyenv/tree/master/plugins/python-bu...
When I was doing the Fast.AI Deep Learning course, I was surprised by the number of Python dependencies machine learning projects bring in. Web front-end projects were always considered very heavy on third-party dependencies, but to me the machine learning ecosystem looks even more entangled. In addition, unlike web development, which is considered security critical and has accumulated a lot of wisdom and good security practices over the years, machine learning development looks much more ad hoc, with many common software engineering practices not applied.
For example, at that time, one way to distribute machine learning models was via Python pickles, which can execute arbitrary code when loaded, with no restrictions built in. Models in this format could do anything on the computer where they were imported. Such an early 'wild-west' ecosystem can definitely make security compromises easier and the resulting supply chain attacks more common.
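To make the pickle point concrete, here is a minimal sketch of why loading an untrusted pickle is equivalent to running untrusted code: pickle calls an object's `__reduce__` during deserialization, so a crafted payload executes the moment it is loaded. The `Payload` class is a hypothetical stand-in for a malicious "model" file.

```python
# Sketch: why unpickling untrusted model files is dangerous.
# pickle invokes __reduce__ on load, so a crafted object can run
# arbitrary code the moment the "model" is deserialized.
import pickle

class Payload:
    def __reduce__(self):
        # pickle will call the returned callable with these args on load.
        # A real attack would call os.system or similar instead of print.
        return (print, ("arbitrary code ran during pickle.load",))

blob = pickle.dumps(Payload())
pickle.loads(blob)  # triggers the call -- no actual model needed
```

This is why safer serialization formats (e.g. plain weight tensors) have since displaced raw pickles for model distribution.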
Most of my pip installs come from Claude Code suggesting them now and me just hitting enter. Model was trained months ago, so it has no clue what got compromised this week. We built the worst possible filter for "is this package safe right now".
Bless the Maker and His water.
Thanks to the community for reporting the security issues with PyTorch Lightning 2.6.2 and 2.6.3 - we're actively looking into it.
In the meantime, please use 2.6.1 until we publish 2.6.4.
For more details: https://github.com/Lightning-AI/pytorch-lightning/security/a...
On GitHub, I saw this message from April 20, and I’m a bit confused.
"deependujha hi @thebaptiste, thanks for inquiring. Release of 2.6.2 is blocked due to some internal reasons. Will notify once release is made. "
I'd hate it if they knew of the problem that long ago and didn't warn until now. If someone has more info and can clarify I'd be thankful.
https://github.com/Lightning-AI/pytorch-lightning/issues/216...
Not a security guy here. How did the dependency get compromised, exactly? Did they submit a PR into the main repo at github and it was approved by the maintainers? Or just host compromised versions in other mirrors?
> Running pip install lightning is all that is needed to activate
FYI, pip added cooldowns in 26.1:
* https://discuss.python.org/t/announcement-pip-26-1-release/107108
* https://ichard26.github.io/blog/2026/04/whats-new-in-pip-26.1/
To use:
* CLI: pip install --uploaded-prior-to=P1D ...
* Env var: PIP_UPLOADED_PRIOR_TO=P1D pip install ...
* Config: pip config set global.uploaded-prior-to P1D

Just to clarify, it's not PyTorch, it's the library from this Lightning AI company?
I was one of probably eight people who played the Emperor: Battle for Dune RTS game, and I always think of the Fremen character sound bite whenever I see the Old Man of the Desert’s true name invoked:
”…for Shai-Hulud!!!”
The nixpkgs package from unstable seems to be infected, as it's at 2.6.2: https://search.nixos.org/packages?channel=unstable&include_h...
I'm curious what they do with various kinds of credentials if they get access.
I can see trying to steal crypto, but what do they do if they get some AWS credentials? Try to run some crypto mining instances? Try to use your account for other types of crimes? Or is it mainly trying to steal data and then ask for ransoms?
Advisory, fresh from the oven:
https://github.com/Lightning-AI/pytorch-lightning/security/a...
The decision to run all of my experiments in a monorepo with a single uv.lock continues to be validated. I usually only update it a few times a year. It was pinned at 2.6.1 for lightning \o/
Looks like coding is in a downward spiral towards complete chaos
I find this constant churn in the software world to be tiresome. I get it if there is a security update. Or you are building something new; it takes time and a series of updates to reach feature parity on 1.0. But most software is not like that. All these online registries make the problem worse. Any random tool installation pulls in 300 different dependencies.
This is why I have been building, for my own usecases, a new language + compiler + vm that is completely source based. The compiler does not understand linking. You must vendor every single dependency you use, including the standard library, so that it makes its way into the bytecode. The register VM itself is a few thousand lines of freestanding C. Any competent programmer can audit it over a weekend.
v1 deliberately keeps FFI (outside of a bounded set of linux syscalls) outside the current spec as libc has the habit of infecting everything it touches and I want to keep Vm0 freestanding. The last time I compiled the VM, it produced a 70KB binary and supported a loader with structural verification, the entire instruction set using a threaded interpreter, a simple Cheney+MS GC, concurrency via an Erlang-style M:N scheduler working on a single thread, and 20-odd marshaled functions.
Most software in the world does not need anything more than this. Everyone acts as if they are building the next Google.
Always run third party code inside a sandbox
Shai-Hulud strikes again and continues to turn innocent packages into zombies.
Think twice before pulling in a package, and most importantly, always pin your dependencies.
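On pinning: a lockfile only protects you if the environment actually matches it. Here is a minimal sketch of a check that installed versions agree with `pkg==version` pins; the function names and the `lightning==2.6.1` pin are illustrative, not from any real tool.

```python
# Sketch: verify that installed packages match exact `pkg==ver` pins.
from importlib import metadata

def parse_pins(lines):
    """Extract {name: version} from `pkg==ver` lines; skip comments and ranges."""
    pins = {}
    for line in lines:
        line = line.split("#", 1)[0].strip()
        if "==" in line:
            name, version = line.split("==", 1)
            pins[name.strip().lower()] = version.strip()
    return pins

def check_pins(lines):
    """Return (name, pinned, installed) tuples for every mismatch."""
    mismatches = []
    for name, pinned in parse_pins(lines).items():
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            installed = None
        if installed != pinned:
            mismatches.append((name, pinned, installed))
    return mismatches

if __name__ == "__main__":
    pins = ["lightning==2.6.1  # pinned before the compromised 2.6.2/2.6.3"]
    for name, pinned, installed in check_pins(pins):
        print(f"{name}: pinned {pinned}, installed {installed}")
```

Note that pinning alone doesn't help against an attacker who re-uploads over an existing version; hash pinning (e.g. pip's `--require-hashes` mode) closes that gap.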
Is there some string to recursively grep for to know if you have been infected?
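One indicator from this wave is the "A Mini Shai-Hulud has Appeared" string mentioned above. A minimal sketch of a recursive scan for it follows; the `MARKER` constant and `scan` helper are my own naming, and finding nothing is NOT proof you're clean, since payloads change between waves.

```python
# Sketch: recursively search a directory tree for one known marker string.
# This catches only this single indicator -- it is a heuristic, not a scanner.
from pathlib import Path

MARKER = "A Mini Shai-Hulud has Appeared"

def scan(root):
    """Yield paths of files under `root` whose text contains the marker."""
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            text = path.read_text(errors="ignore")
        except OSError:
            continue  # unreadable file; skip it
        if MARKER in text:
            yield path
```

Usage would be something like `for hit in scan("."): print(hit)` from the root of a checkout. A plain `grep -rl "A Mini Shai-Hulud has Appeared" .` does the same thing.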
Maybe now people can stop blaming npm and realize none of these unreviewed package ecosystem are safe.
something something Safety Requires A Building Code something thing
Am I the only one who thought that using GitHub links as a dependency source is not a wise thing to do?
Do folks not understand that by doing so, you're enabling modules to maliciously write themselves into your code?
[flagged]
Another exploit Mythos didn't find. Isn't the god machine kind of failing us?
This might just be the frequency illusion at play, but there seem to have been a number of high-profile supply chain attacks of late in major packages. There are several articles on the first few pages of HN right now with different cases.
Looking back ten years to `left-pad`, are there more successful attacks now than ever? I would suspect so, and surely the value of a successful attack has also increased, so are we actually getting better as a broad community at detecting them before package release? It's a complex space, and commercial software houses should do better, but it seems that whilst there are some excellent commercial products (e.g. CI scan tools), generally accessible, idiot-friendly tooling is somewhat lacking for projects which start as hobby/amateur code but end up being a dependency in many other projects.
I've cross-posted my comment from the current SAP supply chain attack thread [0].
[0]: https://news.ycombinator.com/item?id=47964003