Hacker News

zahlman · 06/25/2025

... So if the installer isn't going to ignore the version requirements, and thereby install an unsupported package that causes a breakage, then there isn't a problem with "scripts being broken because people didn't pin their dependencies". The packages listed in the PEP 723 metadata are installed by an installer, which resolves the listed (unpinned) dependencies to concrete versions (including transitive dependencies), following the version constraints that each package declares.
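For anyone who hasn't seen it, a PEP 723 block is just a structured comment at the top of the script; a minimal sketch (the `requests` dependency and the version bounds here are only illustrative):

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [
#     "requests>=2.31",
# ]
# ///

# A PEP 723-aware runner (e.g. `uv run` or `pipx run`) reads the block above,
# resolves "requests>=2.31" to a concrete version, installs it, then runs this:
import requests

print(requests.get("https://example.com").status_code)
```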

I thought we were talking about situations in which following those rules still leads to a runtime fault. That is certainly possible, but in my experience it's a highly overstated risk. Packages that declare `foolib >= 3` will very often continue to work with foolib 4.0, and in the Python world that risk is commonly considered worth taking to avoid the other problems caused by specifying `foolib >=3, <4` (as described in e.g. https://iscinumpy.dev/post/bound-version-constraints/).
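To make the two styles concrete, here's a quick check using the `packaging` library (the `foolib` versions are hypothetical):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# An open lower bound lets the resolver pick the next major version;
# an upper cap keeps it on the 3.x series even if 4.0 would have worked fine.
assert Version("4.0") in SpecifierSet(">=3")
assert Version("4.0") not in SpecifierSet(">=3,<4")
```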

The real problem is that there isn't a good way (from the perspective of an intermediate dependency's maintainer) to update the metadata after you find out that a new version of a dependency further down the chain is incompatible. You can really only upload a new patch version (or one with a post-release segment in the version number) and hope that people haven't pinned their dependencies so strictly as to exclude the fix. (Although they shouldn't be doing that unless they also pin transitive dependencies!)
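The post-release mechanics, sketched with the `packaging` library (all versions made up):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# A post-release sorts after its base version but before the next patch:
assert Version("1.2.3") < Version("1.2.3.post1") < Version("1.2.4")

# Per PEP 440, an exact pin excludes the fixed-up upload,
# while a wildcard pin still picks it up:
assert Version("1.2.3.post1") not in SpecifierSet("==1.2.3")
assert Version("1.2.3.post1") in SpecifierSet("==1.2.3.*")
```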

That said, the end user can add constraints to Pip's dependency resolution by creating a constraints file and specifying it on the command line. (This was suggested as a workaround when Setuptools caused a bunch of legacy dependencies to explode; it's not really the same situation, though, because that's a build-time dependency for some packages that were only made available as sdists, even pure-Python ones. Ideally everyone would follow modern practice as described at https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-... , but sometimes the maintainers are entirely MIA.)
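A minimal sketch of that workaround, assuming the breakage arrived in Setuptools 66 (adjust the cap to the actual incident; `somepackage` stands in for whatever you're installing):

```sh
# A constraints file limits versions during resolution,
# without requesting any installs by itself:
echo "setuptools<66" > constraints.txt

pip install -c constraints.txt somepackage

# For build-time dependencies in isolated builds, the environment-variable
# form is what actually reaches the build environment:
PIP_CONSTRAINT=constraints.txt pip install somepackage
```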

> Numpy 2.0 is a very recent example that broke most code that used numpy.

This is fair to note, although I haven't seen anything like a source that would objectively establish the "most" part. The ABI changes in particular are only relevant for packages that were building their own C or Fortran code against Numpy.


Replies

nomel · 06/25/2025

> `foolib >= 3` will very often continue to work with foolib 4.0,

Absolute nonsense. It's an industry standard, widely accepted, that major versions are reserved for breaking changes. This is why you never see `>=` in any sane requirements list; you see `foolib == 3.*`. For anything you want to keep working for a reasonable amount of time, you see `foolib == 3.4.*`, because deprecations often still happen within a major version, breaking all code that used those functions.
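In the same vein, a sketch with the `packaging` library of what those pins actually admit (versions hypothetical):

```python
from packaging.specifiers import SpecifierSet
from packaging.version import Version

# Major-version pin: any 3.x release, never 4.x.
assert Version("3.7.2") in SpecifierSet("==3.*")
assert Version("4.0.0") not in SpecifierSet("==3.*")

# Minor-version pin: only 3.4.x, shielding against removals within a major version.
assert Version("3.4.9") in SpecifierSet("==3.4.*")
assert Version("3.5.0") not in SpecifierSet("==3.4.*")
```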
