PYX: The next step in Python packaging
August 13, 2025
monster_truck
woodruffw
For what it's worth, I understand this concern. However, I want to emphasize that pyx is intentionally distinct from Astral's tools. From the announcement post:
> Beyond the product itself, pyx is also an instantiation of our strategy: our tools remain free, open source, and permissively licensed — forever. Nothing changes there. Instead, we'll offer paid, hosted services that represent the "natural next thing you need" when you're already using our tools: the Astral platform.
Basically, we're hoping to address this concern by building a separate sustainable commercial product rather than monetizing our open source tools.
abdullahkhalids
I believe that you are sincere and truthful in what you say.
Unfortunately, the integrity of employees is no guard against the greed of investors.
Maybe next year investors change the CEO and entire management and they start monetizing the open source tools. There is no way of knowing. But history tells us that there is a non-trivial chance of this happening.
dandellion
Yes, at every company I have been at with investors, sooner or later they make the calls; what the founders wanted never has much weight a few years in.
adr1an
A week later a fork appears and everyone is happy (except those investors, CEO, and management, but it never was about their happiness so...)
wrasee
The uncertainty over a future rug pull is always real, but I wonder if the actual reason for people's hesitancy is more than just that. I suspect it's closer to one of identity and the ownership model itself. Just the very idea that core tooling you depend on is in the hands of a commercial company is enough to make many back off, in a way they might not when the tooling is in the hands of a broader community that one can support on more equal terms.
@woodruffw I love your quote above that commits you to your open source base, and I'm rooting for you. But how about an approach that commits you to this sentence in a more rigorous and legal way: spin off your open source tooling to a separate community-based entity? Of course, on top of that you can continue to maintain sufficient representation to make Astral's commercial products the natural progression, and otherwise the model remains the same. That would be a significant transfer of control, but it is that very transfer that would get an overwhelming response from the community and could really unblock these great tools for massive growth.
I work a lot with LLVM/Clang, and whilst I know Apple and Google are significant contributors, I feel confident that LLVM itself exists outside of that, yet accept that e.g. Apple's contributions afford them weight to steer the project in ways that match their interests in e.g. Swift and Apple tooling.
BrenBarn
It makes sense, but the danger can come when non-paying users unwittingly become dependent on a service that is subsidized by paying customers. What you're describing could make sense if pyx is only private, but what if there is some kind of free-to-use pyx server that people start using? They may not realize they're building on sand until the VC investors start tightening the screws and insist you stop wasting money by providing the free service.
(Even with an entirely private setup, there is the risk that it will encourage too much developer attention to shift to working within that silo and thus starve the non-paying community of support, although I think this risk is less, given Python's enormous breadth of usage across communities of various levels of monetization.)
physicsguy
Conda said all this as well, and solved the same issues you're trying to solve, namely precompiled versions of difficult-to-build packages. It then went commercial.
In the HPC space there are already EasyBuild and Spack, which make all the compiler toolchain and C and Fortran library dependency stuff very easy. They just haven't taken off outside HPC as they aim to solve cluster management problems, but Spack is easy to self-serve.
igortg
Yes, it went commercial. But Conda is still free. And the community has built conda-forge to maintain packages. Conda + conda-forge can completely replace Anaconda for most people.
So even though they went commercial, they left pretty good things behind for the open source community.
o11c
The entire reason people choose "permissive licenses" is so that it won't last forever. At best, the community can fork the old version without any future features.
Only viral licenses are forever.
woodruffw
I don't think this is true -- a license's virality doesn't mean that its copyright holders can't switch a future version to a proprietary license; past grants don't imply grants to future work under any open source license.
actinium226
By viral, do you mean licenses like GPL that force those who have modified the code to release their changes (if they're distributing binaries that include those changes)?
Because FWIW CPython is not GPL. They have their own license but do not require modifications to be made public.
bigstrat2003
This is just plain false and honestly close-minded. People choose permissive licenses for all sorts of reasons. Some might want to close it off later, but lots of people prefer the non-viral nature of permissive licenses, because it doesn't constrain others' license choice in the future. Still others think that permissive licenses are more free than copyleft, and choose them for that reason. Please don't just accuse vast groups of people of being bad-faith actors just because you disagree with their license choice.
krupan
I think you are making a good point, but please don't use the old Steve Ballmer FUD term, "viral." Copyleft is a better term.
thayne
Or they want to get more people to use it.
zemo
> Basically, we're hoping to address this concern by building a separate sustainable commercial product rather than monetizing our open source tools.
jfrog artifactory suddenly very scared for its safety
mcdow
Only a matter of time before someone makes something better than Artifactory. It’s a low bar to hit imho.
cyberax
They've been the only game in town for a while, and their pricing reflects it. But this project is only for Python (for now?) so JFrog is not _immediately_ in danger.
threatofrain
The hosting and administration part is what’s expensive and can’t be free and open source except when someone pays for it. So is npm open source? Whether it is or isn't doesn't matter as much as whether Microsoft continues to pay the bill.
zahlman
Will pyx describe a server protocol that could be implemented by others, or otherwise provide software that others can use to host their own servers? (Or maybe even that PyPI can use to improve its own offering?) That is, when using "paid, hosted services like pyx", is one paying for the ability to use the pyx software in and of itself, or is one simply paying for access to Astral's particular server that runs it?
woodruffw
I might not be following: what would that protocol entail? pyx uses the same PEP 503/691 interfaces as every other Python index, but those interfaces would likely not be immediately useful to PyPI itself (since it already has them).
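(For the curious: the "simple" index interface referenced here is plain HTTP. A minimal sketch of querying the PEP 691 JSON form of it, assuming the `requests` library and PyPI's public endpoint; any compliant index serves the same shape:)

```python
# Minimal sketch: fetch a project's PEP 691 JSON Simple API page.
import requests

resp = requests.get(
    "https://pypi.org/simple/uv/",
    headers={"Accept": "application/vnd.pypi.simple.v1+json"},
    timeout=10,
)
resp.raise_for_status()

# Each "file" entry is an installable artifact (wheel or sdist) with hashes.
for f in resp.json()["files"][:3]:
    print(f["filename"], f["hashes"].get("sha256", "?")[:12])
```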
> or is one simply paying for access to Astral's particular server that runs it?
pyx is currently a service being offered by Astral. So it's not something you can currently self-host, if that's what you mean.
sevensor
It’s great that astral are doing what they do, but it’s important to hedge against their success. We must maintain ecosystem fragmentation, or Astral becomes a choke point where a bad actor can seize the entire Python community and extract rent from it. Like Anaconda but successful. So keep using pylint, keep using mypy, keep using PyPI. Not exclusively, but as a deterrent against this kind of capture.
crystal_revenge
I love using uv, but having worked for a VC funded open source startup, your concerns are spot on.
As soon as there is a commercial interest competing with the open source project at the same company the OSS version will begin to degrade, and often the OSS community will be left in the dark about this. The startup I was at had plenty of funding, far too many engineers, and still removed basically every internal resource from the oss project except one person and drove out everyone working on the community end of things.
I would also recommend avoiding working for any open source startup if your goal is to get paid to contribute to a community project. Plenty of devs will take a reduced salary to work on a great community project, but most of the engineers I saw definitely got the "bait and switch" and moved immediately to commercial projects.
mnazzaro
This is a valid concern, but astral just has an amazing track record.
I was surprised to see the community here on HN responding so cautiously. Been developing in python for about a decade now- whenever astral does something I get excited!
halfcat
> This is a valid concern, but astral just has an amazing track record.
The issue is, track record is not relevant when the next investors take over.
IshKebab
I agree in principle, but in this case uv is open source and SO MUCH better than pip it would be insane not to use it on those grounds.
With uv the worst case is it goes closed source in a few years and we all switch to a fork.
With pip the best case is that maybe in 10 years they have improved it to fix all the issues that uv has already fixed, and you only spend 10 years using shitty software and putting up with endless bugs and paper cuts.
JimDabell
Frankly, it’s weird. You can find this business model all over the open-source world but for some reason Astral in particular is singled out for way more criticism on this than anything else I’ve seen, despite being unambiguously great contributors who have never put a foot wrong as far as I can tell.
Microsoft – who invented embrace, extend, and extinguish – own NPM, but I don’t see people wringing their hands over them in every thread that mentions NPM. But you mention Astral here or on Reddit and people line up to tell you it’s only a matter of time before they fuck people over. Why the disparity?
aragilar
NPM has always been commercial (rather than managed by a foundation), and it was nominally acquired by GitHub rather than Microsoft, so at some level, as long as GitHub is not causing issues (though the recent GitHub changes should maybe also prompt some consideration of problems for NPM), NPM is "safe".
Astral on the other hand has basically been rewrites in Rust of existing community-based open source tools, for which there is always the question of how such work is funded. PYX (which is an interesting choice of name given the conflicts with pyrex/cython filenames) from what we can see here appears to be in a similar vein, competing with PyPI and making changes which seemingly require their client (uv) be used.
Anaconda/ContinuumIO was also treated with similar suspicion to Astral, so I don't think it's Astral in particular, it's more they both are operating in the part of the ecosystem where it is comparatively easy to lock out community-based open source tools (which the Python ecosystem appears to have been better at setting up and maintaining than the JS ecosystem).
unmole
HN comments default to cynicism.
nerdponx
Has that ever happened in the Python ecosystem specifically? It seems like there would be a community fork led by a couple of medium-size tech companies within days of something like that happening, and all users except the most enterprise-brained would switch.
ActorNightly
I agree. If any of the stuff was worthwhile to pursue, it would be merged into pip.
zahlman
Pyx represents the server side, not the client side. The analogue in the pre-existing Python world is PyPI.
Many ideas are being added to recent versions of pip that are at least inspired by what uv has done — and many things are possible in uv specifically because of community-wide standards development that also benefits pip. However, pip has some really gnarly internal infrastructure that prevents it from taking advantage of a lot of uv's good ideas (which in turn are not all original). That has a lot to do with why I'm making PAPER.
For just one example: uv can quickly install previously installed packages by hard-linking a bunch of files from the cache. For pip to follow suit, it would have to completely redo its caching strategy from the ground up, because right now its cache is designed to save only download effort and not anything else about the installation process. It remembers entire wheels, but finding them in that cache requires knowing the URL from which they were downloaded. Because PyPI organizes the packages in its own database with its own custom URL scheme, pip would have to reach out to PyPI across the Internet in order to figure out where it put its own downloads!
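(To illustrate the idea, not uv's actual code: a hard-link install is roughly the following, assuming a hypothetical content-addressed cache directory of unpacked wheels.)

```python
# Illustrative sketch only: recreate a cached, unpacked wheel inside a
# venv's site-packages via hard links instead of copying. Both paths end
# up pointing at the same inodes, so "installation" is nearly free.
import os
from pathlib import Path

def link_tree(cached_pkg: Path, site_packages: Path) -> None:
    for src in cached_pkg.rglob("*"):
        dest = site_packages / src.relative_to(cached_pkg)
        if src.is_dir():
            dest.mkdir(parents=True, exist_ok=True)
        elif not dest.exists():
            # Requires cache and venv on the same filesystem;
            # a real tool would fall back to copying otherwise.
            os.link(src, dest)
```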
notatallshaw
> However, pip has some really gnarly internal infrastructure that prevents it from taking advantage of a lot of uv's good ideas (which in turn are not all original).
FWIW, as a pip maintainer, I don't strongly agree with this statement. I think if pip had the same full-time employee resources that uv has enjoyed over the last year, a lot of these issues could be solved.
I'm not saying here that pip doesn't have some gnarly internal details, just that the bigger thing holding it back is the lack of maintainer resources.
> For just one example: uv can quickly install previously installed packages by hard-linking a bunch of files from the cache. For pip to follow suit, it would have to completely redo its caching strategy from the ground up, because right now its cache is designed to save only download effort and not anything else about the installation process.
I actually think this isn't a great example, evidenced by the lack of a download or wheel command from uv due to those features not aligning with uv's caching strategy.
That said, I do think there are other good examples of your point, like uv's ability to prefetch package metadata. I don't think we're going to be able to implement that in pip any time soon, probably due to the need for a complete overhaul of the resolver.
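(Conceptually, "prefetching" here just means overlapping network round-trips instead of fetching metadata one candidate at a time. A rough sketch, assuming the `requests` library; this is not pip's or uv's actual implementation:)

```python
# Rough sketch of metadata prefetching: fire off requests for likely
# candidates concurrently rather than serially during resolution.
from concurrent.futures import ThreadPoolExecutor
import requests

def simple_page(project: str) -> dict:
    resp = requests.get(
        f"https://pypi.org/simple/{project}/",
        headers={"Accept": "application/vnd.pypi.simple.v1+json"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

likely_candidates = ["numpy", "requests", "rich"]
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = {p: pool.submit(simple_page, p) for p in likely_candidates}
    pages = {p: f.result() for p, f in futures.items()}

print({p: len(page["files"]) for p, page in pages.items()})
```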
BrenBarn
> For just one example: uv can quickly install previously installed packages by hard-linking a bunch of files from the cache.
Conda has been able to do this for years.
woodruffw
This doesn't generalize: you could have said the same thing about pip versus easy_install, but pip clearly has worthwhile improvements over easy_install that were never merged back into the latter.
benrutter
I'm assuming by pip you mean PyPI (the package registry) - I think you're making the mistake of thinking it has a lot more resources than it does, because of how much it punches above its weight.
PyPI is powered by Warehouse, which has around 3 developers maintaining it[0]. They're doing an awesome job, but the funding and resources available to them are probably substantially less than Astral could have with a paid offering.
nilamo
Pip has been broken for years, and they're uninterested in fixing search, or even removing it or replacing it with a message/link to the package index.
imo, if pip's preference is to ship broken functionality, then what is/is not shipped with pip is not meaningful.
woodruffw
This is not a charitable interpretation. The more charitable read is that fixing search is non-trivial and has interlocking considerations that go beyond what pip's volunteer maintainers reasonably want to or can pick up.
(And for the record: it isn't their fault at all. `pip search` doesn't work because PyPI removed the search API. PyPI removed that API for very good reasons[1].)
sharpshadow
Your comment gave me a flashback to when I started programming and dragged and dropped downloaded Python packages into the packages folder instead of installing them.
lijok
It’s not that complex - just try it
rmonvfer
To be honest, this was just a matter of time. As a long time Python developer, I just can't wrap my head around the lack of something like this. GitHub was going to get hosted packages for Python but never did because it "didn't align with their strategy objectives and a reallocation of resources" [1] (or some similar corpospeak).
Astral is a great company and I think we can't question what they've achieved and provided to the Python community. uv is a game changer and solves one of the core issues with Python by providing a unified tool that's also fast, reliable and easy to use. In fact, after using uv for the first time (coming from a combination of pyenv + poetry) I never wanted to go back, and this is something all of my peers have experienced too.
I'm glad it's Astral who is doing this, and of course they will have to make money one way or another (which is perfectly fine, and I don't think anyone on this forum can be against that as long as they are actually providing real value), but I was honestly tired of the paralysis on this matter. I did try to build a registry (pyhub.net), but being just one person with almost no resources and having another full time business made it impossible.
Anyway, congrats to the team for the effort! [1] https://github.com/orgs/community/discussions/8542
westurner
Is this problem also solved by storing software artifacts in OCI container image registries that already support SLSA-compliant TUF signatures?
miraculixx
Anaconda solved the same problem ~10+ years ago already.
in9
HAHAHAH don't even get me started on how bad Anaconda is: how slow the installer and interpreter are, how they avoided being part of the usual pip workflow, the bloated environments, cross-platform inconsistencies, extremely slow dependency resolution, etc etc etc...
tylfin
Posit has solved similar problems with their Package Manager as well, the benefit being that it's hosted on-prem, but the user has to build wheels for their desired architecture (if they're not on pypi).
hexo
To be honest, I'll never use uv. Python ecosystem tools should be in Python.
jpambrun
This is very close-minded. It's best to avoid statements like that.
I feel like having a working Python environment shouldn't be a prerequisite for managing your Python environment.
gryn
Is this ragebait?
Most of the stuff in the Python ecosystem has a core built in C, including the language itself.
Kevcmk
Definitely has never seen/used uv
cma256
Wait until you find out what language Python is written in.
LucasOe
Why?
mbonnet
this is the kind of statement that will get anyone who works for me PIPd
ljm
This is quite a disappointing self-limitation given the improvements uv brings to the table. You’re missing out on some good stuff.
runningmike
All python packaging challenges are solved. The lesson learned is that there is not a single solution for all problems. Getting more strings attached to VC-funded companies and leaning on their infrastructure is a high risk for any FOSS community.
bastawhiz
Well, I started with pip because it's what I was told to use. But it was slow and had footguns. And then I started using virtualenv, but that only solved part of the problem. So I switched to conda, which sometimes worked but wrecked my shell profile and often led to things mysteriously using the wrong version of a package. So someone told me to use pipenv, which was great until it was abandoned and picked up by someone who routinely broke the latest published version. So someone told me to use poetry, but it became unusably slow. So I switched back to pip with the built-in venv, but now I have the same problems I had before, with fewer features. So I switched to uv, because it actually worked. But the dependency I need is built and packaged differently for different operating systems and flavor of GPU, and now my coworkers can't get the project to install on their laptops.
I'm so glad all the Python packaging challenges are "solved"
Eduard
I started with "sudo apt install python" a long time ago and this installed python2. This was during the decades-long transition from python2 to python3, so half the programs didn't work so I installed python3 via "sudo apt install python3". Of course now I had to switch between python2 and python3 depending on the program I wanted to run, that's why Debian/Ubuntu had "sudo update-alternatives --config python" for managing the symlink for "python" to either python2 or python3. But shortly after that, python3-based applications also didn't want to start with python3, because apt installed python3.4, but Python developers want to use the latest new features offered by python3.5 . Luckily, Debian/Ubuntu provided python3.5 in their backports/updates repositories. So for a couple of weeks things sort of worked, but then python3.7 was released, which definitely was too fresh for being offered in the OS distribution repositories, but thanks to the deadsnakes PPA, I could obtain a fourth-party build by fiddling with some PPA commands or adding some entries of debatable provenance to /etc/apt/lists.conf. So now I could get python3.7 via "sudo apt install python3.7". All went well again. Until some time later when I updated Home Assistant to its latest monthly release, which broke my installation, because the Home Assistant devs love the latest python3.8 features. And because python3.8 wasn't provided anymore in the deadsnakes PPA for my Ubuntu version, I had to look for a new alternative. Building python from source never worked, but thank heavens there is this new thing called pyenv (cf. pyenv), and with some luck as well as spending a weekend for understanding the differences between pyenv, pyvenv, venv, virtualenv (a.k.a. python-virtualenv), and pyenv-virtualenv, Home Assistant started up again.
This wall of text is an abridged excursion of my installing-python-on-Linux experience.
There is also my installing-python-on-Windows experience, which includes: official installer (exe or msi?) from python.org; some Windows-provided system application python, installable by setting a checkbox in Windows's system properties; NuGet, winget, Microsoft Store Python; WSL, WSL2; anaconda, conda, miniconda; WinPython...
ilvez
I understand this is meant as caricature, but for local development, tools like mise or asdf are really something I've never looked back from. For containers it's either a versioned Docker image or compiling yourself.
pletnes
I think I have a similar experience in some ways, but building Python from source should work on Linux in my experience. On a Debian-ish system I'd expect that apt-installing build-essential and the libraries you need should get you there. I've done it with some pain on Red Hat-ish distros, which have tended to ship with Python versions older than the ones I have experience with. (I guess it's better these days?)
progval
I started at about the same time you did, and I've never seen an instance of software expecting a Python version newer than what is in Debian stable. It happens all the time for Nodejs, Go, or Rust though.
hulitu
Your comment shows the sad state of software quality these days. Rust is the same: move fast and break things. And lately Mesa has also started to suffer from the same disease. These days you basically need the same build environment as the one on the developer's machine or the build will fail.
seriocomic
I've walked the same rocky path and have the bleeding feet to show for it! My problem is that now my packaging/environment mental model is so muddled I frequently mix up the commands...
integralid
What's wrong with just using virtualenv? I never used anything else, and I never felt the need to. Maybe it's not as shiny as the other tools, but it just works.
rcleveng
The problem is you can do whatever you want in it, and then have no way of reproducing that.
pyproject.toml tries to fix it, poetry kept wanting to use their own wonky names for the tags, I'm not sure why.
Once that is standardized, venvs should be cattle and not pets. That is all that is needed. uv makes that fast by hardlinking in the libraries and telling you the obvious (that venvs should be cattle and not pets).
This fight was poetry's to lose.
divbzero
There’s nothing wrong with just using virtualenv. I too have used virtualenv plus pip (and sometimes pyenv) for the longest time without issue.
However, uv is the first alternative that has tempted me to switch. uv offers performance improvements over pip and handles pyenv use cases as well. I’ll be paying attention to pyx to see how it pans out.
joshvm
Nothing is inherently wrong with virtualenv. All these tools make virtual environments and offer some way to manage them. But virtualenv doesn't solve the problem of dependency management.
btreecat
> But the dependency I need is built and packaged differently for different operating systems and flavor of GPU, and now my coworkers can't get the project to install on their laptops.
This is why containers are great IMO.
It's meant to solve the problem of "well it works on my machine"
frollogaston
Even the way you import packages is kinda wack
voicedYoda
You forgot the wheels and eggs
smellf
You can have my `easy_install` when you pry it from my cold dead fingers.
yulyavaluy
We actually built a platform that eliminates all these steps; you can now reproduce GitHub repos with zero manual config in 60% of cases. Check https://x.com/KeploreAI for more info. We just launched it and are waiting for our first users to be astonished :). Let me know if you have any questions.
computershit
> All python packaging challenges are solved.
This comes across as uninformed at best and ignorant at worst. Python still doesn't have a reliable way to handle native dependencies across different platforms. pip and setuptools cannot be the end all be all of this packaging ecosystem nor should they be.
_the_inflator
„across different platforms“
First things first:
Import path, os
I love Python, the ZEN of it, and you really need to accept the fact that there are conventions - quite a lot and that bash or shell scripts are where the magic happens, like environmental variables, if you know how to secure your app.
Even the self thing finally makes sense after years of bewilderment (“Wait: not even Java is that brutal to its users.”)
Lately stumbled over poetry after really getting the gist out of venv and pip.
Still hesitant, because Windows doesn’t play a role.
benreesman
Try doing CUDA stuff. It's a chemical fire. And the money that solving it would make could fund arbitrary largesse towards OSS in perpetuity.
dirkc
I see VC money as an artificial force propping up a project. It is not bad per se, but VC money is not a constant and it leaves a big drop at the end. If there is a big enough community that has grown around the project, that drop might be okay.
tempest_
I share your concern, but I have saved so much time with uv already that I figure I'll ride it till the VC enshittification kills the host.
Hopefully by that point the community is centralized enough to move in one direction.
NegativeLatency
I've been heartened by the progress that opentofu has made, so I think if it gets enough momentum it could survive the inevitable money grab
anitil
I agree, now I just use uv and forget about it. It does use up a fair bit of disk, but disk is cheap and the bootstrapping time reduction makes working with python a pleasure again
alisonatwork
I recently did the same at work, just converted all our pip stuff to use uv pip but otherwise no changes to the venv/requirements.txt workflow and everything just got much faster - it's a no-brainer.
But the increased resource usage is real. Now around 10% of our builds get OOM killed because the build container isn't provisioned big enough to handle uv's excessive memory usage. I've considered reducing the number of available threads to try throttle the non-deterministic allocation behavior, but that would presumably make it slower too, so instead we just click the re-run job button. Even with that manual intervention 10% of the time, it is so much faster than pip it's worth it.
nemosaltat
Couldn't agree more, and `uv run` on a script that contains a shebang, inline dependency metadata, and then plain Python is just magical.
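(For anyone who hasn't seen it, the pattern being described is PEP 723 inline script metadata plus a uv shebang; a minimal example, with `requests` as a stand-in dependency:)

```python
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
# With this header, `uv run` (or executing the file directly) resolves the
# declared dependencies into a managed environment and runs the script.
import requests

print(requests.get("https://pypi.org/simple/").status_code)
```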
JonChesterfield
I've been dealing with python vs debian for the last three hours and am deeply angry with the ecosystem. Solved it is not.
Debian decided you should use venv for everything. But when packages are installed in a venv, random cmake nonsense does not find them. There are apt-get-level packages; some things find those, others do not. Names are not consistent. There's a thing called pipx, which my console recommended, for much the same experience. Also the vestiges of 2 vs 3 are still kicking around, in the form of refusing to find a package based on the number being present or absent.
Whatever c++headerparser might be, I'm left very sure that hacking python out of the build tree and leaving it on the trash heap of history is the proper thing to do.
nemomarx
from what I hear uv is the "solved" and venv by hand is the old way
pama
These tools together solve a fraction of the problem. The other parts of the problem are interfacing with classic C and C++ libraries and handling different hardware and different OSes. It is not even funny how tricky it is to use the same GPU/CUDA versions but with different CPU architectures, and hopefully most people don't need to be exposed to it. Sometimes parts of the stack depend on a different version of a C++ library than other parts of the stack. Or some require different kernel modules or CUDA driver settings. But I would be happy if there was a standardized way to at least link to the same C++ libraries, hopefully with the same ABI, across different clusters or different OS versions. Python is so far from solved…
ghshephard
uv is venv + insanely fast pip. I’ve used it every day for 5+ months and I still stare in amazement every time I use it. It’s probably the most joy I’ve ever gotten out of technology.
frollogaston
pip is the default still
Imustaskforhelp
uv truly is great. I mean, it's open source and we can always fork it, just as valkey forked redis.
And if you mean that pyx might be built on uv, well, I think the discussion can go towards pyx being made open source. But honestly, I'm pretty sure someone will look at pyx and create a pyx-API-compliant hosted server. I'm still curious how pyx works and what it actually, truly does.
mbonnet
If Python packaging problems are solved, why is Python known for having the worst tooling ecosystem of any "modern" language?
m_kos
> Why is it so hard to install PyTorch, or CUDA, or libraries like FlashAttention or DeepSpeed that build against PyTorch and CUDA?
This is so true! On Windows (and WSL) it is also exacerbated by some packages requiring the use of compilers bundled with outdated Visual Studio versions, some of which are only available by manually crafting download paths. I can't wait for a better dev experience.
giancarlostoro
Stuff like that led me fully away from Ruby (due to Rails), which is a shame, I see videos of people chugging along with Ruby and loving it, and it looks like a fun language, but when the only way I can get a dev environment setup for Rails is using DigitalOcean droplets, I've lost all interest. It would always fail at compiling something for Rails. I would have loved to partake in the Rails hype back in 2012, but over the years the install / setup process was always a nightmare.
I went with Python because I never had this issue. Now with any AI / CUDA stuff its a bit of a nightmare to the point where you use someone's setup shell script instead of trying to use pip at all.
awesome_dude
Let's be honest here: whilst some experiences are better/worse than others, there doesn't seem to be a dependency management system that isn't (at least half) broken.
I use Go a lot, the journey has been
- No dependency management
- Glide
- Depmod
- I forget the name of the precursor - I just remembered, VGo
- Modules
We still have proxying, vendoring, versioning problems
Python: VirtualEnv
Rust: Cargo
Java: Maven and Gradle
Ruby: Gems
Even OS dependency management is painful - yum, apt (which was a major positive when I switched to Debian based systems), pkg (BSD people), homebrew (semi-official?)
Dependency management in the wild is a major headache. Go (I only mention it because I am most familiar with it) did away with some compilation dependency issues by shipping binaries with no dependencies (meaning that it didn't matter which version of Linux you built your binary for, it will run on any Linux of the same arch; none of that "wrong libc" 'fun'), but you still have issues when two different people building the same binary need extra dependency management (vendoring brings with it caching problems: is the version in the cache up to date, will updating one version of one dependency break everything, what fun).
giancarlostoro
NuGet for C# has always been fantastic, and I like Cargo, though sometimes waiting for freaking ever for things to build does kill me on the inside a little bit. I do wish Go had a better package manager trajectory, I can only hope they continue to work on it, there were a few years I refused to work on any Go projects because setup was a nightmare.
orhmeh09
I think CRAN for R is very good, partly aided by an aggressive pruning policy for broken packages.
ilvez
Do I get it right that this issue is on Windows? I've never heard of the issues you describe while working with Linux. I've seen people struggle with macOS a bit due to brew having different versions of some library or other, mostly when self-compiling Ruby.
threeducks
There certainly are issues on Linux as well. The Detectron2 library alone has several hundred issues related to incorrect versions of something: https://github.com/facebookresearch/detectron2/issues
The mmdetection library (https://github.com/open-mmlab/mmdetection/issues) also has hundreds of version-related issues. Admittedly, that library has not seen any updates for over a year now, but it is sad that things just break and become basically unusable on modern Linux operating systems because NVIDIA can't stop breaking backwards and forwards compatibility for what is essentially just fancy matrix multiplication.
giancarlostoro
I had issues on Mac, Windows and Linux... It was obnoxious. It led me to adopt a very simple rule: if I cannot get your framework / programming language up and running in under 10 minutes (barring compilation time / download speeds) I am not going to use your tools / language. I shouldn't be struggling with the most basic of hello worlds with your language / framework. I don't in like 100% of the other languages I already use, why should I struggle to use a new language?
jcelerier
On Linux, good luck if you're using anything besides the officially nvidia-supported Ubuntu version. Just 24.04 instead of 22.04 brings regular random breakages and issues, and running on Arch Linux is just endless pain.
selimnairb
Have you tried conda? Since the integration of mamba its solver is fast and the breadth of packages is impressive. Also, if you have to support Windows and Python with native extensions, conda is a godsend.
orhmeh09
It is not fast. Mamba and micromamba are still much faster than conda, and yet they lack basic features that conda provides. Everyone has been dropping conda like a hot potato since the licensing changes in 2024.
viraptor
I would recommend learning a little bit of C compilation and build systems. Ruby/Rails is about as polished as you could get for a very popular project. Maybe libyaml will be a problem once in a while if you're compiling Ruby from scratch, but otherwise this normally works without a hassle. And those skills will apply everywhere else. As long as we have C libraries, this is about as good as it gets, regardless of the language/runtime.
nurettin
Have you tried JRuby? It might be a bit too large for your droplet, but it has the java versions of most gems and you can produce cross-platform jars using warbler.
nickserv
The speed of Ruby with the memory management of Java, what's not to love?
Also, now you have two problems.
chao-
I'm surprised to hear that. Ruby was the first language in my life/career where I felt good about the dependency management and packaging solution. Even when I was a novice, I don't remember running into any problems that weren't obviously my fault (for example, installing the Ruby library for PostgreSQL before I had installed the Postgres libraries on the OS).
Meanwhile, I didn't feel like Python had reached the bare minimum for package management until Pipenv came on the scene. It wasn't until Poetry (in 2019? 2020?) that I felt like the ecosystem had reached what Ruby had back in 2010 or 2011 when bundler had become mostly stable.
thedanbob
Bundler has always been the best package manager of any language that I've used, but dealing with gem extensions can still be a pain. I've had lots of fun bugs where an extension worked in dev but not prod because of differences in library versions. I ended up creating a docker image for development that matched our production environment and that pretty much solved those problems.
poly2it
Have you tried Nix?
giancarlostoro
I'm on Arch these days. Nix might have helped, but this was the 2010s, when as far as I remember nobody was talking about NixOS.
c0balt
Given they mentioned Windows (and not WSL) that might not be a viable option. AFAIK, Windows is not natively supported by nixpkgs.
bytehumi
This is the right direction for Python packaging, especially for GPU-heavy workflows. Two concrete things I'm excited about: 1) curated, compatibility-tested indices per accelerator (CUDA/ROCm/CPU) so teams stop bikeshedding over torch/cu* matrixes, and 2) making metadata queryable so clients can resolve up front and install in parallel. If pyx can reduce the 'pip trial-and-error' loop for ML by shipping narrower, hardware-targeted artifacts (e.g., SM/arch-specific builds) and predictable hashes, that alone saves hours per environment. Also +1 to keeping tools OSS and monetizing the hosted service—clear separation builds trust. Curious: will pyx expose dependency graph and reverse-dependency endpoints (e.g., "what breaks if X→Y?") and SBOM/signing attestation for supply-chain checks?
int_19h
Given that WSL is pretty much just Linux, I don't see what relevance Visual Studio compiler versions have to it. WSL binaries are always built using Linux toolchains.
At the same time, even on Windows, libc has been stable since Win10 - that's 10 years now. Which is to say, any binary compiled by VC++ 2015 or later is C-ABI-compatible with any other such binary. The only reasons why someone might need a specific compiler version is if they are relying on some language features not supported by older ones, or because they're trying to pass C++ types across the ABI boundary, which is a fairly rare case.
m_kos
If you have to use, e.g., CUDA Toolkit 11.8, then you need a specific version of VS and its build tools for CUDA's VS integration to work. I don't know why exactly that is and I wish I didn't have to deal with it.
morkalork
This was basically the reason to use anaconda back in the day.
setopt
In my experience, Anaconda (including Miniconda, Micromamba, IntelPython, et al.) is still the default choice in scientific computing and machine learning.
NeutralForest
It's useful because it also packages a lot of other deps like CUDA drivers, DB drivers, git, openssl, etc. When you don't have admin rights, it's really handy to be able to install them and there's no other equivalent in the Python world. That being said, the fact conda (and derivatives) do not follow any of the PEPs about package management is driving me insane. The ergonomics are bad as well with defaults like auto activation of the base env and bad dependency solver for the longest time (fixed now), weird linking of shared libs, etc.
IHLayman
Anaconda was a good idea until it would break apt on Ubuntu and make my job that much harder. That became the reason _not_ to use Anaconda in my book.
venv made these problems start to disappear, and now uv and Nix have closed the loop for me.
StableAlkyne
How did it manage to do that?
Not saying it didn't, I've just never run into that after a decade of using the thing on various Nixes.
northzen
Why don't you use pixi, which has the best from these worlds?
miraculixx
Windows is the root cause here, not pip
miohtama
In the past, part of the definition of an operating system was that it ships with a compiler.
int_19h
When was that ever a part of the definition? It was part of the early Unix culture, sure, but even many contemporary OSes didn't ship with compilers, which were a separate (and often very expensive!) piece of software.
OTOH today most Linux distros don't install any dev tools by default on a clean install. And, ironically, a clean install of Windows has .NET, which includes a C# compiler.
simonw
This is effectively what Charlie said they were going to build last September when quizzed about their intended business model on Mastodon: https://hachyderm.io/@charliermarsh/113103564055291456
pietroppeter
And this fact effectively builds trust in the vision and in execution.
jsmeaton
Astral folks that are around - there seems to be a bit of confusion in the product page that the blog post makes a little more clear.
> The next step in Python packaging
The headline is the confusing bit I think - "oh no, another tool already?"
IMO you should lean into stating this is going to be a paid product (answering how you plan to make money and become sustainable), and highlight that this will help solve private packaging problems.
I'm excited by this announcement by the way. Setting up scalable private python registries is a huge pain. Looking forward to it!
zanie
Thanks for the feedback!
divbzero
The combination of
– “client (uv) and server (pyx)” and
– “You can use it to host your own internal packages, or as an accelerated, configurable frontend to public sources like PyPI and the PyTorch index.”
is what really helped me understand what pyx aims to be.
IshKebab
I would also put this list of issues that this fixes higher. It makes it more obvious what the point is. (And also a setuptools update literally broke our company CI last week so I was like "omg yes" at that point.)
_verandaguy
Soon: there are 14 competing Python packaging standards.
This is a joke, obviously. We've had more than 14 for years.
woodruffw
Python packaging has a lot of standards, but I would say most of them (especially in the last decade) don't really compete with each other. They lean more towards the "slow accretion of generally considered useful features" style.
This itself is IMO a product of Python having a relatively healthy consensus-driven standardization process for packaging, rather than an authoritative one. If Python had more of an authoritative approach, I don't think the language would have done as well as it has.
(Source: I've written at least 5 PEPs.)
_verandaguy
There are highs and lows to the system, just like with any system. Pip overall was a great package manager like 15 years ago, and a big step up from easy_install for sure (n.b., I started programming around the time easy_install was going out of fashion, so my point of view is coloured by that timing).
That said, it'd be nice if pip (or some PSF-blessed successor) adopted a model more similar to that offered by poetry (and formerly pipenv, and now, I guess, uv), at least for package locking. `pip freeze > requirements.txt` isn't fit for purpose in the age of supply chain attacks, unfortunately, and PyPI already offers a reasonably workable backend for this model, whether the community at large agrees or not. There are objective benefits (the biggest one being better code integrity guarantees) that outweigh the objective (largely performance-related) drawbacks.
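(For what it's worth, the building blocks exist today in hash-checking mode. A hedged sketch of what a hash-pinned lock looks like; the digest below is a placeholder, not a real hash:)

```
# Emitted by e.g. `uv pip compile --generate-hashes` or pip-tools'
# `pip-compile --generate-hashes`; installed with
#   pip install --require-hashes -r requirements.txt
requests==2.32.3 \
    --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000
```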
callc
Do you really think Python's consensus-driven language development is better than an authoritarian one?
I am honestly tired of the Python packaging situation. I breathe a sigh of relief in languages like Go and Rust with an "authoritative" built-in solution.
I wouldn’t mind the 30 different packaging solutions as long as there was authoritative “correct” solution. All the others would then be opt-in enhancements as needed.
I guess a good thought experiment would be: if we were to design a packaging system (or decide not to) for a new PL like Python, what would it look like?
woodruffw
> I breathe a sigh of relief in languages like Go and Rust with an "authoritative" built-in solution.
I don't know about Go, but Rust's packaging isn't authoritative in the sense that I meant. There's no packaging BDFL; improvements to Rust packaging happen through a standards process that closely mirrors that of Python's PEPs.
I think the actual difference between Rust and Python is that Rust made the (IMO correct) decision early on to build a single tool for package management, whereas Python has historically had a single installer and left every other part of package management up to the user. That's a product of the fact that Python is more of a patchwork ecosystem and community than Rust is, plus the fact that it's a lot older and a lot bigger (in terms of breadth of user installation base).
Basically, hindsight is 20/20. Rust rightly benefited from Python's hard lesson about not having one tool, but they also rightly benefited from Python's good experience with consensus-driven standardization.
ctoth
As I said a couple weeks ago, they're gonna have to cash out at some point. The move won't be around uv; it'll be a protected private PyPI or something.
https://news.ycombinator.com/item?id=44712558
Now what do we have here?
snooniverse
Not sure what you're trying to get at here. Charlie Marsh has literally said this himself; see e.g. this post he made last September:
> "An example of what this might look like (we may not do this, but it's helpful to have a concrete example of the strategy) would be something like an enterprise-focused private package registry."
https://hachyderm.io/@charliermarsh/113103605702842937
Astral has been very transparent about their business model.
MoreQARespect
Astral doesn't really have a business model yet, it has potential business models.
The issue is that there isn't a clean business model that will produce the kind of profits that will satisfy their VCs - not that there isn't any business model that will help support a business like theirs.
Private package management would probably work fine if they hadn't taken VC money.
IshKebab
I would have agreed with you until I learned that Conda somehow makes $150m in revenue a year. They have fewer users than Astral too (or if not they will do very soon).
eldenring
Cash out is a bit of a negative word here. They've shown the ability to build categorically better tooling, so I'm sure a lot of companies would be happy to pay them to fix even more of their problems.
klysm
It’s not negative, it’s accurate. The playbook is well known and users should be informed.
kinow
I haven't adopted uv yet; I'm watching to see what their next move will be. We recently had to review our use of Anaconda tools due to their changes, then review Qt's license changes. I'm not looking forward to another license ordeal.
zanie
We're hoping that building a commercial service makes it clear that we have a sustainable business model and that our tools (like uv) will remain free and permissively licensed.
(I work at Astral)
simonw
I think having a credible, proven business model is a feature of an open source project - without one there are unanswered questions about ongoing maintenance.
I'm glad to see Astral taking steps towards that.
jsmeaton
I've been wondering where the commercial service would come in and this sounds like just the right product that aligns with what you're already doing and serves a real need. Setting up scalable private registries for python is awful.
__mharrison__
You know what they say: The best time to adopt uv was last year...
In all seriousness, I'm all in on uv. Better than any competition by a mile. Also makes my training and clients much happier.
lenerdenator
Fortunately for a lot of what uv does, one can simply switch to something else like Poetry. Not exactly a zero-code lift but if you use pyproject.toml, there are other tools.
Of course if you are on one of the edge cases of something only uv does, well... that's more of an issue.
int_19h
Given how widely popular uv is, I'm pretty sure that in the event of any impactful license change it would immediately get forked.
m4r71n
What does GPU-aware mean in terms of a registry? Will `uv` inspect my local GPU spec and decide what the best set of packages would be to pull from Pyx?
Since this is a private, paid-for registry aimed at corporate clients, will there be an option to expose those registries externally as a public instance, but paid for by the company? That is, can I as a vendor pay for a Pyx registry for my own set of packages, and then provide that registry as an entrypoint for my customers?
charliermarsh
> Will `uv` inspect my local GPU spec and decide what the best set of packages would be to pull from Pyx?
We actually support this basic idea today, even without pyx. You can run (e.g.) `uv pip install --torch-backend=auto torch` to automatically install a version of PyTorch based on your machine's GPU from the PyTorch index.
pyx takes that idea and pushes it further. Instead of "just" supporting PyTorch, the registry has a curated index for each supported hardware accelerator, and we populate that index with pre-built artifacts across a wide range of packages, versions, Python versions, PyTorch versions, etc., all with consistent and coherent metadata.
So there are two parts to it: (1) when you point to pyx, it becomes much easier to get the right, pre-built, mutually compatible versions of these things (and faster to install them); and (2) the uv client can point you to the "right" pyx index automatically (that part works regardless of whether you're using pyx, it's just more limited).
> Since this is a private, paid-for registry aimed at corporate clients, will there be an option to expose those registries externally as a public instance, but paid for by the company? That is, can I as a vendor pay for a Pyx registry for my own set of packages, and then provide that registry as an entrypoint for my customers?
We don't support this yet but it's come up a few times with users. If you're interested in it concretely feel free to email me (charlie@).
cgravill
Is there an intention to bring the auto backend selection to the non-pip interface? I know we can configure this like you show https://docs.astral.sh/uv/guides/integration/pytorch/ but we have folks on different accelerators on Linux and remembering ‘uv sync --extra cu128’ at the right time is fragile so we just make cpu folks have the CUDA overhead too currently.
(As always, big fans of Astral’s tools. We should get on with trying pyx more seriously)
scarlehoff
Hi Charlie, what happens in a situation in which I might have access to a login node, from which I can install packages, but the computing nodes don't have internet access? Can I define in some hardware.toml the target system and install there even if my local system is different?
To be more specific, I'd like to do `uv --dump-system hardware.toml` in the computing node and then in the login node (or my laptop for that matter) just do `uv install my-package --target-system hardware.toml` and get an environment I can just copy over.
zanie
Yes, we let you override our detection of your hardware. Though we haven't implemented dumping detected information on one platform for use on another, it's definitely feasible, e.g., we're exploring a static metadata format as a part of the wheel variant proposal https://github.com/wheelnext/pep_xxx_wheel_variants/issues/4...
zvr
I love curated, consistent, and coherent metadata.
Is the plan to also provide accurate (curated) metadata for security and compliance purposes?
twarge
The real pyx is an absolutely wonderful graphing package. It's like TeX in that everything looks wonderful and publication-quality.
almostgotcaught
there's something about these comments ("name-collision") that drives me up the wall. do y'all realize multiple things can have the same name? for example, did you know there are many people with exactly the same names:
https://www.buzzfeed.com/kristenharris1/famous-people-same-n...
and yet no one bemoans this (hospitals don't consult name registries before filling out birth certificates). that's because it's almost always extremely clear from context.
> The real pyx
what makes that pyx any more "real" than this pyx? it's the extension of the language py plus a single letter. there are probably a thousand projects that could rightfully use that combination of letters as a name.
iepathos
Human naming has nothing to do with software naming which seems obvious but apparently not. Python package creators should check the pypi registry for names and generally avoid name collisions where reasonable. Common sense applies for reduced confusion for users globally and also for potential legal issues if any party trademarks their software name. What makes one pyx more real than the other is one was first and took the spot on pypi. Simple as that. https://pypi.org/project/PyX/
almostgotcaught
> https://pypi.org/project/PyX/
the last release is Oct 16, 2022. are we doing this like jerseys - the name is now retired because pyx won all the championships?
twarge
it's the pyx you get with `pip install pyx`?
Myrmornis
Agreed. I'm the author of a fairly popular dev environment project. Every so often you get people turning up enraged because I chose a name that some other project once used. In the situation I'm talking about it makes even less sense than pip -- it's a command-line executable. There's no one repository (although doesn't seem like Debian people would agree with that!). There's a multitude of package managers on different platforms. Furthermore, in case people hadn't noticed, there are these things called languages, countries, and cultures. There is no reason in general why there might not be package managers whose use is culturally or geographically non-uniform and perhaps entirely unfamiliar to those in other countries. So, what's the plan for checking whether a name is "taken" across human culture, history, and computing platforms? Silly out-dated notion.
rob
Is there a big enough commercial market for private Python package registries to support an entire company and its staff? Looks like they're hiring for $250k engineers, starting a $26k/year OSS fund, etc. Expenses seem a bit high if this is their first project unless they plan on being acquired?
eldenring
It's interesting because the value is definitely there. Every single Python developer you meet (many of whom are highly paid) has a story about wasting a bunch of time on these things. The question is how much of this value Astral can capture.
I think based on the quality of their work, there's also an important component which is trust. I'd trust and pay for a product from them much more readily than an open source solution with flaky maintainers.
est31
Yeah, they certainly generate a lot of value by providing excellent productivity tooling. The question is how they capture some of that value, which is notoriously hard with an OSS license. A non-OSS license creates the Adobe trap on the other hand, where companies deploy more and more aggressive monetization strategies, making life worse and worse for users of the software.
zvr
There definitely is a large market for it. Especially if they provide accurate (curated) metadata for security and compliance purposes.
graynk
I would love to have something that replaces JFrog's Artifactory or Sonatype Nexus. Having _just_ a python registry IMO limits the enterprise audience by quite a bit (even if it is much better for Python-specific issues)
int_19h
Continuum has been doing something very similar with Anaconda, and they've been around for over a decade now.
hobofan
From what I can tell they (had to?) ramp up their aggressiveness regarding getting paid though.
They had contacted a company I had worked for asking them to purchase a license, because apparently somewhere in the company some dev workflow had been contacting the conda servers regularly. We never ended up tracking it down, as it stopped some weeks before they contacted us, according to our network logs.
In general, you don't resort to such sales tactics unless there is good cause (significant usage, where you as unpaid vendor also have leverage) or you just end up burning potential future customers (as was the case here, as they chose a bad contact point inside the company).
physicsguy
And Enthought before them...
dimatura
Just one data point, but if it's as nice to use as their open source tools and not outrageously expensive, I'd be a customer. Current offerings for private Python package registries are kind of meh. Always wondered why GitHub doesn't offer this.
thrown-0825
Ask Docker how that worked out.
dakiol
I'm brushing up on Python for a new job, and boy, what a ride. Not because of the language itself but because of the tooling around packages. I'm coming from Go and TS/JS, and while those two ecosystems have their own pros and cons, at least they're more or less straightforward to get onboarded with (there are one or two tools you need to know about). In Python there are dozens of tools/concepts related to packaging: pip, easy_install, setuptools, setup.py, pypi, poetry, uv, venv, virtualenv, pipenv, wheels, ... There's even an entire website dedicated to this topic: https://packaging.python.org
I don't understand how a private company like Astral is leading here. Why is it so hard for the Python community to come up with a single tool to rule them all? (I know, https://xkcd.com/927/.) You could even copy what Go or Node are doing and make it Python-aware; no shame in that. Instead we get these who-knows-how-long-they'll-last tools every now and then.
They should remove "There should be one-- and preferably only one --obvious way to do it." from the Zen of Python.
lenerdenator
> In Python there are dozens of tools/concepts related to packaging: pip, easy_install, setuptools, setup.py, pypi, poetry, uv, venv, virtualenv, pipenv, wheels,
Some of those are package tools, some are dependency managers, some are runtime environments, some are package formats...
Some are obsolete at this point, and others by necessity cover different portions of programming language technologies.
I guess what I'm saying is: for the average software engineer, Python doesn't actually offer many more choices of programming facilities than JavaScript does.
turnsout
You're right, it's not like there are actually 14 competing standards, but there are still too many, and that goes for JavaScript as well.
zahlman
I don't know why you were downvoted. You are absolutely correct.
beshrkayali
It's not an easy task, and when there are already lots of established practices, habits, and opinions, it becomes even more difficult to get around the various pain points. There have been many attempts: pip (the standard) is slow, only gained a real dependency resolver late in its life, and struggles with reproducible builds. Conda is heavy, slow to solve environments, and mixes Python with non-Python dependencies, which makes some setups very complicated to understand. Poetry improves dependency management but is sluggish and adds unnecessary complexity for simple scripts/projects. Pipenv simplifies things but suffers from the same slow resolution and inconsistent lock files. Those are the ones I've used over the years, at least.
uv addressed these flaws with speed, solid dependency resolution, and a simple interface that builds on what people are already used to. It unifies virtual environment and package management, supports reproducible builds, and integrates easily with modern workflows.
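To make that concrete, here's roughly what the day-to-day uv workflow looks like (the project and package names are just illustrative):

```
uv init example-app     # scaffold a project with a pyproject.toml
cd example-app
uv add requests         # add a dependency; resolves it and updates uv.lock
uv run python main.py   # run inside the managed virtual environment
uv sync                 # recreate the exact environment from the lock file
```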
IshKebab
I actually think the main problem is that they aren't even willing to admit that there is a problem. It's the classic "our program is great; if users have issues it's because they are using it wrong / didn't read the manual / are stupid".
Go and look up why you can't run scripts with `python3 foo.py` on Windows. It's like a one-line fix and they've come up with all sorts of naysaying reasons rather than just adding python3.exe (which Microsoft did years ago in their Python packages).
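For anyone who hasn't hit it, here's roughly what you see on a Windows box (a sketch; exact behavior depends on how Python was installed):

```
py foo.py        # the "py" launcher that ships with python.org installers works
python foo.py    # works if the installer added python.exe to PATH
python3 foo.py   # fails: the python.org installer ships no python3.exe
                 # (the Microsoft Store package, by contrast, includes one)
```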
notatallshaw
> easy_install
I don't know what guides you're reading, but I haven't touched easy_install in at least a decade. Its successor, pip, had effectively replaced it for all use cases by around 2010.
Thrymr
> I don't know what guides you're reading but I haven't touched easy_install in at least a decade.
It is mentioned in the "Explanations and Discussions" section [0] of the linked Python Packaging guide.
Old indeed, but it can still be found at the top level of the current docs.
[0] https://packaging.python.org/en/latest/#explanations-and-dis...
biorach
Yes, it is mentioned there, as being deprecated:
> easy_install, now deprecated, was released in 2004 as part of Setuptools.
mixmastamyk
You don't need to know most of those things. For twenty years, up until last year, I used setup.py and pip exclusively, with a venv for each job at work. Wheels are simply prebuilt .zip archives. That's about an hour of learning, more or less.
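You can see that for yourself (the package name is just an example; needs a network connection):

```
# Download a prebuilt wheel without installing it, then list its contents.
pip download --only-binary=:all: --no-deps requests
unzip -l requests-*.whl   # just a zip: the package files plus *.dist-info/ metadata
```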
Now we have pyproject.toml and uv to learn. This is another hour or so of learning, but well worth it.
Astral is stepping up because no one else did. Guido never cared about packaging and that's why it has been the wild west until now.
thrown-0825
Python gained popularity in academic circles because it was easy, not because it was good.
It's a pain in the ass to work with professionally.
aidenn0
I don't know; I was looking at TS tutorials the other day and there seemed to be at least half a dozen "bundlers", with different tutorials suggesting different ones to use. It took me a while to figure out I could just invoke "tsc" directly to generate JavaScript from TypeScript.
IshKebab
Yeah, TypeScript is maybe not the best example. Go, Rust, and Zig get this right, though. And Deno, which is based on TypeScript.
cyphar
Go only "got this right" after a decade of kicking and screaming about how it wasn't necessary. Dealing with the vendor and go.mod transitions was incredibly painful. There wasn't even a mechanism to keep a package from being importable by everyone in the world until Go 1.4!
I'm still not convinced that the built-in vendor support added in Go 1.5 wasn't intentionally made incompatible with the community-developed solutions out of some weird kind of spite. Why didn't they just use "./vendor/src" like all of the existing tools were doing? (Remember that Go was written by Plan 9 folks, so making a "src" symlink didn't work. In fact, the Go compiler dislikes symlinks in most places; pre-modules, the unholy things you had to do with GOPATH were awful.)
procaryote
Unless the thing you're building is a python library you want other people to install, you don't need to understand all that much of it.
You just need to understand how to set up a venv and how to install each dependency into it. Put those bits in a shell script, and you can clone the project, run the script, and have a working env.
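For example, a minimal version of such a script (the file names are illustrative):

```
#!/usr/bin/env sh
# setup.sh: create a venv and install the project's dependencies into it.
set -e
python3 -m venv .venv
. .venv/bin/activate
pip install -r requirements.txt   # pin exact versions here to limit breakage
```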
It will break sometimes, because Python and a lot of Python packages have zero respect for backward compatibility, but that's Python for you.
It's ugly and horrible, but you don't have to relearn the latest Python packaging "standard" every other year.
I've been burned too many times by embracing open source products like this.
We've been fed promises like these before. They will inevitably get acquired. Years of documentation, issues, and pull requests will be deleted with little to no notice. An exclusively commercial replacement from the new company will materialize, inexplicably missing the features you relied on in the first place.