
Uv's killer feature is making ad-hoc environments easy

nharada

I really like uv, and it's the first package manager for a while where I haven't felt like it's a minor improvement on what I'm using but ultimately something better will come out a year or two later. I'd love if we standardized on it as a community as the de facto default, especially for new folks coming in. I personally now recommend it to nearly everyone, instead of the "welllll I use poetry but pyenv works or you could use conda too"

poincaredisk

I never used anything other than pip. I never felt the need to use anything other than pip (with virtualenv). Am I missing anything?

NeutralCrane

Couple of things.

- pip doesn't handle your Python executable, just your Python dependencies. So if you want/need to swap between Python versions (3.11 to 3.12 for example), it doesn't give you anything. Generally people use an additional tool such as pyenv to manage this. Tools like uv and Poetry do this as well as handling dependencies

- pip doesn't lock dependencies of dependencies. pip will only respect version pinning for dependencies you explicitly specify. So for example, say I am using pandas and I pin it to version X. If a dependency of pandas (say, numpy) isn't pinned as well, the underlying version of numpy can still change when I reinstall dependencies. I've had many issues where my environment stopped working despite none of my specified dependencies changing, because underlying dependencies introduced breaking changes. To get around this with pip you would need an additional tool like pip-tools, which allows you to pin all dependencies, explicit and nested, to a lock file for true reproducibility. uv and poetry do this out of the box.

- Tool usage. Say there is a python package you want to use across many environments without installing in the environments themselves (such as a linting tool like ruff). With pip, you need to install another tool like pipx to install something that can be used across environments. uv can do this out of the box.

Plus there is a whole host of jobs that tools like uv and poetry aim to assist with that pip doesn't, namely project creation and management. You can use uv to create new Python project scaffolding for applications or Python modules, in a way that conforms with PEP standards, with a single command. It also supports workspaces of multiple projects that have separate functionality but require dependencies to be in sync.

You can accomplish a lot/all of this using pip with additional tooling, but it's a lot more work, and not all use cases will require these; a sketch of the uv equivalents follows below.
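
Roughly, and treating the project and package names as made-up examples rather than anything from the points above:

  $ uv python install 3.12     # fetch a managed Python; no pyenv needed
  $ uv init myapp && cd myapp  # scaffold a pyproject.toml-based project
  $ uv add pandas              # add a dependency and update the uv.lock lockfile
  $ uv sync                    # recreate the exact locked environment
  $ uv tool install ruff       # install a CLI tool outside any project venv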

driggs

Yes, generally people already use an additional tool for managing their Python executables, like their operating system's package manager:

  $> sudo apt-get install python3.10 python3.11 python3.12
And then it's simple to create and use version-specific virtual environments:

  $> python3.11 -m venv .venv3.11
  $> source .venv3.11/bin/activate
  $> pip install -r requirements.txt
You are incorrect about needing an additional tool to install a "global" tool like `ruff`; `pip` does this by default when you're not using a virtual environment. In fact, this behavior is made more difficult by tools like `uv` or `pipx` if they're trying to manage Python executables as well as dependencies.

kiddico

Sometimes I feel like my up vote doesn't adequately express my gratitude.

I appreciate how thorough this was.

stavros

Oh wow, it actually can handle the Python executable? I didn't know that, that's great! Although it's in the article as well, it didn't click until you said it, thanks!

PaulHoule

pip's resolving algorithm is not sound. If your Python projects are really simple it seems to work but as your projects get more complex the failure rate creeps up over time. You might `pip install` something and have it fail and then go back to zero and restart and have it work but at some point that will fail. conda has a correct resolving algorithm but the packages are out of date and add about as many quality problems as they fix.

I worked at a place where the engineering manager was absolutely exasperated with the problems we were having with building and deploying AI/ML software in Python. I had figured out pretty much all the problems after about nine months and had developed a 'wheelhouse' procedure for building our system reliably, but it was too late.

Not long after I sketched out a system that was a lot like uv but it was written in Python and thus had problems with maintaining its own stable Python environment (e.g. poetry seems to trash itself every six months or so.)

Writing uv in Rust was genius because it eliminates that problem: the system has a stable surface to stand on instead of pipping itself into oblivion, never mind that it is much faster than my system would have been. (My system had the extra feature that it used http range requests to extract the metadata from wheel files before pypi started letting you download the metadata directly.)

I didn't go forward with developing it because I argued with a lot of people who, like you, thought it was "the perfect being the enemy of the good" when it was really "the incorrect being the enemy of the correct." I'd worked on plenty of projects where I was right about the technology and wrong about the politics and I am so happy that uv has saved the Python community from itself.

MadnessASAP

May I introduce you to our lord and saviour, Nix and its most holy child nixpkgs! With only a small tithing of your sanity and your ability to interop with any other dependency management, you can free yourself of all dependency woes forever!*

* For various broad** definitions of forever.

** Like, really, really broad***

*** Maybe a week if you're lucky

morkalork

Ugh, I hate writing this but that's where docker and microservices come to the rescue. It's a pain in the butt and inefficient to run, but if you don't care about the overhead (and if you do care, why are you still using Python?), it works.

zahlman

> You might `pip install` something and have it fail and then go back to zero and restart and have it work but at some point that will fail.

Can you give a concrete example, starting from a fresh venv, that causes a failure that shouldn't happen?

> but it was written in Python and thus had problems with maintaining its own stable Python environment

All it has to do is create an environment for itself upon installation which is compatible with its own code, and be written with the capability of installing into other environments (which basically just requires knowing what version of Python it uses and the appropriate paths - the platform and ABI can be assumed to match the tool, because it's running on the same machine).

This is fundamentally what uv is doing, implicitly, by not needing a Python environment to run.

But it's also what the tool I'm developing, Paper, is going to do explicitly.

What's more, you can simulate it just fine with Pip. Of course, that doesn't solve the issues you had with Pip, but it demonstrates that "maintaining its own stable Python environment" is just not a problem.

>Writing uv in Rust was genius because it eliminates that problem: the system has a stable surface to stand on instead of pipping itself into oblivion, never mind that it is much faster than my system would have been.

From what I can tell, the speed difference mainly comes from algorithmic issues, caching, etc. Pip is just slow, above and beyond anything Python forces on it.

An example. On my system, creating a new venv from scratch with Pip included (which loads Pip from within its own vendored wheel, which then runs in order to bootstrap itself into the venv) takes just over 3 seconds. Making a new venv without Pip, then asking a separate copy of Pip to install an already downloaded Pip wheel would be about 1.7 seconds. But making that venv and using the actual internal installation logic of Pip (which has been extracted by Pip developer Pradyun Gedam as https://github.com/pypa/installer ) would take about 0.25 seconds. (There's no command-line API for this; in my test environment I just put the `installer` code side by side with a driver script, which is copied from my development work on Paper.) It presumably could be faster still.

I honestly have no idea what Pip is doing the rest of that time. It only needs to unzip an archive and move some files around and perform trivial edits to others.

> (My system had the extra feature that it used http range requests to extract the metadata from wheel files before pypi started letting you download the metadata directly.)

Pip has had this feature for a long time (and it's still there - I think to support legacy projects without wheels, because I think the JSON API won't be able to provide the data since PyPI doesn't build the source packages). It's why the PyPI server supports range requests in the first place.

> I'd worked on plenty of projects where I was right about the technology and wrong about the politics and I am so happy that uv has saved the Python community from itself.

The community's politics are indeed awful. But Rust (or any other language outside of Python) is not needed to solve the problem.

ppierald

Respectfully, yes. The ability to create venvs so fast that it becomes a silent operation the end user never thinks about anymore. The dependency management and installation is lightning quick. It deals with all of the Python versioning,

and I think a killer feature is the ability to inline dependencies in your Python source code, then use: uv run <scriptname>

Your script would look like:

  #!/usr/bin/env -S uv run --script
  # /// script
  # requires-python = ">=3.12"
  # dependencies = [
  #     "...",
  #     "..."
  # ]
  # ///

Then uv will make a new venv, install the dependencies, and execute the script faster than you think. The first run is a bit slower due to downloads and etc, but the second and subsequent runs are a bunch of internal symlink shuffling.
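
To make that concrete, usage looks something like this (the script name is a made-up example; the `--script` flag just forces script mode without relying on the shebang):

  $ chmod +x fetch_data.py          # script with the header above
  $ ./fetch_data.py                 # uv builds a cached venv, then runs it
  $ uv run --script fetch_data.py   # same thing without the shebang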

It is really interesting. You should at least take a look at a YT or something. I think you will be impressed.

Good luck!

zahlman

>Respectfully, yes. The ability to create venvs so fast that it becomes a silent operation the end user never thinks about anymore.

I might just blow your mind here:

  $ time python -m venv with-pip

  real 0m3.248s
  user 0m3.016s
  sys 0m0.219s
  $ time python -m venv --without-pip without-pip

  real 0m0.054s
  user 0m0.046s
  sys 0m0.009s
The thing that actually takes time is installing Pip into the venv. I already have local demonstrations that this installation can be an order of magnitude faster in native Python. But it's also completely unnecessary to do that:

  $ source without-pip/bin/activate
  (without-pip) $ ~/.local/bin/pip --python `which python` install package-installation-test
  Collecting package-installation-test
    Using cached package_installation_test-1.0.0-py3-none-any.whl.metadata (3.1 kB)
  Using cached package_installation_test-1.0.0-py3-none-any.whl (3.1 kB)
  Installing collected packages: package-installation-test
  Successfully installed package-installation-test-1.0.0
I have wrappers for this, of course (and I'm explicitly showing the path to a separate Pip that's already on my path for demonstration purposes).

> a killer feature is the ability to inline dependencies in your Python source code, then use: uv run <scriptname>

Yes, Uv implements PEP 723 "Inline Script Metadata" (https://peps.python.org/pep-0723/) - originally the idea of Paul Moore from the Pip dev team, whose competing PEP 722 lost out (see https://discuss.python.org/t/_/29905). He's been talking about a feature like this for quite a while, although I can't easily find the older discussion. He seems to consider it out of scope for Pip, but it's also available in Pipx as of version 1.4.2 (https://pipx.pypa.io/stable/CHANGELOG/).

> The first run is a bit slower due to downloads and etc, but the second and subsequent runs are a bunch of internal symlink shuffling.

Part of why Pip is slow at this is because it insists on checking PyPI for newer versions even if it has something cached, and because its internal cache is designed to simulate an Internet connection and go through all the usual metadata parsing etc. instead of just storing the wheels directly. But it's also just slow at actually installing packages when it already has the wheel.

In principle, nothing prevents a Python program from doing caching sensibly and from shuffling symlinks around.

amluto

If you switch to uv, you’ll have fewer excuses to take coffee breaks while waiting for pip to do its thing. :)

mplewis

Pip only has requirements.txt and doesn't have lockfiles, so you can't guarantee that the bugs you're seeing on your system are the same as the bugs on your production system.

aidos

I’ve always worked around that by having a requirements.base.txt and a requirements.txt for the locked versions. Obviously pip doesn’t do that for you but it’s not hard to manage yourself.
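
A minimal sketch of that workflow, using the file names described above:

  $ pip install -r requirements.base.txt   # loose, hand-maintained requirements
  $ pip freeze > requirements.txt          # exact pins, including transitive deps
  $ pip install -r requirements.txt        # reproduce the locked env elsewhere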

Having said that, I’m going to give uv a shot because I hear so many good things about it.

zo1

The requirements.txt file is the lockfile. Anyways, this whole obsession with locked deps or "lockfiles" is such an anti-pattern, I have no idea why we went there as an industry. Probably as a result of some of the newer stuff that is classified as "hipster-tech" such as docker and javascript.

polski-g

"pip freeze" generates a lockfile.

rat87

Pip is sort of broken here because it encourages confusion between requirements and lock files. In other languages, package managers generally let you specify your requirements with ranges and get a lock file with exact versions of those and any transitive dependencies, letting you easily recreate a known working environment. The only way to do that in pip is to make a *new* venv, install, then pip freeze. I think the pip-tools package is supposed to help, but it's a separate tool. Also, putting stuff in pyproject.toml feels more solid than requirements files: it allows options to be set on requirements (like installing one package only from your company's private Python package index mirror while installing the others from the global index), and it allows dev dependencies and other optional dependency groups without multiple requirements files and having to update locks on each of them.

It also automatically creates venvs if you delete them. And it automatically updates packages when you run something with uv run file.py (useful when somebody may have updated the requirements in git). It also lets you install self-contained Python tools (installed in a virtualenv and linked to ~/.local/bin, which is added to your path), replacing pipx. It installs self-contained Python builds, letting you more easily pick a Python version and specify it in a .python-version file for your project (replacing pyenv, and usually much nicer because pyenv compiles them locally).

uv also makes it easier to explore, say starting an IPython shell with 2 extra libraries: `uv run --with ipython --with colorful --with https ipython`

It caches downloads. Of course the HTTP itself isn't faster, but they're exploring things to speed that part up, and since it's written in Rust, local stuff (like deleting and recreating a venv with cached packages) tends to be blazing fast.
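
Putting a couple of those pieces together (a rough sketch; app.py is a made-up script name):

  $ uv python pin 3.12   # writes .python-version for the project
  $ uv run app.py        # syncs the venv from the lockfile if needed, then runs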

mosselman

I am not a python developer, but sometimes I use python projects. This puts me in a position where I need to get stuff working while knowing almost nothing about how python package management works.

Also I don’t recognise errors and I don’t know which python versions generally work well with what.

I’ve had it happen so often with pip that I’d have something setup just fine. Let’s say some stable diffusion ui. Then some other month I want to experiment with something like airbyte. Can’t get it working at all. Then some days later I think, let’s generate an image. Only to find out that with pip installing all sorts of stuff for airbyte, I’ve messed up my stable diffusion install somehow.

Uv clicked right away for me and I don’t have any of these issues.

Was I using pip and asdf incorrectly before? Probably. Was it worth learning how to do it properly in the previous way? Nope. So uv is really great for me.

hyeonwho4

This is not just a pip problem. I had the problem with anaconda a few years ago where upgrading the built-in editor (Spyder?) pulled versions of packages which broke my ML code, or made dependencies impossible to reconcile. It was a mess, wasting hours of time. Since then I use one pip venv for each project and just never update dependencies.

loxias

My life got a lot easier since I adopted the habit of making a shell script, using buildah and podman, that wrapped every python, rust, or golang project I wanted to dabble with.

It's so simple!

Create an image with the dependencies, then `podman run` it.
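
Something along these lines (a minimal sketch: the image name, mount point, and the assumption of a Containerfile holding the dependencies are all mine):

  #!/bin/sh
  # build the image from the project's Containerfile, then run the project
  # with the current directory mounted as the working directory
  buildah bud -t localhost/myproject .
  exec podman run --rm -it -v "$PWD":/work -w /work localhost/myproject "$@"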

cpburns2009

I'm fairly minimalist when it comes to tooling: venv, pip, and pip-tools. I've started to use uv recently because it resolves packages significantly faster than pip/pip-tools. It will generate a "requirements.txt" with 30 packages in a few seconds rather than a minute or two.
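
For the pip-tools workflow specifically, the swap can be as small as one command (assuming the conventional requirements.in input file):

  $ uv pip compile requirements.in -o requirements.txt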

tehjoker

What is the deal with uv's ownership policy? I heard it might be VC backed. To my mind, that means killing pip and finding some kind of subscription revenue source which makes me uneasy.

The only way to justify VC money is a plot to take over the ecosystem and then profit off of a dominant position. (e.g. the Uber model)

I've heard a little bit about UV's technical achievements, which are impressive, but technical progress isn't the only metric.

feznyng

It’s dual MIT and Apache licensed. Worst case, if there’s a rug pull, fork it.

tehjoker

Is that the entire story? If so the VCs are pretty dumb. If they kill pip, that means the people who were maintaining it disperse and forking it won't restore the ecosystem that was there before.

benatkin

This:

> I haven't felt like it's a minor improvement on what I'm using

means that this:

> I'd love if we standardized on it as a community as the de facto default

…probably shouldn’t happen. The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.

It would be like replacing the python repl with the current version of ipython. I’d say the same thing, that it isn’t a minor improvement. While I almost always use ipython now, I’m glad it’s a separate thing.

lolinder

> The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.

The problem is that in the python ecosystem there really isn't a default de facto standard yet at all. It's supposed to be pip, but enough people dislike pip that it's hard as a newcomer to know if it's actually the standard or not.

The nice thing about putting something like this on a pedestal is that maybe it could actually become a standard, even if the standard should be simple and get out of the way. Better to have a standard that's a bit over the top than no standard at all.

benatkin

It feels even more standard than it used to, with python -m pip and python -m venv making it so it can be used with a virtualenv even if only python or python3 is in your path.

comex

As it happens, the Python REPL was just replaced a few months ago!

…Not with IPython. But with an implementation written in Python instead of C, originating from the PyPy project, that supports fancier features like multi-line editing and syntax highlighting. See PEP 762.

I was apprehensive when I heard about it, but then I had the chance to use it and it was a very nice experience.

benatkin

Ooh, nice. I think recently the times I've used the latest version of python it's been with ipython, so I didn't notice. Going to check it out! It might be easier to make a custom repl now.

greazy

> The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.

This to me is unachievable. Perfection is impossible. On the way there, if the community and developers coalesced around a single tool, then maybe we can start heading down the road to perfection.

benatkin

I mean, stays out of the way for simple uses.

When I first learned Python, typing python and seeing >>> and having it evaluate what I typed as if it appeared in a file was a good experience.

Now that I use python a lot, ipython is more out of the way to me than the built-in python repl is, because it lets me focus on what I'm working on rather than the limitations of a tool.

mrbonner

+1. uv now also supports system installation of Python with the --default --preview flags. This probably allows me to replace mise (rtx) and go uv full time for Python development. With other languages, I go back to mise.
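
If I understand the feature correctly, those flags go on uv's Python install subcommand; the exact invocation below is my assumption:

  $ uv python install 3.12 --default --preview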

tomtom1337

I use mise with uv for automatic activation of venvs when I cd into a directory containing one (alongside a mise.toml). Do you tackle this in some other manner?

mrbonner

Did you mean using mise to install uv? I never thought of this before. Do you have a reference somewhere of how it works? Thx

zahlman

(I will likely base a blog post in my packaging series off this comment later.)

What people seem to miss about Pip is that, by design, it's not a package manager. It's a package installer, only. Of course it doesn't handle the environment setup for you; it's not intended for that. And of course it doesn't keep track of what you've installed, or make lock files, or update your `pyproject.toml`, or...

What it does do is offer a hideously complex set of options for installing everything under the sun, from everywhere under the sun. (And that complexity has led to long-standing, seemingly unfixable issues, and there are a lot of design decisions made that I think are questionable at best.)

Ideas like "welllll I use poetry but pyenv works or you could use conda too" are incoherent. They're for different purposes and different users, with varying bits of overlap. The reason people are unsatisfied is because any given tool might be missing one of the specific things they want, unless it's really all-in-one like Uv seems like it intends to be eventually.

But once you have a truly all-in-one tool, you notice how little of it you're using, and how big it is, and how useless it feels to have to put the tool name at the start of every command, and all the specific little things you don't like about its implementation of whatever individual parts. Not to mention the feeling of "vendor lock-in". Never mind that I didn't pay money for it; I still don't want to feel stuck with, say, your build back-end just because I'm using your lock-file updater.

In short, I don't want a "package manager".

I want a solid foundation (better than Pip) that handles installing (not managing) applications and packages, into either a specified virtual environment or a new one created (not managed, except to make it easy to determine the location, so other tools can manage it) for the purpose. In other words, something that fully covers the needs of users (making it possible to run the code), while providing only the bare minimum on top of that for developers - so that other developer tools can cooperate with that. And then I want specialized tools for all the individual things developers need to do.

The specialized tools I want for my own work all exist, and the tools others want mostly exist too. Twine uploads stuff to PyPI; `build` is a fine build front-end; Setuptools would do everything I need on the back-end (despite my annoyances with it). I don't need a lockfile-driven workflow and don't readily think in those terms. I use pytest from the command line for testing and I don't want a "package manager" nor "workflow tool" to wrap that for me. If I needed any wrapping there I could do a few lines of shell script myself. If anything, the problem with these tools is doing too much, rather than too little.

The base I want doesn't exist yet, so I've started making it myself. Pipx is a big step in the right direction, but it has some arbitrary limitations (I discuss these and some workarounds in my recent blog post https://zahlman.github.io/posts/2025/01/07/python-packaging-... ) and it's built on Pip so it inherits those faults and is that much bigger. Uv is even bigger still for the compiled binary, and I would only be using the installation parts.

imtringued

Virtualenv should never have existed in the first place. So you claiming that uv or whatever tool is doing too much sounds to me like you're arguing based on "traditionalist" or "conservative" reasons rather than doing any technical thinking here.

Node.js's replacement for virtualenv is literally just a folder named "node_modules". Meanwhile Python has an entire tool with strange idiosyncrasies that you have to pay attention to, otherwise pip does the wrong thing by default.

It is as if python is pretending to be a special snowflake where installing libraries into a folder is this super hyper mega overcomplicated thing that necessitates a whole dedicated tool just to manage, when in reality in other programming languages nobody is really thinking about the fact that the libraries end up in their build folders. It just works.

So again you're pretending that this is such a big deal that it needs a whole other tool, when the problem in question is so trivial that another tool is adding mental overhead with regard to the microscopic problem at hand.

dragonwriter

> Node.js's replacement for virtualenv is literally just a folder named "node_modules".

Node_modules doesn't support an isolated node interpreter distinct from what may be installed elsewhere on the machine. Third party tools are available that do that for node, but node_modules alone addresses a subset of the issues that venvs solve.

OTOH, it's arguably undesirable that there isn't a convenient way in Python to do just what node_modules does without the extra stuff that venvs do, because there are a lot of use cases where that kind of solution would be sufficient and lower overhead.

zahlman

Uv doing things I'm not interested in has absolutely nothing to do with the design of virtualenvs. But virtualenvs are easy enough to work with; they absolutely don't "necessitate a whole dedicated tool just to manage" unless you count the `activate` script that they come with.

But also, Uv uses them anyway. Because that's the standard. And if the Python standard were to use a folder like node_modules, Uv would follow suit, and so would I. And Uv would still be doing package management and workflow tasks that I'm completely uninterested in, and presenting an "every command is prefixed with the tool suite name" UI that I don't like.

There was a proposal for Python to use a folder much like node_modules: see https://discuss.python.org/t/pep-582-python-local-packages-d... . It went through years of discussion and many problems with the idea were uncovered. One big issue is that installing a Python package can install more than just the importable code; the other stuff has to be put somewhere sensible.

>So again you're pretending that this is such a big deal that it needs a whole other tool

I make no such claim. The tool I want is not for managing virtual environments. It's for replacing Pip, and offering a modicum of convenience on top of that. Any time you install a package, no matter whether you use Pip or uv or anything else, you have to choose (even if implicitly) where it will be installed. Might as well offer the option of setting up a new location, as long as we have a system where setup is necessary.

botanical76

Do you feel that Npm, mix, cargo went the wrong way, doing too much? It seems like their respective communities _love_ the standard tooling and all that it does. Or is Python fundamentally different?

zahlman

Python is fundamentally used in different ways, and in particular is much more often used in conjunction with code in other languages that has to be compiled separately and specially interfaced with. It also has a longer history, and a userbase that's interested in very different ways of using Python than the "development ecosystem" model where you explicitly create a project with the aim of adding it to the same package store where you got your dependencies from. Different users have wildly different needs, and tons of the things workflow tools like uv/Poetry/PDM/Hatch/Flit do are completely irrelevant to lots of them. Tons of users don't want to "manage dependencies"; they want to have a single environment containing every package that any of their projects ever uses (and a shoulder to cry on if a version conflict ever arises from that). Tons of users don't want to make a "project" and they especially don't want to set up their Python code as a separate, reusable thing, isolated from the data they're working on with it right now. Tons of users think they know better than some silly "Pip" tool about exactly where each of their precious .py files ought to go on your hard drive. Tons of developers want their program to look and feel like a stand-alone, independent application, that puts itself in C:\Program Files and doesn't expect users to know what Python is. People more imaginative than I could probably continue this train of thought for quite a bit.

For many of the individual tasks, there are small Unix-philosophy tools that work great. Why is typing `poetry publish` better than typing `twine upload`? (And it would be just `twine`, except that there's a `register` command that PyPI doesn't need in the first place, and a `check` command that's only to make sure that other tools did their job properly - and PyPI will do server-side checks anyway.) Why is typing `poetry run ...` better than activating the venv with the script it provided itself, and then doing the `...` part normally?

An all-in-one tool works when people agree on the scope of "all", and you can therefore offer just one of them and not get complaints.

riwsky

Heck, you can get even cleaner than that by using uv’s support for PEP 723’s inline script dependencies:

  # /// script
  # requires-python = ">=3.12"
  # dependencies = [
  #     "pandas",
  # ]
  # ///
h/t https://simonwillison.net/2024/Dec/19/one-shot-python-tools/

sbene970

I've started a repo with some of these scripts, the most recent one being my favorite: a wrapper for Microsoft AutoGen's very recent Magentic-1, a generalist LLM multi-agent system. It can use Python code, the CLI, a browser (Playwright) and the file system to complete tasks.

A simple example I came across is having to rename some files:

1. you just open the shell in the location you want

2. and run this command:

uv run https://raw.githubusercontent.com/SimonB97/MOS/main/AITaskRu... "check the contents of the .md files in the working dir and structure them in folders"

There's a link to Magentic-1 docs and further info in the repo: https://github.com/SimonB97/MOS/tree/main/AITaskRunner (plus two other simple scripts).

aeurielesn

I don't understand how things like this get approved into PEPs.

Karupan

Seems like a great way to write self documenting code which can be optionally used by your python runtime.

zanie

As in, you think this shouldn't be possible or you think it should be written differently?

epistasis

The PEP page is really good at explaining the status of the proposal, a summary of the discussion to date, and then links to the actual detailed discussion (in Discourse) about it:

https://peps.python.org/pep-0723/

noitpmeder

I see this was accepted (I think?); is the implementation available in a released Python version? I don't see an "as of" version on the PEP page, nor do light Google searches reveal any official Python docs of the feature.

rtpg

It's helpful as a way to publish minimal reproductions of bugs and issues in bug reports (compared to "please clone my repo" which has so many layers of friction involved).

I would want distributed projects to do things properly, but as a way to shorthand a lot of futzing about? It's excellent

misiek08

And people were laughing at PHP comments configuring framework, right?

throwup238

Python was always the late born twin brother of PHP with better hair and teeth, but the same eyes that peered straight into the depths of the abyss.

blibble

With that expected use case of the uv script run command, it effectively makes those comments executable

python's wheels are falling off at an ever faster and faster rate

Spivak

Because of a feature that solves the problem of one off scripts being difficult the moment you need a 3rd party library?

A more Pythonic way of doing this might be __pyproject__, but that has the tiiiiny snag of needing to execute the file to figure out its deps. I would have loved if __name__ == "pyproject", but while neat and tidy it is admittedly super confusing for beginners, has a "react hooks" style gotcha where you can't use any deps in that block, and you can't use top level imports. The comment was really the easiest way.

linsomniac

I don't think this IS a PEP, I believe it is simply something the uv tool supports and as far as Python is concerned it is just a comment.

zanie

No, this is a language standard now (see PEP 723)

shlomo_z

Is it possible for my IDE (vscode) to support this? Currently my IDE screams at me for using unknown packages and I have no type hinting, intellisense, etc.

Zizizizz

With your python plugin you should be able choose .venv/bin/python as your interpreter after you've run `uv sync` and everything should resolve

randomlurking

So it doesn't work with ad-hoc venvs, does it? Might still be valuable for simply running scripts, but I'm not sure venv-less works for dev

shlomo_z

But if I am using inline metadata to declare the dependencies, uv doesn't tell me where the venv is. And there is no `uv sync` command for single scripts as far as I can tell.

epistasis

And for the Jupyter setting, check out Trevor Manz's juv:

https://github.com/manzt/juv

__MatrixMan__

So it's like a shebang for dependencies. Cool.

stevage

As a NodeJS developer it's still kind of shocking to me that Python still hasn't resolved this mess. Node isn't perfect, and dealing with different versions of Node is annoying, but at least there's none of this "worry about modifying global environment" stuff.

mdaniel

Caveat: I'm a node outsider, only forced to interact with it

But there are a shocking number of install instructions that offer $(npm i -g) and if one is using Homebrew or nvm or a similar "user writable" node distribution, it won't prompt for sudo password and will cheerfully mangle the "origin" node_modules

So, it's the same story as with python: yes, but only if the user is disciplined

Now ruby drives me fucking bananas because it doesn't seem to have either concept: virtualenvs nor ./ruby_modules

MrJohz

It's worth noting that Node allows two packages to have the same dependency at different versions, which means that `npm i -g` is typically a lot safer than a global `pip install`, because each package will essentially create its own dependency tree, isolated from other packages. In practice, NPM has a deduplication process that makes this more complicated, and so you can run into issues (although I believe other package managers can handle this better), but I rarely run into issues with this.

That said, I agree that `npm i -g` is a poor system package manager, and you should typically be using Homebrew or whatever package manager makes the most sense on your system. That said, `npx` is a good alternative if you just want to run a command quickly to try it out or something like that.

zahlman

>It's worth noting that Node allows two packages to have the same dependency at different versions

Yes. It does this because JavaScript enables it - the default import syntax uses a file path.

Python's default import syntax uses symbolic names. That allows you to do fun things like split a package across the filesystem, import from a zip file, and write custom importers, but doesn't offer a clean way to specify the version you want to import. So Pip doesn't try to install multiple versions either (which saves it the hassle of trying to namespace them). You could set up a system to make it work, but it'd be difficult and incredibly ugly.

Some other language ecosystems don't have this problem because the import is resolved at compile time instead.

fny

Because you don't need virtualenvs or ruby_modules. You can have however many versions of the same gem installed; the one to use is simply referenced by a Gemfile, so for Ruby version X you are guaranteed one copy of gem version Y and no duplicates.

This whole installing the same dependencies a million times across different projects in Python and Node land is completely insane to me. Ruby has had the only sane package manager for years. Cargo too, but only because they copied Ruby.

Node has littered my computer with useless files. Python’s venv eat up a lot of space unnecessarily too.

zahlman

In principle, venvs could hard-link the files from a common source, as long as the filesystem supports that. I'm planning to experiment with this for Paper. It's also possible to use .pth files (https://docs.python.org/3/library/site.html) to add additional folders to the current environment at startup. (I've heard some whispers that this causes a performance hit, but I haven't noticed. Python module imports are cached anyway.) Symlinks should work, too. (But I'm pretty sure Windows shortcuts would not. No idea about junctions.)
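
For the curious, the .pth mechanism itself is trivial to try; a sketch (the paths and the python3.12 directory name are made up, and the target directory must already exist for site.py to add it):

  $ mkdir -p /tmp/shared-site-packages
  $ echo /tmp/shared-site-packages > .venv/lib/python3.12/site-packages/shared.pth
  $ .venv/bin/python -c "import sys; print('/tmp/shared-site-packages' in sys.path)"
  True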

regularfry

"They copied ruby" is a little unfair. From memory it was some of the same individuals.

andrewmcdonough

Ruby has a number of solutions for this - rvm (the oldest, but less popular these days), rbenv (probably the most popular), chruby/gem_home (lightweight) or asdf (my personal choice as I can use the same tool for lots of languages). All of those tools install to locations that shouldn't need root.

mdaniel

Yes, I am aware of all of those, although I couldn't offhand tell anyone the difference in tradeoffs between them. But I consider having to install a fresh copy of the whole distribution a grave antipattern. I'm aware that nvm and pyenv default to it and I don't like that

I did notice how Homebrew sets env GEM_HOME=<Cellar>/libexec GEM_PATH=<Cellar>/libexec (e.g. <https://github.com/Homebrew/homebrew-core/blob/9f056db169d5f...>) but, similar to my node experience, since I am a ruby outsider I don't totally grok what isolation that provides

regularfry

The mainline ruby doesn't but tools to support virtualenvs are around. They're pretty trivial to write: https://github.com/regularfry/gemsh/blob/master/bin/gemsh

As long as you're in the ruby-install/chruby ecosystem and managed to avoid the RVM mess then the tooling is so simple that it doesn't really get any attention. I've worked exclusively with virtualenvs in ruby for years.

8n4vidtmkvmk

FWIW, you can usually just drop the `-g` and it'll install into `node_modules/.bin` instead, so it stays local to your project. You can run it straight out of there (by typing the path) or do `npm run <pkg>` which I think temporarily modifies $PATH to make it work.
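
Concretely, with eslint as a stand-in example package:

  $ npm i eslint                  # local install; binary lands in node_modules/.bin
  $ ./node_modules/.bin/eslint .  # run it by path
  $ npx eslint .                  # or let npx resolve the local binary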

nicoburns

The `npx` command (which comes bundled with any nodejs install) is the way to do that these days.

PaulHoule

Python has been cleaning up a number of really lethal problems like:

(i) wrongly configured character encodings (suppose you incorporated somebody else's library that does a "print" and the input data contains some invalid characters that wind up getting printed; that "print" could crash a model trainer script that runs for three days if error handling is set wrong and you couldn't change it when the script was running, at most you could make the script start another python with different command line arguments)

(ii) site-packages; all your data scientist has to do is `pip install --user` the wrong package and they've trashed all of their virtualenvs, all of their condas, etc. Over time the defaults have changed so pythons aren't looking into the site-packages directories, but I wasted a long time figuring out why a team of data scientists couldn't get anything to work reliably

(iii) "python" built into Linux by default. People expected Python to "just work" but it doesn't "just work" when people start installing stuff with pip because you might be working on one thing that needs one package and another thing that needs another package and you could trash everything you're doing with python in the process of trying to fix it.

Unfortunately python has attracted a lot of sloppy programmers who think virtualenv is too much work and that it's totally normal for everything to be broken all the time. The average data scientist doesn't get excited when it crumbles and breaks, but you can't just call up some flakes to fix it. [1]

[1] https://www.youtube.com/watch?v=tiQPkfS2Q84

worik

> Python has been cleaning up a number of really lethal problems like

I wish they would stick to semantic versioning tho.

I have used two projects that got stuck on incompatible changes within the 3.x Python series.

That is a fatal problem for Python. If a change in a minor version makes things stop working, it is very hard to recommend the system. A lot of work has gone down the drain, by this Python user, trying to work around that.

meitham

I assume these breaking changes are in the stdlib and not in the python interpreter (the language), right?

There were previous discussions about uncoupling the stdlib (Python libraries) from the release and having them released independently, but I can't remember why that died off.

whatever1

I don’t exactly remember the situation but a user created a python module named error.py.

Then in their main code they imported said error.py, but unfortunately the numpy library also has an error.py. So the user was getting very funky behavior.

zahlman

Yep, happens all the time with the standard library. Nowadays, third-party libraries have this issue much less because they can use relative imports except for their dependencies. But the Python standard library, for historical reasons, isn't organized into a package, so it can't do that.

Here's a fun one (this was improved in 3.11, but some other names like `traceback.py` can still reproduce a similar problem):

  /tmp$ touch token.py
  /tmp$ py3.10
  Python 3.10.14 (main, Jun 24 2024, 03:37:47) [GCC 11.4.0] on linux
  Type "help", "copyright", "credits" or "license" for more information.
  >>> help()
  Traceback (most recent call last):
    File "<stdin>", line 1, in <module>
    File "/opt/python/standard/lib/python3.10/_sitebuiltins.py", line 102, in __call__
      import pydoc
    File "/opt/python/standard/lib/python3.10/pydoc.py", line 62, in <module>
      import inspect
    File "/opt/python/standard/lib/python3.10/inspect.py", line 43, in <module>
      import linecache
    File "/opt/python/standard/lib/python3.10/linecache.py", line 11, in <module>
      import tokenize
    File "/opt/python/standard/lib/python3.10/tokenize.py", line 36, in <module>
      from token import EXACT_TOKEN_TYPES
  ImportError: cannot import name 'EXACT_TOKEN_TYPES' from 'token' (/tmp/token.py)
Related Stack Overflow Q&A (featuring an answer from me): https://stackoverflow.com/questions/36250353

PaulHoule

... it's tricky. In Java there's a cultural expectation that you name a package like

  package organization.dns.name.this.and.that;
but real scalability in a module system requires that somebody else packages things up as

  package this.and.that;
and you can make the system look at a particular wheel/jar/whatever and make it visible with a prefix you specify like

  package their.this.and.that;
Programmers seem to hate rigorous namespace systems though. My first year programming Java (before JDK 1.0) the web site that properly documented how to use Java packages was at NASA and you still had people writing Java classes that were in the default package.

hauntsaninja

Yeah, that's not exactly the problematic situation... but the good news is I improved Python's error message for this in 3.13. See https://docs.python.org/3/whatsnew/3.13.html#improved-error-...

zo1

Half the time something breaks in a javascript repo or project, every single damn javascript expert in the team/company tells me to troubleshoot using the below sequence, as if throwing spaghetti on a wall with no idea what's wrong.

Run npm install

Delete node_modules and wait 30 minutes because it takes forever to delete 500MB worth of 2 million files.

Do an npm install again (or yarn install or that third one that popped up recently?)

Uninstall/Upgrade npm (or is it Node? No wait, npx I think. Oh well, used to be node + npm, now it's something different.)

Then do steps 1 to 3 again, just in case.

Hmm, maybe it's the lockfile? Delete it, one of the juniors pushed their version to the repo without compiling maybe. (Someone forgot to add it to the gitignore file?)

Okay, do steps 1 to 3 again, that might have fixed it.

If you've gotten here, you are royally screwed and should try the next javascript expert, he might have seen your error before.

So no, I'm a bit snarky here, but the JS ecosystem is a clustermess of chaos and should rather fix its own stuff first. I have none of the above issues with Python, a proper IDE and out-of-the-box pip.

wruza

So you’re not experiencing exactly this with pip/etc? I hit this “just rebuild this 10GB venv” scenario like twice a day while learning ML. Maybe it’s just ML, but then regular node projects don’t have complex build-step / version-clash deps either.

zo1

I think it's something unique to Python's ML ecosystem, to be honest. There is a lot that's up in the air about how to handle models, binaries and all of that in a contained package, and that results in quite a few hand-rolled solutions, some of which encroach on the package manager's territory, plus of course drivers and Windows.

I've worked on/with some fairly large/complex python projects, and they almost never have any packaging issues that aren't just obvious errors by users. Yes, every once in a while we have to be explicit about a dependency because some dependent project isn't very strict with their versioning policy and their API layers.

wink

I've not used python professionally for years - but I have had to do this maybe once in many years of usage. Seen it like once more in my team(s). A rounding error.

I've seen someone having to do this in node like once every month, no matter which year, no matter which project or company.

cdaringe

The pain is real. Most of the issues are navigable, but often take careful thought versus some canned recipe. npm or yarn in large projects can be a nightmare. Starting with pnpm makes it a dream. Sometimes migrating to pnpm can be rough, because projects that work may rely on incorrect, transitive, undeclared deps actually resolving. Anyway, starting from pnpm generally resolves this sort of chaos.

Most package managers are developed.

Pnpm is engineered.

It’s one of the few projects I donate to on GitHub

wiseowise

What kind of amateurs are you working with? I'm not a Node.js dev and even I know about the npm ci command.

mirekrusin

Sounds like a tale from a decade ago, people now use things like pnpm and tsx.

fastball

Environment and dependency management in JS-land is even worse.

Similar problems with runtime version management (need to use nvm for sanity, using built-in OS package managers seems to consistently result in tears).

More package managers and interactions (corepack, npm, pnpm, yarn, bun).

Bad package interop (ESM vs CJS vs UMD).

More runtimes (Node, Deno, Bun, Edge).

Then compound this all with the fact that JS doesn't have a comprehensive stdlib so your average project has literally 1000s of dependencies.

nosefurhairdo

Valid criticisms, but the "standard" choices all work well. Nvm is the de facto standard for node version management, npm is a totally satisfactory package manager, node is the standard runtime that those other runtimes try to be compatible with, etc.

Will also note that in my years of js experience I've hardly ever run into module incompatibilities. It's definitely gnarly when it happens, but wouldn't consider this to be the same category of problem as the confusion of setting up python.

Hopefully uv can convince me that python's environment/dependency management can be easier than JavaScript's. Currently they both feel bad in their own way, and I likely prefer js out of familiarity.

stevage

> I've hardly ever run into module incompatibilities

I'm not totally sure what you're referring to, but I've definitely had a number of issues along the lines of:

- I have to use import, not require, because of some constraint of the project I'm working in

- the module I'm importing absolutely needs to be required, not imported

I really don't have any kind of understanding of what the fundamental issues are, just a very painful transition point from the pre-ESM world to post.

stevage

>Similar problems with runtime version management (need to use nvm for sanity, using built-in OS package managers seems to consistently result in tears).

In practice I find this a nuisance but a small one. I wish there had been a convention that lets the correct version of Node run without me manually having to switch between them.

> More package managers and interactions (corepack, npm, pnpm, yarn, bun).

But they all work on the same package.json and node_modules/ principle, afaik. In funky situations, incompatibilities might emerge, but they are interchangeable for the average user. (Well, I don't know about corepack.)

> Bad package interop (ESM vs CJS vs UMD).

That is a whole separate disaster, which doesn't really impact consuming packages. But it does make packaging them pretty nasty.

> More runtimes (Node, Deno, Bun, Edge).

I don't know what Edge is. Deno is different enough to not really be in the same game. I find it hard to see the existence of Bun as problematic: it has been a bit of a godsend for me, it has an amazing ability to "just work" and punch through Typescript configuration issues that choke TypeScript. And it's fast.

> Then compound this all with the fact that JS doesn't have a comprehensive stdlib so your average project has literally 1000s of dependencies.

I guess I don't have a lot of reference points for this one. The 1000s of dependencies is certainly true though.

horsawlarway

> I wish there had been a convention that lets the correct version of Node run without me manually having to switch between them.

For what it's worth, I think .tool-versions is slowly starting to creep into this space.

Mise (https://mise.jdx.dev/dev-tools/) and ASDF (https://asdf-vm.com/) both support it.

Big reason I prefer ASDF to nvm/rvm/etc right now is that it just automatically adjusts my versions when I cd into a project directory.

RobinL

I've only recently started with uv, but this is one thing it seems to solve nicely. I've tried to get into the mindset of only using uv for python stuff - and hence I haven't installed python using homebrew, only uv.

You basically need to just remember to never call python directly. Instead use uv run and uv pip install. That ensures you're always using the uv installed python and/or a venv.

Python based tools where you may want a global install (say ruff) can be installed using uv tool

rented_mule

> Python based tools where you may want a global install (say ruff) can be installed using uv tool

uv itself is the only Python tool I install globally now, and it's a self-contained binary that doesn't rely on Python. ruff is also self-contained, but I install tools like ruff (and Python itself) into each project's virtual environment using uv. This has nice benefits. For example, automated tests that include linting with ruff do not suddenly fail because the system-wide ruff was updated to a version that changes rules (or different versions are on different machines). Version pinning gets applied to tooling just as it does to packages. I can then upgrade tools when I know it's a good time to deal with potentially breaking changes. And one project doesn't hold back the rest. Once things are working, they work on all machines that use the same project repo.

If I want to use Python based tools outside of projects, I now do little shell scripts. For example, my /usr/local/bin/wormhole looks like this:

  #!/bin/sh
  uvx \
      --quiet \
      --prerelease disallow \
      --python-preference only-managed \
      --from magic-wormhole \
      wormhole "$@"

zahlman

>You basically need to just remember to never call python directly. Instead use uv run and uv pip install.

I don't understand why people would rather do this part specifically, rather than activate a venv.

TZubiri

Because node.js isn't a dependency of the Operating system.

Also we don't have a left pad scale dependency ecosystem that makes version conflicts such a pressing issue.

wruza

Oh, tell us OS can’t venv itself a separate python root and keep itself away from what user invents to manage deps. This is a non-explanation appealing to authority while it's clearly just a mess lacking any thought. It just works like this.

> we don't have a left pad scale dependency ecosystem that makes version conflicts such a pressing issue

TensorFlow.

TZubiri

We have virtual envs and package isolation; it's usually bloated and third party, and doesn't make for a good, robust OS base. It's more of an app layer. See flatpak, snapcraft.

"Compares left pad with ml library backing the hottest AI companies of the cycle"

zahlman

>Oh, tell us OS can’t venv itself a separate python root and keep itself away from what user invents to manage deps.

I've had this thought too. But it's not without its downsides. Users would have to know about and activate that venv in order to, say, play with system-provided GTK bindings. And it doesn't solve the problem that the user may manage dependencies for more than one project, that don't play nice with each other. If everything has its own venv, then what is the "real" environment even for?

sgarland

This. IME, JS devs rarely have much experience with an OS, let alone Linux, and forget that Python literally runs parts of the OS. You can’t just break it, because people might have critical scripts that depend on the current behavior.

__MatrixMan__

I think it makes sense given that people using python to write applications are a minority of python users. It's mostly students, scientists, people with the word "analyst" in their title, etc. Perhaps this goes poorly in practice, but these users ostensibly have somebody else to lean on re: setting up their environments, and those people aren't developers either.

I have to imagine that the python maintainers listen for what the community needs and hear a thousand voices asking for a hundred different packaging strategies, and a million voices asking for the same language features. I can forgive them for prioritizing things the way they have.

stevage

I'm not sure I understand your point. Managing dependencies is easy in node. It seems to be harder in Python. What priority is being supported here?

cdaringe

Hot take: pnpm is the best dx, of all p/l dep toolchains, for devs who are operating regularly in many projects.

Get me the deps this project needs, get them fast, get them correctly, all with minimum hoops.

Cargo and deno toolchains are pretty good too.

Opam, gleam, mvn/gradle, stack, npm/yarn, nix even, pip/poetry/whatever-python-malarkey, go, composer, …what other stuff have I used in the past 12 months… C/C++ doesn't really have a first-class std other than global sys deps (so I'll refer back to nix or OS package managers).

Getting the stuff you need where you need it is always doable. Some toolchains are just above and beyond, batteries included, ready for productivity.

krashidov

Have you used bun? It's also great. Super fast

theogravity

pnpm is the best for monorepos. I've tried yarn workspaces and npm's idea of it and nothing comes close to the DX of pnpm

paularmstrong

What, as an end user, is actually better about pnpm than Yarn? I've never found an advantage with pnpm in all the times I've tried it. They seem very 1:1 to me, but Yarn edges it out thanks to it having a plugin system and its ability to automatically pull `@types/` packages when needed.

mdaniel

I swear I'm not trolling: what do you not like about modern golang's dep management (e.g. go.mod and go.sum)?

I agree that the old days of "there are 15 dep managers, good luck" were high chaos. And those who do cutesy shit like using "replace" in their go.mod[1] are sus, but as far as DX goes, I think $(go get), which caches by default in $XDG_CACHE_DIR and uses $GOPROXY, is great

1: https://github.com/opentofu/opentofu/blob/v1.9.0/go.mod#L271

geethree

To be fair, your specific example is due to… well, forking Terraform because of HashiCorp's licensing changes.

bityard

I usually stay away far FAR from shiny new tools but I've been experimenting with uv and I really like it. I'm a bit bummed that it's not written in Python but other than that, it does what it says on the tin.

I never liked pyenv because I really don't see the point/benefit of building every new version of Python you want to use. There's a reason I don't run Gentoo or Arch anymore. I'm very happy that uv grabs pre-compiled binaries and just uses those.

So far I have used it to replace poetry (which is great btw) in one of my projects. It was pretty straightforward, but the project was also fairly trivial/typical.

I can't fully replace pipx with it because 'uv tool' currently assumes every Python package only has one executable. Lots of things I work with have multiple, such as Ansible and Jupyterlab. There's a bug open about it and the workarounds are not terrible, but it'd be nice if they're able to fix that soon.

meitham

uv is great, but downloading and installing the base Python interpreter is not a good feature, as it doesn't fetch it from the PSF but from a project on GitHub. That very same project says it is compiled for portability over performance, see https://gregoryszorc.com/docs/python-build-standalone/main/

zahlman

>that very same project says this is compiled for portability over performance

Realistically, the options on Linux are the uv way, the pyenv way (download and compile on demand, making sure users have the compile-time dependencies installed as part of installing your tool), and letting users download and compile it themselves (which is actually very easy for Python, at least on my distro). Compiling Python is not especially fast (around a full minute on my 4-core, 10-year-old machine), although I've experienced much worse in my lifetime. Maybe you can get alternate Python versions directly from your distro or a PPA, but not in a way that a cross-distro tool can feasibly automate.
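
For reference, compiling it yourself is roughly the usual configure/make dance; a sketch, where the version and install prefix are just examples:

  curl -O https://www.python.org/ftp/python/3.12.8/Python-3.12.8.tar.xz
  tar xf Python-3.12.8.tar.xz && cd Python-3.12.8
  ./configure --prefix="$HOME/.local/python3.12"
  make -j"$(nproc)"
  make install  # a private prefix like this won't clobber the system python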

On Windows the only realistic option is the official installer.

globular-toast

Yes, which is why it's silly to do it. Developers (not "users"!) need to learn how to install Python on their system. I honestly don't know how someone can call themselves a Python developer if they can't even install the interpreter!

bityard

But PSF doesn't distribute binary builds, so what's the alternative?

meitham

It does, of course: https://www.python.org/downloads/. Or are you referring to Linux?

brainzap

that it does it automatically is weird

meitham

I only noticed after my corporate firewall stopped me!

emiller88

There are so many more!

1. `uvx --from git+https://github.com/httpie/cli httpie`

2. uv in a shebang: https://simonwillison.net/2024/Aug/21/usrbinenv-uv-run/

FergusArgyll

Yes! Since that Simon Willison article, I've slowly been easing all my scripts into just using a uv shebang, and it rocks! I've deleted all sorts of .venvs and whatnot. Really useful.

throwup238

The uv shebang is definitely the killer feature for me, especially with so much of the AI ecosystem tied up in Python. Before, writing Python scripts was a lot more painful, requiring either a global scripts venv and shell scripts to bootstrap them, or a venv per script.

I’m sure it was already possible with shebangs and venv before, but uv really brings the whole experience together for me so I can write python scripts as freely as bash ones.
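
For reference, a minimal sketch of such a script (the requests dependency and the URL are just placeholder examples; the -S shebang trick needs a reasonably recent GNU env):

  #!/usr/bin/env -S uv run --script
  # /// script
  # requires-python = ">=3.12"
  # dependencies = [
  #     "requests",
  # ]
  # ///
  import requests

  # uv reads the inline metadata above, creates (and caches) a matching
  # environment, and runs the script inside it.
  print(requests.get("https://example.com").status_code)

Mark it executable and it runs like any other script, with uv resolving the environment on first run.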

dingdingdang

Super neat re the Willison article... would something like this work under PowerShell though?!

supakeen

The activation of the virtualenv is unnecessary (one can execute pip/python directly from it), and configuring your local pyenv interpreter is also unnecessary; it can create a virtual environment with one directly:

  pyenv virtualenv python3.12 .venv
  .venv/bin/python -m pip install pandas
  .venv/bin/python
Not quite one command, but a bit more streamlined, I guess.

BeeOnRope

Note that, in general, calling the venv python directly and activating the venv are not equivalent.

E.g., if the thing you run invokes `python` itself, in the first case it will pick up whatever `python` is on PATH (typically the system one), not the venv's.
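
A contrived sketch of what I mean (assuming some `python` exists on PATH):

  import subprocess
  import sys

  print("parent interpreter:", sys.executable)
  # The child resolves "python" via PATH. With the venv activated, PATH puts
  # .venv/bin first, so the child is the venv interpreter; if you only ran
  # .venv/bin/python directly, PATH is unchanged and the child can end up
  # being the system interpreter instead.
  subprocess.run(
      ["python", "-c", "import sys; print('child interpreter:', sys.executable)"],
      check=True,
  )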

meitham

Surely if you want to invoke Python you call sys.executable; otherwise, if your subprocess doesn't inherit PATH, nothing will work, with uv or without uv.

BeeOnRope

I don't think that's "sure" at all. For one thing, only Python code directly calling Python has that option in the first place; often there is another layer of indirection, e.g. Python code that executes a shell script, which itself invokes Python, etc.

IME it is common to see a process tree with multiple invocations of Python in an ancestor relationship, with other processes in between.

zahlman

In rare cases, programs might also care about the VIRTUAL_ENV environment variable set by the activate script, and activation may also temporarily clear out any existing PYTHONHOME (a rarely used override for the location of the standard library). But yes, in general you can just run the executable directly.

astronautas

Indeed, you're right ;).

egeres

This is super cool, personally:

  uv run --python 3.12 --with label-studio label-studio

Made my life so much easier

lukax

uv also bundles the uvx command so you can run Python tools without installing them manually:

  uvx --from 'huggingface_hub[cli]' huggingface-cli

oulipo

And there's also `uv run script.py`, where you can have dependencies declared as comments in the script; see e.g. https://simonwillison.net/2024/Dec/19/one-shot-python-tools/

krick

Ok, this must be a dumb question answered by the manual, but I still haven't got my hands on uv, so: does it solve the opposite? I mean, I pretty much never want any "ad-hoc" environments, but I always end up with my .venv becoming an ad-hoc environment, because I install stuff while experimenting, not bothering to patch requirements.txt, pyproject.toml, or anything of the sort. In fact, now I usually don't even bother typing pip install; PyCharm does it for me.

This is of course bad practice. What I would like instead is what PHP's composer does: installing stuff automatically changes pyproject.toml (or whatever the standard will be with uv), automatically freezes the versions, and then it is on git diff to tell me what I did last night; I'll remove a couple of lines from that file, run composer install, and it will remove packages not explicitly added to my config from the environment. Does this finally get easy to achieve with uv?

mk12345

I think it does! uv add [0] adds a dependency to your pyproject.toml, as well as your environment.

If you change your pyproject.toml file manually, uv sync [1] will update your environment accordingly.

[0]: https://docs.astral.sh/uv/guides/projects/#managing-dependen... [1]: https://docs.astral.sh/uv/reference/cli/#uv-sync
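
Roughly, the flow looks something like this (the project and package names are just examples):

  uv init demo && cd demo   # scaffolds a pyproject.toml
  uv add requests           # records it in pyproject.toml + uv.lock and installs it
  uv remove requests        # drops it from pyproject.toml and the environment
  uv sync                   # make .venv match pyproject.toml/uv.lock exactly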

krick

If I read [1] correctly, it seems it checks against the lockfile, not pyproject.toml. So it seems like it won't help if I change pyproject.toml manually. Which is a big inconvenience, if so.

Whatever, I think I'll try it for myself later today. It's long overdue.

hobofan

Most uv commands will by default (unless otherwise instructed, e.g. with --frozen) update your lockfile to be in sync with your pyproject.toml.

zahlman

>installing stuff automatically changes pyproject.toml (or whatever the standard will be with uv)

pyproject.toml represents an inter-project standard and Charlie Marsh has committed to sticking with it, along with cooperating with future Python packaging PEPs. But while you can list transitive dependencies, specify exact versions etc. in pyproject.toml, it's not specifically designed as a lockfile - i.e., pyproject.toml is meant for abstract dependencies, where an installer figures out transitively what's needed to support them and decides on exact versions to install.

The current work for specifying a lockfile standard is https://peps.python.org/pep-0751/ . As someone else pointed out, uv currently already uses a proprietary lockfile, but there has been community interest in trying to standardize this - it just has been hard to find agreement on exactly what it needs to contain. (In the past there have been proposals to expand the `pyproject.toml` spec to include other information that lockfiles often contain for other languages, such as hashes and supply-chain information. Some people are extremely against this, however.)

As far as I know, uv isn't going to do things like analyzing your codebase to determine that you no longer need a certain dependency that's currently in your environment and remove it (from the environment, lock file or `pyproject.toml`). You'll still be on the hook for figuring out abstractly what your project needs, and this is important if you want to share your code with others.

krick

> uv isn't going to do things like analyzing your codebase

Sure, that's not what I meant (unless we call pyproject.toml a part of your codebase, which it kinda is, but that's probably not what you meant).

In fact, as far as I can tell from your answer, Python does move in the direction I'd like it to move, but it's unclear by how much it will miss and whether the way uv handles it is ergonomic.

As I've said, I think PHP's composer does a very good job here, and to clarify, this is how it works. There are 2 files: composer.json (≈ pyproject.toml) and composer.lock (≈ PEP 751), both JSON. The former is kinda editable by hand; the latter you ideally never really touch. However, for the most part composer is smart enough that it edits both files for you (with some exceptions, of course), so every time I run `composer require your/awesomelib` it:

1) checks the constraints in these files

2) finds latest appropriate version of your/awesomelib (5.0.14) and all its dependencies

3) writes "your/awesomelib": "^5.0"

4) writes "your/awesomelib": "5.0.14" and all its dependencies to composer.lock (with hashsums, commit ids and such)

It is a good practice to keep both inside version control, so when I say "git diff tells me what I did last night" it means that I'll also see what I installed. If (as usual) most of it is some useless trash, I'll manually remove "your/awesomelib" from composer.json, run `composer install`, and it will remove it and all its (now unneeded) dependencies. As a result, I never need to worry about bookkeeping, since composer does it for me; I just run `composer require <stuff>` and it does the rest (except for cases when <stuff> is a proprietary repo on the company's GitLab and such, then I'll need slightly more manual work).

That is, what I hope to see in Python one day (10 years later than every other lang did it) is declarative package management, except I don't want to have to modify pyproject.toml manually; I want my package manager to do it for me, because it saves me 30 seconds of my life every time I install something. Which accumulates to a lot.

rochacon

I believe you're searching for `uv sync`: https://docs.astral.sh/uv/getting-started/features/#projects

With this, you can manage the dependency list via `uv add/remove` (or the `pyproject.toml` directly), and run `uv sync` to add/remove any dependencies in the managed virtual env.

Edit: adding about uv add/uv remove

wisty

I'm not an expert, but as far as I can tell UV allows you to do this without feeling so guilty (it handles multiple versions of Python and libraries AFAIK quite well).

wruza

Why can't Python just adopt something like yarn/pnpm and effing stop patch-copying its binaries into a specific path? And pick up site-packages from where it left it last time? Wtf. How hard is it to just pick a python-modules directory and a python-project.json and sync it into correctness by symlinking/mklink-ing the missing folders from a package cache in a few seconds?

Every time I have to reorganize or upgrade my AI repos, it's yet another 50GB of writes to my poor SSD. Half of it is torch, the other half auto-downloaded models that I cannot stop because they become "downloaded" and you never know how to resume them or even find where they are, because Python logging culture is just barbaric.

tandav

I'm waiting for this issue to be done: Add an option to store virtual environments in a centralized location outside projects https://github.com/astral-sh/uv/issues/1495

I have used virtualenvwrapper before and it was very convenient to have all virtual environments stored in one place, like ~/.cache/virtualenvs.

The .venv in the project directory is annoying because when you copy the folder somewhere you start copying gigabytes of junk. Some tools like rsync can't handle CACHEDIR.TAG (but you can use --exclude .venv).
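
For example (the paths are just placeholders):

  rsync -a --exclude .venv ./myproject/ backup/myproject/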

forgingahead

Python package management has always seemed like crazyland to me. I've settled on Anaconda as I've experimented with all the ML packages over the years, so I'd be interested to learn why uv, and also what/when are good times to use venv/pip/conda/uv/poetry/whatever else has come up.

NeutralCrane has a really helpful comment below[0], would love to have a more thorough post on everything!

[0]https://news.ycombinator.com/item?id=42677048

agoose77

If you use conda, and can use conda for what you need to do, use conda with conda-forge. It has a much better story for libraries with binary dependencies, whereas PyPI (which `uv` uses) is basically full of static libraries that someone else compiled and promises will work.

Note, I use PyPI for most of my day-to-day work, so I say this with love!