
Uv's killer feature is making ad-hoc environments easy


cyrialize

For anyone who used Rye: it's worth noting that the creator of Rye recommends using uv. Also, Rye is going to be continually updated to interface with uv until it can be entirely replaced by uv.

nharada

I really like uv, and it's the first package manager in a while that doesn't feel like a minor improvement on what I'm using, destined to be replaced by something better a year or two later. I'd love it if we standardized on it as a community as the de facto default, especially for new folks coming in. I personally now recommend it to nearly everyone, instead of the "welllll I use poetry but pyenv works or you could use conda too".

poincaredisk

I never used anything other than pip. I never felt the need to use anything other than pip (with virtualenv). Am I missing anything?

NeutralCrane

A couple of things.

- pip doesn't handle your Python executable, just your Python dependencies. So if you want or need to swap between Python versions (3.11 to 3.12, for example), it doesn't help you. Generally people use an additional tool, such as pyenv, to manage this. Tools like uv and Poetry handle this as well as dependencies.

- pip doesn't pin transitive dependencies. pip resolves them at install time, but it only respects version pinning for dependencies you explicitly specify. So for example, say I am using pandas and I pin it to version X. If a dependency of pandas (say, numpy) isn't pinned as well, the underlying version of numpy can still change when I reinstall dependencies. I've had many issues where my environment stopped working despite none of my specified dependencies changing, because underlying dependencies introduced breaking changes. To get around this with pip you need an additional tool like pip-tools, which lets you pin all dependencies, explicit and transitive, to a lock file for true reproducibility. uv and Poetry do this out of the box (see the sketch after this list).

- Tool usage. Say there is a python package you want to use across many environments without installing in the environments themselves (such as a linting tool like ruff). With pip, you need to install another tool like pipx to install something that can be used across environments. uv can do this out of the box.
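
A rough sketch of those last two points (assuming top-level deps are listed in requirements.in for pip-tools and in pyproject.toml for uv; ruff is just an example tool):

  $> pip-compile requirements.in   # pip-tools: writes requirements.txt with every transitive dep pinned
  $> pip-sync requirements.txt     # installs exactly the locked set

  $> uv lock                       # uv: pins the full tree in uv.lock
  $> uv sync                       # installs exactly the locked set

  $> uv tool install ruff          # persistent, isolated tool on your PATH
  $> uvx ruff check .              # or run it once without installing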

Plus there is a whole host of jobs that tools like uv and Poetry aim to assist with that pip doesn't, namely project creation and management. You can use uv to create new Python project scaffolding for applications or modules, in a way that conforms with PEP standards, with a single command. It also supports workspaces of multiple projects that have separate functionality but need their dependencies kept in sync.
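
For example (the project names are made up):

  $> uv init --app myapp   # scaffold an application project
  $> uv init --lib mylib   # scaffold a library with a src/ layout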

You can accomplish a lot (or all) of this using pip with additional tooling, but it's a lot more work. And not every use case will require these.

driggs

Yes, generally people already use an additional tool for managing their Python executables, like their operating system's package manager:

  $> sudo apt-get install python3.10 python3.11 python3.12
And then it's simple to create and use version-specific virtual environments:

  $> python3.11 -m venv .venv3.11
  $> source .venv3.11/bin/activate
  $> pip install -r requirements.txt
You are incorrect about needing an additional tool to install a "global" tool like `ruff`; `pip` does this by default when you're not using a virtual environment. In fact, this behavior is made more difficult by tools like `uv` or `pipx` if they're trying to manage Python executables as well as dependencies.

kiddico

Sometimes I feel like my up vote doesn't adequately express my gratitude.

I appreciate how thorough this was.

stavros

Oh wow, it actually can handle the Python executable? I didn't know that, that's great! Although it's in the article as well, it didn't click until you said it, thanks!

PaulHoule

pip's resolving algorithm is not sound. If your Python projects are really simple it seems to work, but as your projects get more complex the failure rate creeps up over time. You might

   pip install
something and have it fail, then go back to zero, restart, and have it work, but at some point even that will fail. conda has a correct resolving algorithm, but the packages are out of date and add about as many quality problems as they fix.

I worked at a place where the engineering manager was absolutely exasperated with the problems we were having with building and deploying AI/ML software in Python. I had figured out pretty much all the problems after about nine months and had developed a 'wheelhouse' procedure for building our system reliably, but it was too late.

Not long after, I sketched out a system that was a lot like uv, but it was written in Python and thus had problems maintaining its own stable Python environment (e.g. poetry seems to trash itself every six months or so.)

Writing uv in Rust was genius because it gives the system a stable surface to stand on instead of pipping itself into oblivion, never mind that it is much faster than my system would have been. (My system had the extra feature that it used HTTP range requests to extract the metadata from wheel files before PyPI started letting you download the metadata directly.)

I didn't go forward with developing it because I argued with a lot of people who, like you, thought it was "the perfect being the enemy of the good" when it was really "the incorrect being the enemy of the correct." I'd worked on plenty of projects where I was right about the technology and wrong about the politics and I am so happy that uv has saved the Python community from itself.

MadnessASAP

May I introduce you to our lord and saviour, Nix, and its most holy child, nixpkgs! With only a small tithing of your sanity and of your ability to interop with any other dependency management, you can free yourself of all dependency woes forever!*

[*] For various broad** definitions of forever.

[**] Like, really, really broad***

[***] Maybe a week if you're lucky

morkalork

Ugh, I hate writing this, but that's where Docker and microservices come to the rescue. It's a pain in the butt and inefficient to run, but if you don't care about the overhead (and if you do care, why are you still using Python?), it works.

ppierald

Respectfully, yes. The ability to create venvs so fast that it becomes a silent operation the end user never thinks about anymore. Dependency management and installation are lightning quick. It deals with all of the Python versioning.

And I think a killer feature is the ability to inline dependencies in your Python source code, then use: uv run <scriptname>

Your script would look like:

  #!/usr/bin/env -S uv run --script
  # /// script
  # requires-python = ">=3.12"
  # dependencies = [
  #     "...",
  #     "..."
  # ]
  # ///

Then uv will make a new venv, install the dependencies, and execute the script faster than you'd think. The first run is a bit slower due to downloads and such, but the second and subsequent runs are just a bunch of internal symlink shuffling.
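
So, assuming the block above is saved at the top of a file called example.py:

  $> chmod +x example.py
  $> ./example.py   # first run: resolves deps and builds the venv
  $> ./example.py   # later runs: reuses the cached environment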

It is really interesting. You should at least take a look at a YouTube video or something. I think you will be impressed.

Good luck!

amluto

If you switch to uv, you’ll have fewer excuses to take coffee breaks while waiting for pip to do its thing. :)

markerz

Yeah, it unifies the whole env experience with the package installation experience. No more forgetting to activate the virtualenv first. No more pip installing into the wrong virtual env or accidentally borrowing from the system packages. It's way easier to specify which version of Python to use. Everything is version controlled, including the Python version and variant (CPython, PyPy, etc.). It's also REALLY REALLY fast.

mosselman

I am not a python developer, but sometimes I use python projects. This puts me in a position where I need to get stuff working while knowing almost nothing about how python package management works.

Also I don’t recognise errors and I don’t know which python versions generally work well with what.

I’ve had it happen so often with pip that I’d have something setup just fine. Let’s say some stable diffusion ui. Then some other month I want to experiment with something like airbyte. Can’t get it working at all. Then some days later I think, let’s generate an image. Only to find out that with pip installing all sorts of stuff for airbyte, I’ve messed up my stable diffusion install somehow.

Uv clicked right away for me and I don’t have any of these issues.

Was I using pip and asdf incorrectly before? Probably. Was it worth learning how to do it properly in the previous way? Nope. So uv is really great for me.

hyeonwho4

This is not just a pip problem. I had the same problem with Anaconda a few years ago, where upgrading the built-in editor (Spyder?) pulled versions of packages which broke my ML code, or made dependencies impossible to reconcile. It was a mess, wasting hours of time. Since then I use one pip venv for each project and just never update dependencies.

loxias

My life got a lot easier once I adopted the habit of making a shell script, using buildah and podman, that wraps every Python, Rust, or Go project I want to dabble with.

It's so simple!

Create an image with the dependencies, then `podman run` it.
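
A minimal sketch of such a wrapper (image name and paths are made up):

  #!/bin/sh
  # build once:  podman build -t myproj-env .
  # then run the project inside the container, mounting the working tree:
  podman run --rm -it -v "$PWD:/work" -w /work myproj-env python main.py "$@"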

benreesman

Performance and correctness mostly.

2wrist

Also, you can set the Python version for that project. It will download whatever version you need and just use it.
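
For example (the version number is just an illustration):

  $> uv python pin 3.12        # writes .python-version for this project
  $> uv run python --version   # uv fetches and uses that interpreter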

benatkin

This:

> I haven't felt like it's a minor improvement on what I'm using

means that this:

> I'd love if we standardized on it as a community as the de facto default

…probably shouldn’t happen. The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.

It would be like replacing the Python REPL with the current version of IPython. I'd say the same thing: that it isn't a minor improvement. While I almost always use IPython now, I'm glad it's a separate thing.

lolinder

> The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.

The problem is that in the python ecosystem there really isn't a default de facto standard yet at all. It's supposed to be pip, but enough people dislike pip that it's hard as a newcomer to know if it's actually the standard or not.

The nice thing about putting something like this on a pedestal is that maybe it could actually become a standard, even if the standard should be simple and get out of the way. Better to have a standard that's a bit over the top than no standard at all.

benatkin

It feels even more standard than it used to, with python -m pip and python -m venv making it so it can be used with a virtualenv even if only python or python3 is on your path.

greazy

> The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.

This to me is unachievable. Perfection is impossible. But if, on the way there, the community and developers coalesced around a single tool, then maybe we could at least start heading down that road.

benatkin

I mean, stays out of the way for simple uses.

When I first learned Python, typing python and seeing >>> and having it evaluate what I typed as if it appeared in a file was a good experience.

Now that I use Python a lot, IPython is more out of the way for me than the built-in REPL is, because it lets me focus on what I'm working on rather than on the limitations of a tool.

mrbonner

+1. uv now also supports system installation of Python with the --default --preview flags. This probably allows me to replace mise (rtx) and go uv full-time for Python development. For other languages, I go back to mise.
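
That is, something like this (a sketch; the version is just an example):

  $> uv python install 3.13 --default --preview   # puts python/python3 shims on your PATH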

tehjoker

What is the deal with uv's ownership policy? I heard it might be VC backed. To my mind, that means killing pip and finding some kind of subscription revenue source which makes me uneasy.

The only way to justify VC money is a plot to take over the ecosystem and then profit off of a dominant position. (e.g. the Uber model)

I've heard a little bit about uv's technical achievements, which are impressive, but technical progress isn't the only metric.

feznyng

It’s dual MIT and Apache licensed. Worst case, if there’s a rug pull, fork it.

riwsky

Heck, you can get even cleaner than that by using uv’s support for PEP 723’s inline script dependencies:

  # /// script
  # requires-python = ">=3.12"
  # dependencies = [
  #     "pandas",
  # ]
  # ///
h/t https://simonwillison.net/2024/Dec/19/one-shot-python-tools/
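
Assuming that block sits at the top of a file called script.py, uv resolves and runs it in one step:

  $> uv run script.py   # builds a cached env with pandas, then runs the script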

shlomo_z

Is it possible for my IDE (vscode) to support this? Currently my IDE screams at me for using unknown packages and I have no type hinting, intellisense, etc.

aeurielesn

I don't understand how things like this get approved into PEPs.

Karupan

Seems like a great way to write self documenting code which can be optionally used by your python runtime.

zanie

As in, you think this shouldn't be possible or you think it should be written differently?

epistasis

The PEP page is really good at explaining the status of the proposal, summarizing the discussion to date, and linking to the actual detailed discussion (on Discourse) about it:

https://peps.python.org/pep-0723/

noitpmeder

I see this was accepted (I think?); is the implementation available in a released Python version? I don't see an "as of" version on the PEP page, nor do light Google searches reveal any official Python docs for the feature.


misiek08

And people were laughing at PHP comments configuring framework, right?

throwup238

Python was always the late born twin brother of PHP with better hair and teeth, but the same eyes that peered straight into the depths of the abyss.

linsomniac

I don't think this IS a PEP, I believe it is simply something the uv tool supports and as far as Python is concerned it is just a comment.

zanie

No, this is a language standard now (see PEP 723)

epistasis

And for the Jupyter setting, check out Trevor Manz's juv:

https://github.com/manzt/juv

stevage

As a NodeJS developer it's still kind of shocking to me that Python hasn't resolved this mess. Node isn't perfect, and dealing with different versions of Node is annoying, but at least there's none of this "worry about modifying the global environment" stuff.

mdaniel

Caveat: I'm a node outsider, only forced to interact with it

But there are a shocking number of install instructions that offer $(npm i -g), and if one is using Homebrew or nvm or a similar "user-writable" Node distribution, it won't prompt for a sudo password and will cheerfully mangle the "origin" node_modules.

So, it's the same story as with python: yes, but only if the user is disciplined

Now Ruby drives me fucking bananas because it doesn't seem to have either concept: virtualenvs or ./ruby_modules.

MrJohz

It's worth noting that Node allows two packages to have the same dependency at different versions, which means that `npm i -g` is typically a lot safer than a global `pip install`, because each package essentially gets its own dependency tree, isolated from other packages. In practice, npm has a deduplication process that makes this more complicated, so you can run into issues (although I believe other package managers handle this better), but I rarely do.

That said, I agree that `npm i -g` is a poor system package manager, and you should typically be using Homebrew or whatever package manager makes the most sense on your system. If you just want to run a command quickly to try something out, `npx` is a good alternative.

fny

Because you don't need virtualenvs or ruby_modules. You can have however many versions of the same gem installed; each project simply references what it needs in a Gemfile, so for Ruby version X you are guaranteed one copy of gem version Y and no duplicates.

This whole installing the same dependencies a million times across different projects in Python and Node land is completely insane to me. Ruby has had the only sane package manager for years. Cargo too, but only because they copied Ruby.

Node has littered my computer with useless files. Python's venvs eat up a lot of space unnecessarily too.

andrewmcdonough

Ruby has a number of solutions for this - rvm (the oldest, but less popular these days), rbenv (probably the most popular), chruby/gem_home (lightweight) or asdf (my personal choice as I can use the same tool for lots of languages). All of those tools install to locations that shouldn't need root.

mdaniel

Yes, I am aware of all of those, although I couldn't offhand tell anyone the difference in tradeoffs between them. But I consider having to install a fresh copy of the whole distribution a grave antipattern. I'm aware that nvm and pyenv default to it, and I don't like that.

I did notice how Homebrew sets env GEM_HOME=<Cellar>/libexec GEM_PATH=<Cellar>/libexec (e.g. <https://github.com/Homebrew/homebrew-core/blob/9f056db169d5f...>) but, similar to my node experience, since I am a ruby outsider I don't totally grok what isolation that provides

PaulHoule

Python has been cleaning up a number of really lethal problems like:

(i) wrongly configured character encodings (suppose you incorporated somebody else's library that does a "print" and the input data contains some invalid characters that wind up getting printed; that "print" could crash a model-trainer script that runs for three days if error handling is set wrong, and you couldn't change it while the script was running; at most you could make the script start another Python with different command-line arguments)

(ii) site-packages; all your data scientist has to do is

   pip install --user
the wrong package and they've trashed all of their virtualenvs, all of their condas, etc. Over time the defaults have changed so Pythons aren't looking into the site-packages directories, but I wasted a long time figuring out why a team of data scientists couldn't get anything to work reliably (see the sketch after this list).

(iii) "python" built into Linux by default. People expected Python to "just work" but it doesn't "just work" when people start installing stuff with pip because you might be working on one thing that needs one package and another thing that needs another package and you could trash everything you're doing with python in the process of trying to fix it.

Unfortunately Python has attracted a lot of sloppy programmers who think virtualenv is too much work and that it's totally normal for everything to be broken all the time. The average data scientist doesn't get excited when it crumbles and breaks, but you can't just call up some flakes to fix it. [1]

[1] https://www.youtube.com/watch?v=tiQPkfS2Q84

whatever1

I don't exactly remember the situation, but a user created a Python module named error.py.

Then in their main code they imported said error.py, but unfortunately the numpy library also has an error.py, so the user was getting very funky behavior.

PaulHoule

... it's tricky. In Java there's a cultural expectation that you name a package like

  package organization.dns.name.this.and.that;
but real scalability in a module system requires that somebody else packages things up as

  package this.and.that;
and you can make the system look at a particular wheel/jar/whatever and make it visible with a prefix you specify like

  package their.this.and.that;
Programmers seem to hate rigorous namespace systems, though. In my first year programming Java (before JDK 1.0), the web site that properly documented how to use Java packages was at NASA, and you still had people writing Java classes in the default package.

worik

> Python has been cleaning up a number of really lethal problems like

I wish they would stick to semantic versioning tho.

I have used two projects that got stuck on incompatible changes within Python 3.x.

That is a fatal problem for Python. If a change in a minor version makes things stop working, it is very hard to recommend the system. A lot of this Python user's work has gone down the drain trying to work around that.

RobinL

I've only recently started with uv, but this is one thing it seems to solve nicely. I've tried to get into the mindset of only using uv for python stuff - and hence I haven't installed python using homebrew, only uv.

You basically need to just remember to never call python directly. Instead use uv run and uv pip install. That ensures you're always using the uv installed python and/or a venv.

Python-based tools where you may want a global install (say, ruff) can be installed using uv tool.
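
e.g.:

  $> uv run python            # REPL via the uv-managed interpreter
  $> uv python install 3.12   # fetch an interpreter without touching homebrew
  $> uv tool install ruff     # global tool, kept isolated from your projects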

rented_mule

> Python based tools where you may want a global install (say ruff) can be installed using uv tool

uv itself is the only Python tool I install globally now, and it's a self-contained binary that doesn't rely on Python. ruff is also self-contained, but I install tools like ruff (and Python itself) into each project's virtual environment using uv. This has nice benefits. For example, automated tests that include linting with ruff do not suddenly fail because the system-wide ruff was updated to a version that changes rules (or different versions are on different machines). Version pinning gets applied to tooling just as it does to packages. I can then upgrade tools when I know it's a good time to deal with potentially breaking changes. And one project doesn't hold back the rest. Once things are working, they work on all machines that use the same project repo.
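
In uv terms that's roughly (the exact version pin is just an example):

  $> uv add --dev ruff==0.6.9   # pin the linter like any other dependency
  $> uv run ruff check .        # CI and teammates always get the pinned version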

If I want to use Python based tools outside of projects, I now do little shell scripts. For example, my /usr/local/bin/wormhole looks like this:

  #!/bin/sh
  uvx \
      --quiet \
      --prerelease disallow \
      --python-preference only-managed \
      --from magic-wormhole \
      wormhole "$@"

TZubiri

Because Node.js isn't a dependency of the operating system.

Also, we don't have a left-pad-scale dependency ecosystem that makes version conflicts such a pressing issue.

sgarland

This. IME, JS devs rarely have much experience with an OS, let alone Linux, and forget that Python literally runs parts of the OS. You can’t just break it, because people might have critical scripts that depend on the current behavior.

cdaringe

Hot take: pnpm is the best dx, of all p/l dep toolchains, for devs who are operating regularly in many projects.

Get me the deps this project needs, get them fast, get them correctly, all with minimum hoops.

Cargo and deno toolchains are pretty good too.

Opam, gleam, mvn/gradle, stack, npm/yarn, nix even, pip/poetry/whatever-python-malarkey, go, composer, …what other stuff have I used in the past 12 months… C/C++ doesn't really have a first-class standard other than global sys deps (so I'll refer back to nix or OS package managers).

Getting the stuff you need where you need it is always doable. Some toolchains are just above and beyond, batteries included, ready for productivity.

krashidov

Have you used bun? It's also great. Super fast

mdaniel

I swear I'm not trolling: what do you not like about modern golang's dep management (e.g. go.mod and go.sum)?

I agree that the old days of "there are 15 dep managers, good luck" were high chaos. And those who do cutesy shit like using "replace" in their go.mod[1] are sus, but as far as dx goes, I think $(go get) that caches by default in $XDG_CACHE_DIR and uses $GOPROXY is great.

1: https://github.com/opentofu/opentofu/blob/v1.9.0/go.mod#L271

theogravity

pnpm is the best for monorepos. I've tried yarn workspaces and npm's idea of them, and nothing comes close to the DX of pnpm.

paularmstrong

What about pnpm, actually, as an end user, is better than Yarn? I've never found an advantage with pnpm in all the times I've tried it. They seem very 1:1 to me, but Yarn edges it out thanks to its plugin system and its ability to automatically pull `@types/` packages when needed.

Etheryte

I wouldn't really say it's that black and white. Until quite recently, many large libraries and tools recommended starting with "npm i -g ...". Of course you could avoid it if you knew better, but the same is true for Python.

miohtama

It is kind of solved, but not by default.

This makes a big difference. There is also the social problem of a Python community whose opinions are too loud to settle on a good, robust default solution.

But the same has now happened for Node with npm, yarn, and pnpm.

zo1

Half the time something breaks in a javascript repo or project, every single damn javascript expert in the team/company tells me to troubleshoot using the below sequence, as if throwing spaghetti on a wall with no idea what's wrong.

Run npm install

Delete node_modules and wait 30 minutes, because it takes forever to delete 500MB worth of 2 million files.

Do an npm install again (or yarn install or that third one that popped up recently?)

Uninstall/Upgrade npm (or is it Node? No wait, npx I think. Oh well, used to be node + npm, now it's something different.)

Then do steps 1 to 3 again, just in case.

Hmm, maybe it's the lockfile? Delete it, one of the juniors pushed their version to the repo without compiling maybe. (Someone forgot to add it to the gitignore file?)

Okay, do steps 1 to 3 again, that might have fixed it.

If you've gotten here, you are royally screwed and should try the next javascript expert, he might have seen your error before.

So no, I'm a bit snarky here, but the JS ecosystem is a clustermess of chaos and should rather fix its own stuff first. I have none of the above issues with Python, a proper IDE, and out-of-the-box pip.

cdaringe

The pain is real. Most of the issues are navigable, but often take careful thought versus some canned recipe. npm or yarn in large projects can be a nightmare. starting with pnpm makes it a dream. Sometimes migrating to pnpm can be rough, because projects that work may rely on incorrect, transitive, undeclared deps actually resolving. Anyway, starting from pnpm generally resolves this sort of chaos.

Most package managers are developed.

Pnpm is engineered.

It’s one of the few projects I donate to on GitHub

wiseowise

What kind of amateurs are you working with? I'm not a Node.js dev and even I know about the npm ci command.

mirekrusin

Sounds like a tale from a decade ago, people now use things like pnpm and tsx.

supakeen

Activating the virtualenv is unnecessary (one can execute pip/python directly from it), and configuring your local pyenv interpreter is also unnecessary; it can create a virtual environment directly:

  pyenv virtualenv python3.12 .venv
  .venv/bin/python -m pip install pandas
  .venv/bin/python
Not quite one command, but a bit more streamlined, I guess.

BeeOnRope

Note that in general, calling the venv python directly and activating the venv are not equivalent.

E.g. if the thing you run invokes python itself, it will use the system python, not the venv one in the first case.

astronautas

Indeed, you're right ;).

lukax

uv also bundles the uvx command, so you can run Python tools without installing them manually:

  uvx --from 'huggingface_hub[cli]' huggingface-cli

oulipo

And there's also `uv run script.py`, where you can have dependencies indicated as comments in the script; see e.g. https://simonwillison.net/2024/Dec/19/one-shot-python-tools/

greatgib

Ridiculous post:

The author says that the normal route would be:

   - Take the proper route:
   - Create a virtual environment
   - pip install pandas
   - Activate the virtual environment
   - Run python
Basically, out of the box, when you create a virtualenv it is immediately activated. And you would obviously need to have it activated before doing a pip install...

In addition, in my opinion, the thing that would suck about uv is having different functions tied to a single tool execution.

It is a breeze to be able to activate a venv and be done with it: being able to run your program multiple times in one go, even with crashes, to install more dependencies, to test it in the REPL, ...

hamandcheese

You can still use traditional venvs with uv though, if you want.

emiller88

There are so many more!

1. `uvx --from git+https://github.com/httpie/cli httpie`

2. uv in a shebang: https://simonwillison.net/2024/Aug/21/usrbinenv-uv-run/

FergusArgyll

Yes! Since that Simon Willison article, I've slowly been easing all my scripts into just using a uv shebang, and it rocks! I've deleted all sorts of .venvs and whatnot. Really useful.

throwup238

The uv shebang is definitely the killer feature for me, especially with so much of the AI ecosystem tied up in Python. Before, writing Python scripts was a lot more painful requiring either a global scripts venv and shell scripts to bootstrap them, or a venv per script.

I’m sure it was already possible with shebangs and venv before, but uv really brings the whole experience together for me so I can write python scripts as freely as bash ones.

dingdingdang

Super neat re: the Willison article.. would something like this work under PowerShell though?!

mgd020

What's the point if you have other binary dependencies?

Use Nix for the Python version as well as other binary deps, and virtualenv + pip-tools for correct package dependency resolution.

Waiting 4s for pip-tools instead of 1ms for uv doesn't change much if you only run it once a month.

amelius

Sometimes, only a specific wheel is available (e.g. on Nvidia's Jetson platform where versions are dictated by the vendor).

Can uv work with that?

tasn

OK, I'm convinced. I just installed uv. Thanks for sharing!

smallmancontrov

Ditto. This is pretty cool!