
uv: An extremely fast Python package and project manager, written in Rust

acheong08

Just a few months back I said I would never use uv. I was already used to venv and pip. No need for another tool, I thought.

I now use uv for everything Python. The reason for the switch was a shared server where I did not have root and there were all sorts of broken packages/drivers and I needed pytorch. Nothing was working and pip was taking ages. Each user had 10GB of storage allocated and pip's cache was taking up a ton of space & not letting me change the location properly. Switched to uv and everything just worked

If you're still holding out, really just spend 5 minutes trying it out, you won't regret it.

tomjakubowski

The absolute killer feature for me of uv is that it's still compatible with all of my old venv-based workflows. Just run `uv venv`.

tetha

For me, the big key was: uv is so much easier to explain and especially use - especially for people who sometimes script something in python and don't do this daily.

pip + config file + venv requires you to remember a couple of steps to get the right venv: create one and install stuff into it, and then for each test run, script execution and such, you need to remember a weird shebang format, or to activate the venv. And the error messages don't help. I don't think they could help, as this setup is not standardized or blessed. You just have to beat the connection between ImportErrors and venvs into your brain.

It's workable, but teaching this to people unfamiliar with it has reminded me how... squirrely the whole tooling can be, for lack of a better word.

Now, team members need to remember "uv run", "uv add" and "uv sync". It makes the whole thing so much easier and less intimidating to them.
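For the unfamiliar, the day-to-day loop is roughly this (a sketch of the common commands; see uv's docs for the details):

  uv init myproject         # create a pyproject.toml
  uv add requests           # add a dependency; the venv appears automatically
  uv run python main.py     # run inside the managed env, no activation step
  uv sync                   # recreate the env from the lockfile elsewhere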

psychoslave

I wonder how it compares with something more generalist like "mise", to which I migrated after using "asdf" for some time.

codethief

Similarly to the sibling I also use both. I let mise manage my uv version (and other tools) and let uv handle Python + PyPI Packages for me. Works great!

There's also some additional integration which I haven't tried yet: https://mise.jdx.dev/mise-cookbook/python.html#mise-uv

wrboyce

I use both! uv installed globally with mise, and uv tools can then be managed via “mise use -g pipx:foo”.

yjftsjthsd-h

> Each user had 10GB of storage allocated and pip's cache was taking up a ton of space & not letting me change the location properly. Switched to uv and everything just worked

Is it better about storage use? (And if so, how? Is it just good at sharing what can be shared?)

fsh

uv hardlinks identical packages, so adding virtual envs takes up very little space.
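You can verify the sharing yourself, since hardlinked files report the same inode (the paths and package name here are purely illustrative):

  # identical inode numbers mean the bytes are stored once on disk
  ls -i venv-a/lib/python3.12/site-packages/foo/__init__.py
  ls -i venv-b/lib/python3.12/site-packages/foo/__init__.py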

snerbles

Unless you cross mount points, which uv will helpfully warn about.

acheong08

Both pip and uv cache packages to ~/.cache. uv lets you change it to /tmp and symlink instead of copying.
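If you want to try that, uv exposes both knobs (env var names per uv's docs; check your version):

  export UV_CACHE_DIR=/tmp/uv-cache   # move the cache off the quota'd home dir
  export UV_LINK_MODE=symlink         # link from the cache instead of copying
  uv cache dir                        # confirm the active cache location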

kissgyorgy

There is a global cache for all installed packages in the user home cache dir.

oofbey

I love uv. The one gotcha I'll warn people about is: don't touch uvx. I've lost an embarrassing number of hours or days trying to figure out why nothing works properly or makes sense when I tried to run things with uvx. I guess I understand why it's there, but I think it's a built-in foot-gun and not well documented. But if you stay away from it, things work great.

mistrial9

similar story recently with an experimental repo that starts with "it's so easy, just `$ uv a b c`"... under the hood it implies a lot of redundancies? but true enough, it worked fine and trouble-free too, on a standard GNU/Debian/Ubuntu host

_vya7

I remember using pip and venv back in like 2009. Last time I checked, maybe 5 or 10 years ago, the recommendation of the community was generally to just use Docker instead of all these tools. Did that not catch on?

unclad5968

The advice seems to change every year. For a while it was venv, then pipenv, poetry, docker, and now uv. Maybe the ecosystem will settle on that but who knows.

ed_elliott_asc

I came here to comment that I don't see any reason to bother. Thanks for the comment, I will try it now!

polivier

The first time I used `uv`, I was sure that I had made a mistake or typed something wrong because the process finished so much more quickly than anything I had ever experienced with `pip`.

tux3

I've sometimes had uv take up to 200ms to install packages, so you could feel a slight delay between pressing enter and the next shell prompt

You don't have that problem with Poetry. You go make a cup of coffee for a couple minutes, and it's usually done when you come back.

Numerlor

It's funny; the exact same thing was probably said about pipenv and poetry.

baby

Same here lol! The experience is so smooth it doesn't feel like python

johnfn

That makes sense, because it's Rust. :)

augustflanagan

I just had this same experience last week, and was certain it wasn’t working correctly as well. I’m a convert.

nialse

Likewise. I was skeptical, then I tried it and won’t go back.

theLiminator

uv and ruff are a great counterexample to all those people who say "never reinvent the wheel". Don't ever do it just for the sake of doing it, but if you have focused goals you can sometimes produce a product that's an order of magnitude better.

CrendKing

I believe most of the time this phrase is said to an inexperienced artisan who has no idea how the current system works, what its shortcomings are, and how to improve upon it. Think of an undergraduate student who tries to solve the Goldbach conjecture. Usually what ends up happening is that they either fail to reinvent the wheel, or reinvent the exact same wheel, which has no value. The phrase certainly does not apply to professionals.

eviks

They didn't reinvent the wheel, "just" replaced all the wood with more durable materials to make it handle rotation at 10 times the speed

socalgal2

I'd be curious to know exactly what changed. Python -> Rust won't make network downloads faster nor file I/O faster. My naive guess is that all the speed comes from choosing better algorithms and/or parallelizing things. Not from Python vs Rust (though if it's hard to parallelize in Python and easy in rust that would certainly make a difference)

ekidd

I've translated code from Ruby to Rust, and other code from Python to Rust.

Rust's speed advantages typically come from one of a few places:

1. Fast start-up times, thanks to pre-compiled native binaries.

2. Large amounts of CPU-level concurrency with many fewer bugs. I'm willing to do ridiculous threading tricks in Rust I wouldn't dare try in C++.

3. Much lower levels of malloc/free in Rust compared to some high-level languages, especially if you're willing to work a little for it. Calling malloc in a multithreaded system is basically like watching the Millennium Falcon's hyperdrive fail. Also, Rust encourages abusing the stack to a ridiculous degree, which further reduces allocation. It's hard to "invisibly" call malloc in Rust, even compared to a language like C++.

4. For better or worse, Rust exposes a lot of the machinery behind memory layout and passing references. This means there's a permanent "Rust tax" where you ask yourself "Do I pass this by value or reference? Who owns this, and who just borrows it?" But the payoff for that work is good memory locality.

So if you put in a modest amount of effort, it's fairly easy to make Rust run surprisingly fast. It's not an absolute guarantee, and there are a couple of traps for the unwary (like accidentally forgetting to buffer I/O, or benchmarking debug binaries).

the8472

NVMe hungers; keeping it fed is hard work. Doing a serial read, decompress, checksum, write loop will leave it starved (QD<1) whenever you're doing anything but the last step. Disk IO isn't async unless you use io_uring (well, OK, writeback caches can be). So threads are almost a must to keep NVMe busy. Conversely, waiting for blocking IO (e.g. directory enumeration) will keep your CPU starved. Here too the answer is more threads.

captnswing

Extremely interesting presentation from Charlie Marsh about all the optimizations https://youtu.be/gSKTfG1GXYQ?si=CTc2EwQptMmKxBwG

jerpint

From just my observations they basically parallelized the install sequence instead of having it be sequential (among many other optimizations most likely)

physicsguy

The package resolution is a big part of it; it's effectively a constraint solver. I.e. package A requires package B constrained to 1.0 < X <= 2.x, package B requires package C between... and so on and so on.

Conda rewrote their package resolver for similar reasons.
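To see why that's a search problem, here is a toy backtracking resolver in Python (the index and package names are made up, and this is just the shape of the problem; uv's real, PubGrub-based solver is far more sophisticated):

  # Each (package, version) constrains the versions of its dependencies; the
  # resolver must pick one version per package consistent with every pick.
  INDEX = {
      "app": {"1.0": [("web", {"1.0", "2.0"}), ("db", {"2.0"})]},
      "web": {"2.0": [("db", {"1.0", "2.0"})], "1.0": [("db", {"1.0"})]},
      "db":  {"2.0": [], "1.0": []},
  }

  def resolve(pending, chosen):
      if not pending:
          return chosen                      # every requirement satisfied
      (pkg, allowed), rest = pending[0], pending[1:]
      if pkg in chosen:                      # already pinned: pin must fit
          return resolve(rest, chosen) if chosen[pkg] in allowed else None
      for version, deps in INDEX[pkg].items():
          if version in allowed:             # try this candidate version
              result = resolve(rest + deps, {**chosen, pkg: version})
              if result is not None:
                  return result
      return None                            # dead end: caller backtracks

  print(resolve([("app", {"1.0"})], {}))
  # -> {'app': '1.0', 'web': '2.0', 'db': '2.0'}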

globular-toast

There is a talk about it from one of the authors here: https://www.youtube.com/watch?v=gSKTfG1GXYQ

tl;dw Rust, a fast SAT solver, micro-optimisation of key components, caching, and hardlinks/CoW.

jerf

It became a bit of a meme, especially in the web development space, that all programs are always waiting on external resources like networks, databases, disks, etc., and so scripting languages being slower than other languages doesn't matter and they'll always be as fast as non-scripting languages.

Even on a single core, this turns out to be simply false. It isn't that hard to either A: be doing enough actual computation that faster languages are in fact perceptibly faster, even, yes, in a web page handler or other such supposedly-blocked computation or B: without realizing it, have stacked up so many expensive abstractions on top of each other in your scripting language that you're multiplying the off-the-top 40x-ish slower with another set of multiplicative penalties that can take you into effectively arbitrarily-slower computations.

If you've never profiled a mature scripting language program, it's worth your time. Especially if nobody on your team has ever profiled it before. It can be an eye-opener.
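In Python, a first pass is one flag away; cProfile ships in the standard library:

  python -m cProfile -s cumulative yourscript.py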

Then it turns out that for historical path reasons, dynamic scripting languages are also really bad at multithreading and using multiple cores, and if you can write a program that can leverage that you can just blow away the dynamic scripting languages. It's not even hard... it pretty much just happens.

(I say historical path reasons because I don't think an inability to multithread is intrinsic to the dynamic scripting languages. It's just they all came out in an era when they could assume single core, it got ingrained into them for a couple of decades, and the reality is, it's never going to come fully out. I think someone could build a new dynamic language that threaded properly from the beginning, though.)

You really can see big gains just taking a dynamic scripting language program and turning it into a compiled language with no major changes to the algorithms. The 40x-ish penalty off the top is often in practice an underestimate, because that number is generally from highly optimized benchmarks in which the dynamic language implementation is highly tuned to avoid expensive operations; real code that takes advantage of all the conveniences and indirection and such can have even larger gaps.

This is not to say that dynamic scripting languages are bad. Performance is not the only thing that matters. They are quite obviously fast enough for a wide variety of tasks, by the strongest possible proof of that statement. That said, I think it is the case that there are a lot of programmers who have no idea how much performance they are losing in dynamic scripting languages, which can result in suboptimal engineering decisions. It is completely possible to replace a dynamic scripting language program with a compiled one and possibly see 100x+ performance improvements on very realistic code, before adding in multithreading. It is hard for that not to manifest in some sort of user experience improvement. My pitch here is not to give up dynamic scripting languages, but to have a more realistic view of the programming language landscape as a whole.

doug_durham

A big part of the "magic" is that there is a team of paid professionals maintaining and improving it. That's more important than it being written in Rust. If uv were forked it would devolve to the level of pip over time.

0cf8612b2e1e

The history of Python package management is clear that everyone thinks they can do a better job than the status quo.

psunavy03

In this case, they were right.

nickelpro

uv is purely a performance improvement, it changes nothing about the mechanics of Python environment management or packaging.

The improvements came from lots of work and consensus-building across the entire Python build-system ecosystem.

0cf8612b2e1e

Disagree, in that uv makes switching out the underlying interpreter so straightforward. It becomes trivial to swap from, say, 3.11 to 3.12. The pybi idea.

Sure, other tools could handle the situation, but being baked into the tooling makes it much easier to bootstrap different configurations.
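For example, with uv's interpreter management (real subcommands, though check the docs for your version):

  uv python install 3.12   # download a managed CPython build
  uv python pin 3.12       # pin the project; subsequent `uv run` uses 3.12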

globular-toast

Actually not true. One of the main differences with uv is you don't have to think about venvs any more. There's a talk about it from one of the authors at a recent PyCon here: https://www.youtube.com/watch?v=CV8KRvWKYDw (not the same talk I linked elsewhere in the thread).

henry700

Of course they do, this tends to happen when the history is it being hot flaming garbage.

mort96

Honestly "don't reinvent the wheel" makes absolutely no sense as a saying. We're not still all using wooden discs as wheels, we have invented much better wheels since the neolithic. Why shouldn't we do the same with software?

simonw

When asked why he had invented JSON when XML already existed, Douglas Crockford said:

The good thing about reinventing the wheel is that you can get a round one.

https://scripting.wordpress.com/2006/12/20/scripting-news-fo...

idle_zealot

You can get a round one. Or you can make yet another wonky shaped one to add to the collection, as ended up being the case with JSON.

haiku2077

Right, wheels are reinvented every few years. Compare tires of today to the ones 20 years ago and the technology and capability is very different, even though they look identical to a casual eye.

My primary vehicle has off-road capable tires that offer as much grip as a road-only tire would have 20-25 years ago, thanks to technology allowing Michelin to reinvent what a dual-purpose tire can be!

nightpool

> Compare tires of today to the ones 20 years ago and the technology and capability is very different, even though they look identical to a casual eye

Can you share more about this? What has changed between tires of 2005 and 2025?

aalimov_

I always took this saying as meaning that we don’t re-invent the concept of the wheel. For example the Boring company and Tesla hoping to reinvent the concept of the bus/train.. (iirc your car goes underground on some tracks and you get to bypass traffic and not worry about steering)

A metal wheel is still just a wheel. A faster package manager is still just a package manager.

haiku2077

That's not how I've ever seen it used in practice. People use it to mean "don't build a replacement for anything functional."

jjtheblunt

> an order of magnitude better

off topic, but I wonder why that phrase gets used rather than "10x", which is much shorter.

BeetleB

Short answer: Because the base may not be 10.

Long answer: Because if you put a number, people expect it to be accurate. If it was 6x faster, and you said 10x, people may call you out on it.

screye

It's meant to signify a step change. Order of magnitude change = no amount of incremental changes would make up for it.

In common conversation, the multiplier can vary from 2x to 10x. In the context of some algorithms, order of magnitude can be over the delta rather than absolutes, e.g. an algorithm sees a 1.1x improvement over the previous 10 years; a change that delivers a 1.1x improvement by itself overshadows an order-of-magnitude more effort.

For salaries, I've used order-of-magnitude to mean 2x. Good way to show a step change in a person's perceived value in the market.

bxparks

I think of "an order of magnitude" as a log scale. It means somewhere between 3.16X and 31.6X.

jjtheblunt

yeah, that's what I meant with 10x: like it's +1 on the exponent, if the base is 10. but I'm guessing what others are thinking, hence the question.

fkyoureadthedoc

- sounds cooler

- 10x is a meme

- what if it's 12x better

bmacho

Because it's not 10x?

Scene_Cast2

10x is too precise.

chuckadams

Because "magnitude" has cool gravitas, something in how it's pronounced. And it's not meant to be precise, it just means "a whole lot more".

refulgentis

"10x" has been cheapened / heard enough / de facto, is a more general statement than a literal interpretation would indicate. (i.e. 10x engineer. Don't hear that much around these parts these days)

Order of magnitude faces less of that baggage, until it does :)

psunavy03

Would you say it faces . . . orders of magnitude less baggage?

larkost

Just a warning in case others run into it: on very anemic systems (e.g.: AWS T2.micro running Windows, yes... I know...) uv will try to open too many simultaneous downloads, overloading things, resulting in timeouts.

You can use the env variable UV_CONCURRENT_DOWNLOADS to limit this. In my case it needed to be 1 or 2. Anything else would cause timeouts.
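For example, set it inline for a single run:

  UV_CONCURRENT_DOWNLOADS=2 uv sync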

An extreme case, I know, but I think that uv is too aggressive here (a download thread for every module); it should use aggregate speeds from each source server as a way of auto-tuning per-server threading.

ehsankia

Not extreme at all. A lot of people use the cheapest, smallest VPS for their hobby work. I know I do (albeit not AWS). Thanks for sharing; hope they improve the automatic detection there.

mh-

Started using this recently for personal stuff on my laptop. When you're used to pip, it's just confusingly fast. More than once I thought maybe it didn't work because it returned too quickly.

leonheld

I adore the

  uv add <mydependencies> --script mycoolscript.py
And then shoving

  #!/usr/bin/env -S uv run
on top so I can run Python scripts easily. It's great!

simonw

I built a Claude Project with special instructions just teaching it how to do this, which means it can output full scripts for me with inline dependencies based on a single prompt: https://simonwillison.net/2024/Dec/19/one-shot-python-tools/

Claude 4's training cutoff date is March 2025 though, I just checked and it turns out Claude Sonnet 4 can do this without needing any extra instructions:

  Python script using uv and inline script dependencies
  where I can give it a URL and it scrapes it with httpx
  and beautifulsoup and returns a CSV of all links on
  the page - their URLs and their link text
Here's the output, it did the right thing with regards to those dependencies: https://claude.ai/share/57d5c886-d5d3-4a9b-901f-27a3667a8581

sunaookami

Using your system instructions for uv for every LLM now since first seeing your post last year, thanks! It's insanely helpful just asking e.g. Claude to give me a python script for XYZ and just using "uv run". I also added:

  If you need to run these scripts, use "uv run script-name.py". It will automatically install the dependencies. Stdlibs don't need to be specified in the dependencies array.
since e.g. Cursor often gets confused because the dependencies are not installed and it doesn't know how to start the script. The last sentence is for when LLMs get confused and want to add "json", for example, to the dependency array.

varunneal

Claude Sonnet typically forgets about uv script syntax in my experience. I usually find myself having to paste in the docs every time. By default it wants to use uv project syntax.

jsilence

Using this trick with Marimo.io notebooks in app-mode.

Instant reactive reproducible app that can be sent to others with minimal prerequisites (only uv needs to be installed).

Such a hot combo.

intellectronica

It's so cool. I now habitually vibe-code little scripts that I can immediately run. So much nicer than having to manage environments and dependencies:

- https://everything.intellectronica.net/p/the-little-scripter

- https://www.youtube.com/watch?v=8LB7e2tKWoI

- https://github.com/intellectronica/ez-mcp

kristjansson

e: I misread your example, disregard below irrelevant pattern matching of 'uv add --script' to 'uv add' in the project context!

~~That mutates the project/env in your cwd. They have a lot in their docs, but I think you’d like run --with or uv’s PEP723 support a lot more~~

https://docs.astral.sh/uv/guides/scripts/

misnome

PEP723 support is exactly what the poster is using?

kristjansson

Ach, missed the --script, thanks.

eats_indigo

Love UV!

Also love Ruff from the Astral team. We just cut our linting + formatting over from pylint + Black to Ruff.

Saw lint times drop from 90 seconds to < 1.5 seconds. Crazy stuff.

greatgib

Until the moment you realize that ruff performs only a part of pylint's checks, and that very obvious mistakes can slip through easily, like code that can't run because of an obvious error.

nrvn

this is my new fav for running small executable scripts:

  #!/usr/bin/env -S uv --quiet run --script
  # /// script
  # requires-python = ">=3.13"
  # dependencies = [
  #     "python-dateutil",
  # ]
  # ///
  #
  # [python script that needs dateutil]
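A minimal body for that last placeholder, just to make it concrete (the dateutil usage is an assumed example):

  from datetime import date
  from dateutil.relativedelta import relativedelta

  print(date.today() + relativedelta(months=3))  # e.g. the date three months out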

mdeeks

I really wish that hashbang line was something way WAY easier to remember like `#!/usr/bin/env uvx`. I have to look this up every single time I do it.

PufPufPuf

Sadly, hashbangs are technically limited in two ways: 1) they support only absolute paths, making it necessary to use /usr/bin/env, which lives in a standardized location, to look up the uv binary; 2) they support only a single argument (everything after the space is passed as one string, not parsed into multiple args like a shell would), making it necessary to use -S to "S"plit the arguments. That's a feature of env itself, added for this very use case.

So there isn't really much to do to make it simpler.
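A quick way to see the splitting (GNU env's -S, the same mechanism the shebang relies on):

  $ env -S 'echo one two' three
  one two three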

mdeeks

I wasn't really referring to env. I meant change the behavior of uvx. If the first argument passed to uvx is a file path, then execute it exactly the same way as `uv --quiet run --script` does.

Or maybe create a second binary or symlink called `uvs` (short for uv script) that does the same thing.
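That second option is easy to approximate today as a tiny shell wrapper on PATH (the name `uvs` is the poster's hypothetical):

  #!/bin/sh
  # uvs: run a PEP 723 inline-metadata script via uv, forwarding all args
  exec uv --quiet run --script "$@"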

esafak

It's not just the program that's fast, but the founder too. I reported an issue today and he fixed it right away.

pu_pe

Tried uv a while ago and I was shocked by how fast and easy it is to use. There's basically no reason to use pip anymore, and if you're using only Python there's basically no reason to use conda either.

oceansky

It seems to make pyenv and poetry droppable too.

findalex

and pipx.

samsartor

I'm in ML-land. I thought we were all hopelessly tied to conda. But I moved all my own projects to uv effortlessly and have never looked back. Now the first thing I do when pulling down another researcher's code is add a pyproject.toml (if they don't have one), `uv add -r`, and `uv run` off into the sunset. I especially like how good uv is with non-PyPI-published dependencies: GitHub, dumb folders, internal repos, etc.

6ak74rfy

UV is fast, like FAST. Plus, it removes the need for pyenv (for managing different Python versions) and pip for me. Plus, no need to activate env or anything, `uv run ...` automatically runs your code through the env.

It's nice software.

nomel

> Plus, it removes the need for pyenv

I don't see a way to change current and global versions of python/venvs to run scripts, so that when I type "python" it uses that, without making an alias.

zbentley

Two options other than aliases:

1. Put this in a file called "python" early in your PATH:

    #!/bin/sh
    exec uv run python "$@"
2. Hand-modify your path:

    export PATH="$(uv run python -c 'import os; import sys; print(os.path.dirname(sys.executable))'):$PATH"

adamckay

If they're your scripts (i.e. you're writing/editing them) then you can declare dependencies following the PEP 723 format and uv will respect that.

https://docs.astral.sh/uv/guides/scripts/#declaring-script-d...
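uv can even write that metadata block for you (the `--script` flag is the one the linked docs describe; `example.py` is a placeholder name):

  uv add --script example.py requests   # writes a PEP 723 block into example.py
  uv run example.py                     # deps resolved from that inline block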

nomel

> uv run example.py

I specifically want to run "python", rather than subcommands of some other command, since I often want to pass arguments to the Python interpreter itself, along with my script.