
Uv is the best thing to happen to the Python ecosystem in a decade

hardwaregeek

I gotta say, I feel pretty vindicated after hearing for years how Python’s tooling was just fine and you should just use virtualenv with pip and how JS must be worse, that when Python devs finally get a taste of npm/cargo/bundler in their ecosystem, they freaking love it. Because yes, npm has its issues but lock files and consistent installs are amazing

caconym_

There is nothing I dread more within the general context of software development, broadly, than trying to run other people's Python projects. Nothing. It's shocking that it has been so bad for so long.

hardwaregeek

Never underestimate cultural momentum I guess. NBA players shot long 2 pointers for decades before people realized 3 > 2. Doctors refused to wash their hands before doing procedures. There’s so many things that seem obvious in retrospect but took a long time to become accepted

acomjean

You aren’t kidding. Especially if it’s some bioinformatics software that is just hanging out there on GitHub older than a year…

Multicomp

I agree with you wholeheartedly. Setting aside my preference against dynamic programming languages, I would in the past have given Python more of a look because of its low barrier to entry... but I have been repulsed by how horrific the development UX story has been, and how incredibly painful it is to then distribute the code in a portable-ish way.

uv is making me give Python a chance for the first time since a Ren'Py project I did for fun in 2015.

lynndotpy

I was into Python enough that I put it into my username but this is also my experience. I have had quasi-nightmares about just the bog of installing a Python project.

RobertoG

pfff... "other people's projects"... I was not even able to run my own projects until I started using Conda.

mk89

I have used

  pip freeze > requirements.txt
  pip install -r requirements.txt

way before official lockfiles existed.

Your requirements.txt becomes a lockfile, as long as you accept not using version ranges.
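
For illustration, the difference (package names and versions made up):

  # pinned, lockfile-style: every install resolves identically
  requests==2.31.0
  urllib3==2.0.7

  # ranged: the resolver may pick something different tomorrow
  requests>=2.28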

Having this in a single tool and so on, sure, why not, but I don't understand the hype when it was basically already there.

icedchai

That works for simple cases. Now, update a transitive dependency used by more than one dependency. You might get lucky and it'll just work.
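
A sketch of the failure mode with plain pip (urllib3 standing in for any shared transitive dependency):

  pip install -U urllib3   # upgrades in place, may violate other packages' constraints
  pip check                # only now do you find out what broke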

mk89

Not sure how uv helps here, because I am not very familiar with it.

With pip you update a dependency: it won't work if the versions aren't compatible, it will if they are. Not sure where the issue is?

auraham

Can you elaborate on this? How is npm/cargo/etc better than pip in this regard?

morkalork

I remember advocating for running nightly tests on every project/service I worked on, because inevitably one night one of the transitive dependencies would update and shit would break. And at least the nightly test forced it to break early, vs. when you needed to do something else, like an emergency bug fix, and ran into it then.

bdangubic

it won’t work of course, no one is that lucky :)

2wrist

It also manages the runtime, so you can pin a specific runtime to a project. It is very useful and worth investigating.
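
A minimal sketch, assuming uv's `python pin` subcommand:

  uv python pin 3.12   # writes .python-version; uv run then uses that interpreter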

mk89

I think it's a great modern tool, don't get me wrong.

But the main reason shouldn't be the "lockfile". I was replying to the parent comment mainly for that particular thing.

rtpg

As a “pip is mostly fine” person, we would direct the result to a new lock file, so you could still have your direct deps and then pin transitives and update.

Pip's solver could still cause problems in general on changes.

uv having a better solver is nice. Being fast is also nice. Mainly though, it feeling like a tool that is maintained and can be improved upon without ripping one's hair out is a godsend.

ifwinterco

It is indeed fairly simple to implement it, which is why it's so weird that it's never been implemented at a language level

epage

Good luck if you need cross-platform `requirements.txt` files.

mk89

This is a good use case. Not sure how this is typically solved, I guess "requirements-os-version.txt"? A bit redundant and repetitive.

I would probably use something like this: https://stackoverflow.com/questions/17803829/how-to-customiz...
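
PEP 508 environment markers are another way to keep it to one file (packages illustrative):

  # single requirements.txt that varies by platform
  pywin32==306; sys_platform == "win32"
  uvloop==0.19.0; sys_platform != "win32"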

chrisweekly

Webdev since 1998 here. Tabling the Python vs JS/etc debate to comment on npm per se: pnpm is better than npm in every way. Strongest possible recommendation to use it instead of npm; it's faster, more efficient, safer, and more deterministic. See https://pnpm.io/motivation

Ant59

I've gone all-in on Bun for many of the same reasons. Blazingly fast installs too.

https://bun.sh/

ifwinterco

I think at this point everyone on hacker news with even a passing interest in JS has heard of bun, it's promoted relentlessly

nullbyte

I find pnpm annoying to type, that's why I don't use it

bdangubic

alias it to “p”

anp

Might be worth noting that npm didn't have lock files for quite a long time, which is the era during which I formed my mental model of npm hell. The popularity of yarn (again importing bundler/cargo-isms) seems like maybe the main reason npm isn't as bad as it used to be.

icedchai

poetry gave us lock files and consistent installs for years. uv is much, much faster however.

beeb

I used poetry professionally for a couple of years and hit so many bugs, it was definitely not a smooth experience. Granted that was probably 3-4 years ago.

teekert

I always loved poetry but then I’d always run into that bug where you can’t use repos with authentication. So I’d always go somewhere else eventually.

Some time ago I found out it does work with authentication, but their "counter ASCII animation" just covers the prompt… that bug has been open for years now…

palm-tree

I started using poetry about 4 years ago and definitely hit a lot of bugs around that time, but it seems to have improved considerably. That said, my company has largely moved to uv as it does seem easier to use (particularly for devs coming from other languages).

icedchai

I've occasionally run into performance issues and bugs with dependency resolution / updates. Not so much recently, but at a previous company we had a huge monorepo and I've seen it take forever.

ShakataGaNai

I have to agree that there were a lot of good options, but uv's speed is what sets it apart.

Also the ability to have a single script with deps declared in a TOML header, super easily.

Also Also the ability to use a random python tool in effectively seconds with no faffing about.

rcleveng

and pip-compile before that.

Agree that uv is way, way faster than any of that, and really just a joy to use in its simplicity.

gigatexal

the thing is I never had issues with virtual environments -- uv just allows me to easily determine what version of python that venv uses.

j2kun

you mean you can't just do `venv/bin/python --version`?

shlomo_z

he means "choose", not "check"

globular-toast

I've been using pip-tools for the best part of a decade. uv isn't the first time we got lock files. The main difference with uv is how it abstracts away the virtualenv and you run everything using `uv run` instead, like cargo. But you can still activate the virtualenv if you want. At that point the only difference is it's faster.
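
For reference, the pip-tools flow looks roughly like this (file names are the conventional ones):

  # requirements.in holds just the direct deps, with loose ranges
  pip-compile requirements.in   # resolves and writes a fully pinned requirements.txt
  pip-sync requirements.txt     # makes the active venv match the pins exactly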

pydry

>finally get a taste of npm

good god no thank you.

>cargo

more like it.

internetter

cargo is better than npm, yes, but npm is better than pip (in my experience)

dekhn

I hadn't paid any attention to rust before uv, but since starting to use uv, I've switched a lot of my performance-sensitive code dev to rust (with interfaces to python). These sorts of improvements really do improve my quality of life significantly.

My hope is that conda goes away completely. I run an ML cluster and we have multi-gigabyte conda directories and researchers who can't reproduce anything because just touching an env breaks the world.

embe42

You might be interested in pixi, which is roughly to conda as uv is to pip (also written in Rust, it reuses the uv solver for PyPI packages)

th0ma5

This is something that uv advocates should pay attention to; there are always contexts that need different assumptions, especially with our ever-growing and complex pile of libraries and systems.

whimsicalism

I work professionally in ML and have not had to touch conda in the last 7 years. In an ML cluster, it is hopefully containerized and there is no need for that?

BoredPositron

It's still used in edu and research. Haven't seen it in working environments in quite some time either.

savin-goyal

The topic of managing large dependency chains for ML/AI workloads in a reproducible way has been a deep rabbit hole for us. If you are curious, here is some of our work in the open:

https://docs.metaflow.org/scaling/dependencies

https://outerbounds.com/blog/containerize-with-fast-bakery

kardos

It would be nice indeed if there was a good solution to multi-gigabyte conda directories. Conda has been reproducible in my experience with pinned dependencies in the environment YAML... slow to build, sure, but reproducible.

PaulHoule

I'd argue bzip compression was a mistake for Conda. There was a time when I had Conda packages made for the CUDA libraries so conda could locally install the right version of CUDA for every project, but boy it took forever for Conda to unpack 100MB+ packages.

kardos

It seems they are using zstd now for .conda packages, i.e. bzip is obsoleted, so that should be faster.

gostsamo

As far as I get it, conda is still around because uv is focused on python while conda handles things written in other languages. Unless uv gets much more universal than expected, conda is here to stay.

tempay

There is also pixi (which uses uv for the python side of things) which feels like uv for conda.

jvanderbot

Obligatory: it's not only Rust that would be faster than Python. Rust definitely makes it easy with Cargo, but Go, C, and C++ would all exhibit the performance you are seeing in uv, had it been written in one of those languages.

The curmudgeon in me feels the need to point out that fast, lightweight software has always been possible, it's just becoming easier now with package managers.

LeoPanthera

For single-file Python scripts, which 99% of mine seem to be, you can simplify your life immensely by just putting this at the top of the script:

  #!/usr/bin/env -S uv run --script
  # /// script
  # requires-python = ">=3.11"
  # dependencies = [ "modules", "here" ]
  # ///
The script now works like a standalone executable, and uv will magically install and use the specified modules.
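
Usage is then just (assuming uv is on PATH and the file is saved as script.py):

  chmod +x script.py
  ./script.py   # first run resolves and caches the deps; later runs reuse the cache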

d4mi3n

If I were to put on my security hat, things like this give me shivers. It's one thing if you control the script and specified the dependencies. For any other use-case, you're trusting the script author to not install python dependencies that could be hiding all manner of defects or malicious intent.

This isn't a knock against UV, but more a criticism of dynamic dependency resolution. I'd feel much better about this if UV had a way to whitelist specific dependencies/dependency versions.

chatmasta

If you’re executing a script from an untrusted source, you should be examining it anyway. If it fails to execute because you haven’t installed the correct dependencies, that’s an inconvenience, not a lucky security benefit. You can write a reverse shell in Python with no dependencies and just a few lines of code.

maccard

If that’s your concern you should be auditing the script and the dependencies anyway, whether they’re in a lock file or in the script. It’s just as easy to put malicious stuff in a requirements.txt

theamk

Is there anything new that uv gives you here though?

If you don't care about being ecosystem-compliant (and I am sure malware does not), it's only a few lines of Python to download the code and eval it.

renewiltord

This is true. In fact, if the shebang reads `#!/usr/bin/env python3` I can be absolutely sure that the lines:

    import shutil
    shutil.rmtree('/')
aren't in the file so I don't need to read the code. I only read the code when there are dependencies. This is because I have my security hat on that sorted me into the retard house.

p_l

uv can still be redirected to a private PyPI mirror, which should be mandatory from a security and reliability perspective anyway.

zahlman

As long as your `/usr/bin/env` supports `-S`, yes.

It will install and use distribution packages, to use PyPA's terminology; the term "module" generally refers to a component of an import package. Which is to say: the names you write here must be the names that you would use in a `uv pip install` command, not the names you `import` in the code, although they may align.
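
A classic example where the two names differ (beautifulsoup4 really is the PyPI name for the bs4 import package):

  # /// script
  # dependencies = [ "beautifulsoup4" ]   # distribution name, what you install
  # ///
  import bs4                              # import name, what you import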

This is an ecosystem standard (https://peps.python.org/pep-0723/) and pipx (https://pipx.pypa.io) also supports it.

hugmynutus

> As long as your

Linux coreutils have supported this since 2018 (coreutils 8.30); amusingly, it is the same release that added `cp --reflink`. AFAIK you have to opt out by having `POSIXLY_CORRECT=1` or `POSIX_ME_HARDER=1` or `--pedantic` set in your environment. [1]

FreeBSD has supported this since 2008.

macOS has basically always supported this.

---

1. Amusingly, despite `POSIX_ME_HARDER` not being official, a large swath of coreutils supports it. https://www.gnu.org/prep/standards/html_node/Non_002dGNU-Sta...

moleperson

Why is the ‘-S’ argument to ‘env’ needed? Based on the man page it doesn’t appear to be doing anything useful here, and in practice it doesn’t either.

zahlman

> Based on the man page it doesn’t appear to be doing anything useful here

The man page tells me:

  -S, --split-string=S
         process and split S into separate arguments; used to pass
         multiple arguments on shebang lines
Without that, the system may try to treat the entirety of "uv run --script" as the program name, and fail to find it. Depending on your env implementation and/or your shell, this may not be needed.

See also: https://unix.stackexchange.com/questions/361794

moleperson

Right, I didn’t think about the shebang case being different. Thanks!

Rogach

Without -S, `uv run --script` would be treated as a binary name (including spaces) and you will get an error like "env: ‘uv run --script’: No such file or directory".

-S causes the string to be split on spaces and so the arguments are passed correctly.

globular-toast

You can get uv to generate this and add dependencies to it, rather than writing it yourself.
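
For example, with uv's script subcommands (as documented by uv; file name illustrative):

  uv init --script example.py --python 3.12   # writes the PEP 723 header
  uv add --script example.py requests         # adds to the header's dependencies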

kardos

> uv will magically install and use the specified modules.

As long as you have internet access, and whatever repository it's drawing from is online, and you may get a different version of Python each time, ...

tclancy

And electricity and running water and oh the inconvenience. How is this worse than getting a script file that expects you to install modules?

maccard

If I download a Python project from someone on the same network as me, and they wrote it against a different Python version than mine with a requirements.txt, I need all those things anyway.

dragonwriter

I mean, if you use == constraints instead of >= you can avoid getting different versions, and if you’ve used it (or other things which combined have a superset of the requirements) you might have everything locally in your uv cache, too.
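
A fully pinned header, for example (versions illustrative):

  # /// script
  # requires-python = "==3.12.*"
  # dependencies = [ "requests==2.32.3" ]
  # ///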

But, yes, Python scripts with in-script dependencies plus uv to run them don't change dependency distribution, they just streamline use compared to manually setting up a venv per script.

XorNot

I use this but I hate it.

I want to be able to ship a bundle which needs zero network access to run, but will run.

It is still frustratingly difficult to make portable Python programs.

zmmmmm

    > Instead of
    >
    >   source .venv/bin/activate
    >   python myscript.py
    >
    > you can just do
    >
    >   uv run myscript
This is by far the biggest turn off for me. The whole point of an environment manager is set the environment so that the commands I run work. They need to run natively how they are supposed to when the environment is set, not put through a translation layer.

Side rant: yes I get triggered whenever someone tells me "you can just" do this thing that is actually longer and worse than the original.

dragonwriter

> They need to run natively how they are supposed to when the environment is set, not put through a translation layer.

There is a new standard mechanism for specifying the same things you would specify when setting up a venv with a python version and dependencies, in the header of a single-file script, so that tooling can set up the environment and run the script using only the script file itself as a spec.

uv (and PyPA’s own pipx) support this standard.

> yes I get triggered whenever someone tells me "you can just" do this thing that is actually longer and worse than the original.

"uv run myscript" is neither longer nor worse than separately manually building a venv, activating it, installing dependencies into it, and then running the script.

collinmanderson

> The whole point of an environment manager is set the environment so that the commands I run work. They need to run natively how they are supposed to when the environment is set, not put through a translation layer.

The `uv run` command is an optional shortcut for avoiding needing to activate the virtual environment. I personally don't like the whole "needing to activate an environment" before I can run commands "natively", so I like `uv run`. (Actually for the last 10 years I've had my `./manage.py` auto-set up the virtual environment for me.)

The `uv add` / `uv lock` / `uv sync` commands are still useful without `uv run`.

mborsuk

From what I can tell (just started using uv) it doesn't break the original workflow with the venv, just adds the uv run option as well.

aerhardt

I'm surprised by how much I prefer prepending "uv" to everything instead of activating environments - which is still naturally an option if that's what floats your boat.

I also like how you can manage Python versions very easily with it. Everything feels very "batteries-included" and yet local to the project.

I still haven't used it long enough to tell whether it avoids the inevitable bi-yearly "debug a Python environment day" but it's shown enough promise to adopt it as a standard in all my new projects.

zahlman

> how much I prefer prepending "uv" to everything instead of activating environments

You can also prepend the path to the virtual environment's bin/ (or Scripts/ on Windows). Literally all that "activating an environment" does is to manipulate a few environment variables. Generally, it puts the aforementioned directory on the path, sets $VIRTUAL_ENV to the venv root, configures the prompt (on my system that means modifying $PS1) as a reminder, and sets up whatever's necessary to undo the changes (on my system that means defining a "deactivate" function; others may have a separate explicit script for that).
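
Concretely (paths illustrative):

  # no activation needed; call into the venv directly
  .venv/bin/python myscript.py
  .venv/bin/pip list

  # or do by hand roughly what activate does for the session
  export VIRTUAL_ENV="$PWD/.venv"
  export PATH="$PWD/.venv/bin:$PATH"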

I personally don't like the automatic detection of venvs, or the pressure to put them in a specific place relative to the project root.

> I also like how you can manage Python versions very easily with it.

I still don't understand why people value this so highly, but so it goes.

> the inevitable bi-yearly "debug a Python environment day"

If you're getting this because you have venvs based off the system Python and you upgrade the system Python, then no, uv can't do anything about that. Venvs aren't really designed to be relocated or to have their underlying Python modified. But uv will make it much faster to re-create the environment, and most likely that will be the practical solution for you.

biimugan

Yup. I never even use activate, even though that's what you find in docs all over the place. Something about modifying my environment rubs me the wrong way. I just call ``./venv/bin/python driver.py`` (or ``./venv/bin/driver`` if you install it as a script) which is fairly self-evident, doesn't mess with your environment, and you can call into as many virtualenvs as you need to independently from one another.

``uv`` accomplishes the same thing, but it is another dependency you need to install. In some envs it's nice that you can do everything with the built-in Python tooling.

lelandbatey

I agree. Once I learned (early in my programming journey) what the PATH is as a concept, I stopped having environment problems.

However, I also think many people, even many programmers, basically consider such external state "too confusing" and also don't know how they'd debug such a thing. Which I think is a shame, since once you see that it's pretty simple it becomes a tool you can use everywhere. But given that people DON'T want to debug such things, I can understand them liking a tool like uv.

I do think automatic compiler/interpreter version management is a pretty killer feature though; that's typically really annoying otherwise, afaict, mostly because getting non-system-wide installs usually requires compiling it yourself.

bobsomers

Personally, I prefer prepending `uv` to my commands because they're more stateless that way. I don't need to remember which terminal my environment is sourced in, and when copying and pasting commands to people I don't need to worry about what state their terminal is in. It just works.

sirfz

I use mise with uv to automatically activate a project's venv, but prefixing is still useful sometimes since it triggers a sync in case you forgot to do it.

globular-toast

One of the key tenets of uv is virtualenvs should be disposable. So barring any bugs with uv there should never be any debugging environments. Worst case just delete .venv and continue as normal.
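
i.e. the worst-case recovery, assuming a uv-managed project, is just:

  rm -rf .venv
  uv sync   # recreates the env from pyproject.toml and uv.lock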

j45

This isn't a comment just about Python.. but it should just work. There shouldn't be constant ceremony for getting and keeping environments running.

oblio

There are basically 0 other programming languages that use the "directory/shell integration activated virtual environment", outside of Python.

How does the rest of the world manage to survive without venvs? Config files in the directory. Shocking, really :-)))

zahlman

> Config files in the directory.

The problem is, that would require support from the Python runtime itself (so that `sys.path` can be properly configured at startup) and it would have to be done in a way that doesn't degrade the experience for people who aren't using a proper "project" setup.

One of the big selling points of Python is that you can just create a .py file anywhere, willy-nilly, and execute the code with a Python interpreter, just as you would with e.g. a Bash script. And that you can incrementally build up from there, as you start out learning programming, to get a sense of importing files, and then creating meaningful "projects", and then thinking about packaging and distribution.

whywhywhywhy

The only word in the `source .venv/bin/activate` command that isn't a complete red flag that this was the wrong approach is probably bin. Everything else is so obviously wrong.

source - why are we using an OS level command to activate a programming language's environment

.venv - why is this hidden anyway, doesn't that just make it more confusing for people coming to the language

activate - why is this the most generic name possible as if no other element in a system might need to be called the activate command over something as far down the chain as a python environment

Feels dirty every time I've had to type it out, and I find it particularly annoying given that Python is pushed so much as a good first language, and I see people paid at a senior level who don't understand this command.

roflyear

what happens when you have two projects using different versions of node, etc? isn't that a massive headache?

not that it's great to start with, but it does happen, no?

verdverm

I'd put type annotations and GIL removal above UV without a second thought. UV is still young and I hit some of those growing pains. While it is very nice, I'm not going to put it up there with sliced bread, it's just another package manager among many

zahlman

For that matter, IMX much of what people praise uv for is simply stuff that pip (and venv) can now do that it couldn't back when they gave up on pip. Which in turn has become possible because of several ecosystem standards (defined across many PEPs) and increasing awareness and adoption of those standards.

The "install things that have complex non-Python dependencies using pip" story is much better than several years ago, because of things like pip gaining a new resolver in 2020, but in large part simply because it's now much more likely that the package you want offers a pre-built wheel (and that its dependencies also do). A decade ago, it was common enough that you'd be stuck with source packages even for pure-Python projects, which forced pip to build a wheel locally first (https://pradyunsg.me/blog/2022/12/31/wheels-are-faster-pure-...).

Another important change is that for wheels on PyPI the installer can now obtain separate .metadata files, so it can learn what the transitive dependencies are for a given version of a given project from a small plain-text file rather than having to speculatively download the entire wheel and unpack the METADATA file from it. (This is also possible for source distributions that include PKG-INFO, but they aren't forced to do so, and a source distribution's metadata is allowed to have "dynamic" dependencies that aren't known until the wheel is built (worst case) or a special metadata-only build hook is run (requires additional effort for the build system to support and the developer to implement)).

verdverm

For sure, we see the same thing in the JS ecosystem. New tooling adds some feature, other options implement the feature, and things converge to a larger common set.

I'm still mostly on poetry

WD-42

As far as impact on the ecosystem I'd say uv is up there. For the language itself you are right. Curious if you've come across any real use cases for GIL-less Python. I haven't yet. Seems like everything that would benefit from it is already written in highly optimized native modules.

seabrookmx

> Seems like everything that would benefit from it is already written in highly optimized native modules

Or by asyncio.

WD-42

I'm pretty ignorant about this stuff, but I think asyncio is for exactly that, asynchronous I/O, whereas GIL-less Python would be beneficial for CPU-bound programs. My day job is boring so I'm never CPU bound, always IO bound on the database or network. If there is CPU-heavy code, it's in NumPy. So I'm not sure GIL-less actually helps there.

nomel

asyncio is unrelated to the parallelism prevented by the GIL.

rustystump

I second and third this. I HATE python but uv was what made it usable to me. No other language had such a confusing obnoxious setup to do anything with outside of js land. uv made it sane for me.

giancarlostoro

Node definitely needs its own "uv" basically.

jampekka

Type annotations were introduced in 2008, and even type hints arrived over a decade ago, in Sept 2015.

zacmps

But there has been continual improvement over that time, both in the ecosystem, and in the language (like a syntax for generics).

KaiserPro

Typed annotations that are useful, that is.

Currently they are a bit pointless. Sure, they aid in documentation, but they are effort and cause you pain when making modifications (mind you, with half-arsed agentic coding it's probably less of a problem).

What would be better is a strict mode where, instead of duck typing, it's pre-declared. It would also make a bunch of things faster (along with breaking everything and the spirit of the language).

I still don't get the appeal of uv, but that's possibly because I'm old and have been using pyenv and venv for many, many years. This means that anything new is an attack on my very being.

However, if it means that conda fucks off and dies, then I'm willing to move to uv.

KK7NIL

You can get pretty darn close to static typing by using ty (from the same team as uv).

I've been using it professionally and it's been a big improvement for code quality.

brcmthrowaway

What happened with GIL removal?

verdverm

You can disable it. Here's the PEP; search will turn up more digestible summaries:

https://peps.python.org/pep-0703/
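
As of CPython 3.13 there is an experimental free-threaded build; on 3.13+ you can check at runtime which mode you are in (sys._is_gil_enabled() is a 3.13+ addition):

  python3.13 -c "import sys; print(sys._is_gil_enabled())"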

mgh95

As someone who generally prefers not to use Python in a production context (I think it's excellent for one-off scripts or cron jobs that require more features than bash provides), I agree with this sentiment. I recently wrote some Python (using uv) and found it to be pleasant and well-integrated with a variety of LSPs.


atonse

These Rust-based tools really change the idea of what's possible (when you can get feedback in milliseconds). But I'm trying to figure out what Astral as a company does for revenue. I don't see any paid products on their website. They even have investors.

So far it seems like they have a bunch of these high performance tools. Is this part of an upcoming product suite for python or something? Just curious. I'm not a full-time python developer.

bruckie

From "So how does Astral plan to make money? " (https://news.ycombinator.com/item?id=44358216):

"What I want to do is build software that vertically integrates with our open source tools, and sell that software to companies that are already using Ruff, uv, etc. Alternatives to things that companies already pay for today. An example of what this might look like [...] would be something like an enterprise-focused private package registry."

There's also this interview with Charlie Marsh (Astral founder): https://timclicks.dev/podcast/supercharging-python-tooling-a... (specifically the "Building a commercial company with venture capital" section)

throwway120385

That doesn't really seem like a way to avoid getting "Broadcommed." Vertically integrated tooling is kind of a commodity.

tabletcorry

Take a look at their upcoming product Pyx to see where revenue can start to come in for paid/hosted services.

https://astral.sh/pyx


pshirshov

And still there are some annoying issues:

  dependencies = [
      "torch==2.8.0+rocm6.4",
      "torchvision==0.23.0+rocm6.4",
      "pytorch-triton-rocm==3.4.0",
  ...
  ]
There is literally no easy way to also have a configuration for CUDA: you have to keep a second config and, worse, manually copy/symlink it into place under the hardcoded pyproject.toml filename.

jillesvangurp

Python is not my first language but I've always liked it. But project and dependency management was always a bit meh and an afterthought.

Over the years, I've tried venv, conda, pipenv, poetry, plain pip with requirements.txt. I've played with uv on some recent projects and it's a definite step up. I like it.

Uv actually fixes most of the issues with what came before, and builds on existing things, which is not a small compliment because the state of the art before uv was pretty bad. Venv, pip, etc. are fine; they are just not enough by themselves, and uv embraces both. Without that, all we had was a lot of puzzle pieces that barely worked together and didn't really fit together that well.

I tried making conda + pipenv work at some point. Pipenv shell makes your shell stateful, which just adds a lot of complexity, and none of the IDEs I tried figured that out properly. I had high hopes for poetry but it ended up a bit underwhelming and still left a lot of stuff to solve. Uv succeeds in providing a bit more of an end-to-end solution: everything from project-specific Python installations, to venvs by default without hassle, to dependency management.

My basic needs are simple. I don't want to pollute my system python with random crap I need for some project. So, like uv, I need to have whatever solution deal with installing the right python version. Besides, the system python is usually out of date and behind the current stable version of python which is what I would use for new projects.

kyt

I must be the odd man out but I am not a fan of uv.

1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

2. You end up needing to use `uv pip` so it's not even a full replacement for pip.

3. It does not play well with Docker.

4. It adds more complexity. You end up needing to understand all of these new environmental variables: `UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.

xmprt

Your implication is that pyenv, virtualenv, and pip should be 3 different tools. But for the average developer, these tools are all related to managing the python environment and versions which in my head sounds like one thing. Other languages don't have 3 different tools for this.

pip and virtualenv also add a ton of complexity and when they break (which happens quite often) debugging it is even harder despite them being "battle tested" tools.

j2kun

I think OP's complaint is rather that using `uv` is leaky: now you need to learn all the underlying stuff AND uv as well.

The alternative, of course, is having Python natively support a combined tool. Which you can support while also not liking `uv` for the above reason.

nicce

Python versions and environments can be solved at a more reliable abstraction level as well, e.g. if you are a heavy Nix user.

throwaway894345

On the other hand, Nix and Bazel and friends are a lot of pain. I'm sure the tradeoff makes sense in a lot of situations, but not needing to bring in Nix or Bazel just to manage dependencies is a pretty big boon. It would be great to see some of the all-in-one build tools become more usable though. Maybe one day it will seem insane that every language ecosystem has its own build tool because there's some all-in-one tool that is just as easy to use as `(car)go build`!

throwaway894345

Yeah, I agree. In particular it seems insane to me that virtualenv should have to exist. I can't see any valid use case for a machine-global pool of dependencies. Why would anyone think it should be a separate tool rather than just the obvious thing that a dependency manager does? I say this as someone with nearly 20 years of Python experience.

It's the same sort of deal with pyenv--the Python version is itself a dependency of most libraries, so it's a little silly to have a dependency manager that only manages some dependencies.

zahlman

I, too, have ~20 years of Python experience.

`virtualenv` is a heavy-duty third-party library that adds functionality to the standard library venv. Or rather, venv was created as a subset of virtualenv in Python 3.3, and the projects have diverged since.

The standard library `venv` provides "obvious thing that a dependency manager does" functionality, so that every dependency manager has the opportunity to use it, and so that developers can also choose to work at a lower level. And the virtual-environment standard needs to exist so that Python can know about the pool of dependencies thus stored. Otherwise you would be forced to... depend on the dependency manager to start Python and tell it where its dependency pool is.

Fundamentally, the only things a venv needs are the `pyvenv.cfg` config file, the appropriate folder hierarchy, and some symlinks to Python (stub executables on Windows). All it's doing is providing a place for that "pool of dependencies" to exist, and providing configuration info so that Python can understand the dependency path at startup. The venvs created by the standard library module — and by uv — also provide "activation" scripts to manipulate some environment variables for ease of use; but these are completely unnecessary to making the system work.
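
For example, a minimal pyvenv.cfg as created by `python -m venv` on Linux (values illustrative):

  home = /usr/bin
  include-system-site-packages = false
  version = 3.12.3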

Fundamentally, tools like uv create the same kind of virtual environment that the standard library does — because there is only one kind. Uv doesn't bootstrap pip into its environments (since that's slow and would be pointless), but you can equally well disable that with the standard library: `python -m venv --without-pip`.

> the Python version is itself a dependency of most libraries

This is a strange way of thinking about it IMO. If you're trying to obtain Python libraries, it's normally because you already have Python, and want to obtain libraries that are compatible with the Python you already have, so that you can write Python code that uses the libraries and works under that Python.

If you're trying to solve the problem of deploying an application to people who don't have Python (or to people who don't understand what Python is), you need another layer of wrapping anyway. You aren't going to get end users to install uv first.

knowitnone3

"other languages don't have 3 different tools for this." But other languages DO have 3 different tools so we should do that too!

a_bored_husky

> 1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

I think there are more cases where pip, pyenv, and virtualenv are used together than not. It makes sense to bundle the features of the three into one. uv does not replace ruff.

> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.

uv pip is there for compatibility and to facilitate migration, but once you are fully on the uv workflow you rarely need `uv pip`, if ever.

> 3. It does not play well with Docker.

In what sense?

> 4. It adds more complexity. You end up needing to understand all of these new environmental variables: `UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.

You don't need to touch them at all

dragonwriter

> It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

uv doesn’t try to replace ruff.

> You end up needing to use `uv pip` so it's not even a full replacement for pip.

"uv pip" doesn't use pip, it provides a low-level pip-compatible interface for uv, so it is, in fact, still uv replacing pip, with the speed and other advantages of uv when using that interface.

Also, while I’ve used uv pip and uv venv as part of familiarizing myself with the tool, I’ve never run into a situation where I need either of those low-level interfaces rather than the normal high-level interface.

> It does not play well with Docker.

How so?

pityJuke

There is an optional & experimental code formatting tool within uv (that just downloads ruff), which is what OP may be referring to: https://pydevtools.com/blog/uv-format-code-formatting-comes-...

collinmanderson

> 1. It tries to do too many things. Please just do one thing and do it well. It's simultaneously trying to replace pip, pyenv, virtualenv, and ruff in one command.

In my experience it generally does all of those well. Are you running into issues with the uv replacements?

> 2. You end up needing to use `uv pip` so it's not even a full replacement for pip.

What do end up needing to use `uv pip` for?

leblancfg

uv's pip interface is like dipping one toe in the bathtub. Take a minute and try on the full managed interface instead: https://docs.astral.sh/uv/concepts/projects/dependencies. Your commands then become:

- uv add <package_name>

- uv sync

- uv run <command>

Feels very ergonomic, I don't need to think much, and it's so much faster.

tclancy

So I have been doing Python for far too long and have all sorts of tooling I've accreted to make Python work well for me across projects and computers, and I never quite made the leap to Poetry and was suspicious of uv.

Happened to buy a new machine and decided to jump in the deep end, and it's been glorious. I think the difference between your comment (and others in this chain) and my experience is that you're trying to make uv fit how you have done things. Jumping all the way in, I just . . . never needed virtualenvs. Don't really think about them once I sorted out a mistake I was making. uv init and you're pretty much there.

>You end up needing to use `uv pip` so it's not even a full replacement for pip

The only time I've used uv pip is on a project at work that isn't a uv-powered project. uv add should be doing what you need and it really fights you if you're trying to add something to global because it assumes that's an accident, which it probably is (but you can drop back to uv pip for that).

>`UV_TOOL_BIN_DIR`, `UV_SYSTEM_PYTHON`, `UV_LINK_MODE`, etc.

I've been using it for six months and didn't know those existed. I would suggest this is a symptom of trying to make it be what you're used to. I would also gently suggest those of us who have decades of Python experience may have a bit of Stockholm Syndrome around package management, packaging, etc.

brikym

> It tries to do too many things. Please just do one thing and do it well.

I disagree with this principle. Sometimes what I need is a kitset. I don't want to go shopping for things, or browse multiple docs. I just want it taken care of for me. I don't use uv so I don't know if the pieces fit together well but the kitset can work well and so can a la carte.

Narushia

uv has played well with Docker in my experience, from dev containers to CI/CD to production image builds. Would be interested to hear what is not working for you.

The uv docs even have a whole page dedicated to Docker; you should definitely check that out if you haven't already: https://docs.astral.sh/uv/guides/integration/docker/

NewJazz

Idk, for me ruff was more of a game changer. No more explaining why we need both flake8 and pylint (and isort), no more flake8 plugins... Just one command that does it all.

UV is great but I use it as a more convenient pip+venv. Maybe I'm not using it to its full potential.

collinmanderson

I agree flake8 -> ruff was more of a game changer for me than pip+venv -> uv. I use flake8/ruff far more often than pip/venv.

uv is probably much more of a game changer for beginner python users who just need to install stuff and don't need to lint. So it's a bigger deal for the broader python ecosystem.

zahlman

> Maybe I'm not using it to it's full potential.

You aren't, but that's fine. Everyone has their own idea about how tooling should work and come together, and I happen to be in your camp (from what I can tell). I actively don't want an all-in-one tool to do "project management".

hirako2000

The dependency descriptor is further structured; a requirements.txt is pretty raw in comparison.

But where it isn't a matter of opinion is speed. I've never met anyone who, given the same interface, would prefer a process taking 10x longer to execute.