
Switching from Pyenv to Uv

238 comments

March 9, 2025

quickslowdown

I highly, highly recommend uv. It resolves & installs dependencies incredibly fast, and the CLI is very intuitive once you've memorized a couple of commands. It handles monorepos well with the "workspaces" concept, it can replace pipx with "uv tool install", it handles building & publishing, and the Docker story is great: you just copy the uv binary out of the official image with a single COPY --from line.
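Something like this at the top of your Dockerfile (a minimal sketch following the pattern in uv's Docker guide; pin the tag in real use):

    FROM python:3.12-slim
    # pull the uv (and uvx) binaries from the official image
    COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/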

I've used 'em all: pip + virtualenv, conda (and all its variants), Poetry, PDM (my personal favorite before switching to uv). uv handles everything I need in a way that means I don't have to reach for other tools, or really even think about what uv is doing. It just works, and it works great.

I even use it for small scripts. You can run "uv init --script <script_name.py>" and then "uv add package1 package2 package3 --script <script_name.py>". This adds an oddly formatted comment to the top of the script that tells uv which packages to install when you run it. The first time you run "uv run <script_name.py>", uv installs everything you need and executes the script. Subsequent runs use the cached dependencies, so the script starts immediately.
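Concretely, the workflow looks something like this (hypothetical script and package names):

    uv init --script fetch.py                # hypothetical script name
    uv add requests rich --script fetch.py   # records the deps inline in the file
    uv run fetch.py                          # first run builds a cached env, later runs reuse it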

If you're going to ask me to pitch you on why it's better than your current preference, I'm not going to do that. Uv is very easy to install & test, I really recommend giving it a try on your next script or pet project!

actinium226

The script thing is great. By the way, those 'oddly formatted' comments at the top are not a uv thing; they're a new official Python metadata format (PEP 723 inline script metadata), specifically designed so that third-party tools like uv can figure out and install the relevant packages.
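The header looks like this:

    # /// script
    # requires-python = ">=3.11"
    # dependencies = ["requests"]
    # ///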

And in case it wasn't clear to readers of your comment, uv run script.py creates an ephemeral venv and runs your script in that, so you don't pollute your system env or whatever env you happen to be in.

fluidcruft

I generally agree, but one thing I find very frustrating (i.e. have not figured out yet) is how to deal with extras well, particularly with pytorch. Some of my machines have GPUs, some don't, and things like "uv add" end up uninstalling everything and installing the opposite variant, forcing a resync with the appropriate --extra flag. The examples in the docs do things like CPU on Windows and GPU on Linux, but all my boxes are Linux. There has to be a way to tell it "hey, I always want --extra gpu on this box", but I haven't figured it out yet.

shawnz

Getting the right version of PyTorch installed to have the correct kind of acceleration on each different platform you support has been a long-standing headache across many Python dependency management tools, not just uv. For example, here's the bug in poetry regarding this issue: https://github.com/python-poetry/poetry/issues/6409

As I understand it, recent versions of PyTorch have made this process somewhat easier, so maybe it's worth another try.

fluidcruft

uv actually handles the issues described there very well (the uv docs have a page showing a few ways to do it). The issue for me is that uv has massive amnesia about which variant was selected, and you end up thrashing packages because of that. uv is very fast at thrashing, though, so it's not as bad as if poetry were thrashing.

tmaly

I end up going to the torch website; they have a nice little UI where I can click what I have, and it gives me the pip line to use.

amelius

On nvidia jetson systems, I always end up compiling torchvision, while torch always comes as a wheel. It seems so random.

DrBenCarson

It sounds like you're just looking for dependency groups? uv supports adding custom groups (and comes with syntactic sugar for a development group).
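E.g. (a quick sketch; the "lint" group name is made up):

    uv add --dev pytest        # sugar for the built-in "dev" group
    uv add --group lint ruff   # a custom named group
    uv sync --group lint       # include that group when syncing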

fluidcruft

It is... but basically it needs to remember which groups are synced. For example, if you use an extra, you have to keep track of it constantly, because sync thrashes between states unless you pay close and tedious attention. At least I haven't figured out how to make it remember which extras are "active".

    uv sync --extra gpu
    uv add matplotlib # the sync this runs undoes the --extra gpu
    uv sync # oops also undoes all the --extra
What you have to do to avoid this is remember to use --no-sync all the time and then meticulously sync by hand, remembering all the extras I actually currently want:

    uv sync --extra gpu --extra foo --extra bar
    uv add --no-sync matplotlib
    uv sync --extra gpu --extra foo --extra bar
It's just so... tedious and kludgy. It needs an "extras.lock" or "sync.lock" or something. I would love for someone to tell me I'm wrong and missing something obvious in the docs.

0xcoffee

fluidcruft

I haven't tried it yet but that looks like exactly what I've been missing.

quickslowdown

You can control dependencies per platform:

https://docs.astral.sh/uv/concepts/projects/dependencies/#pl...

Not sure if it's as granular as you might need

satvikpendem

This happened to me too; that's why I stopped using it for ML-related projects and stuck to good old venv. For other Python projects I can see it being very useful, however.

ibic

I'm not sure if I got your issue, but I can do platform-dependent pytorch installation from a dedicated index using the following snippet in `pyproject.toml`, and `uv sync` just handles it accordingly.

    [tool.uv.sources]
    torch = [{ index = "pytorch-cu124", marker = "sys_platform == 'win32'" }]
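(For completeness, the index itself also has to be declared; something along these lines, per uv's pytorch guide:)

    [[tool.uv.index]]
    name = "pytorch-cu124"
    url = "https://download.pytorch.org/whl/cu124"
    explicit = true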

satvikpendem

Some Windows machines have compatible GPUs while others don't, so this doesn't necessarily help. What is really required is querying the OS for what type of compute unit it has and then installing the right version of an ML library, but I'm not sure that will be done.

synergy20

I use uv+torch+CUDA on Linux just fine and have never used the extra flag; I wonder what the problem is here?

dagw

Getting something that works out of the box on just your computer is normally fine. Getting something that works out of the box on many different computers with many different OS and hardware configurations is much much harder.

baby_souffle

I didn't know that UV would now edit the script for you. That is just icing on the cake!

For the curious, the format is codified here: https://peps.python.org/pep-0723/

midhun1234

Can confirm this is all true. I used to be the "why should I switch" guy. The productivity improvement from not context switching while pip installs a requirements file is completely worth it.

scribu

The install speed alone makes it worthwhile for me. It went from minutes to seconds.

BoorishBears

I was working on a Raspberry Pi at a hackathon, and pip install was eating several minutes at a time.

Tried uv for the first time and it was down to seconds.

guappa

Why would you be redoing your venv more than once?

mcintyre1994

That scripting trick is awesome! One of the really nice things about Elixir and its dependency manager is that you can just write Mix.install(…) in your script and it’ll fetch those dependencies for you, with the same caching you mentioned too.

Does uv work with Jupyter notebooks too? When I used it a while ago dependencies were really annoying compared to Livebook with that Mix.install support.

uasi

uv offers another useful feature for inline dependencies: the exclude-newer field[1]. It improves reproducibility by excluding packages released after a specified date from dependency resolution.
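In a script header it looks something like this (the date is arbitrary):

    # /// script
    # dependencies = ["requests"]
    # [tool.uv]
    # exclude-newer = "2025-03-01T00:00:00Z"
    # ///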

I once investigated whether this feature could be integrated into Mix as well, but it wasn't possible since hex.pm doesn't provide release timestamps for packages.

> Does uv work with Jupyter notebooks too?

Yes![2]

[1] https://docs.astral.sh/uv/guides/scripts/#improving-reproduc... [2] https://docs.astral.sh/uv/guides/integration/jupyter/

para_parolu

As a person who doesn't often work on Python code but occasionally needs to run a server or a tool, I find uv a blessing. Before, I would beg people for help just to avoid figuring out what combination of obscure Python tools I needed. Now "uv run server.py" usually just works.

insane_dreamer

uv is great and we’re switching over from conda for some projects. The resolver is lightning fast and the toml support is good.

Having said that, there are 2 areas where we still need conda:

- uv doesn't handle non-Python dependencies, so if you need to use something like MKL, no luck

- uv assumes that you want one env per project. With complex projects, however, you may need to use different envs with different branches of your code base. Conda makes this easy: just activate the conda env you want (all of your envs can be stored in some central location outside your projects) and run your code. uv wants to use the project toml file and stores the packages in .venv by default (which you don't want to commit, but then need different versions of). Yes, you can store your project venv elsewhere with an env var, but that's not a practical solution. There needs to be support for multiple toml files, where the location of the env can be specified inside the toml file (not in an env var).
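(For reference, the env-var workaround looks something like this, with hypothetical per-branch env locations:)

    UV_PROJECT_ENVIRONMENT=~/envs/myproj-main uv sync
    UV_PROJECT_ENVIRONMENT=~/envs/myproj-feature uv sync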

serjester

You may want to check out uv's workspaces - they're very handy for large monorepos.

insane_dreamer

Thanks. I looked at that but I believe it solves a different problem.

bogdart

You can create another venv in the same folder with a different name; `uv venv my-name` does the trick.

IshKebab

Uv really fixes Python. It takes it from "oh god I have to fight Python again" to "wow it was actually fast and easy".

I think all the other projects (pyenv, poetry, pip, etc.) should voluntarily retire for the good of Python. If everyone moved to Uv right now, Python would be in a far better place. I'm serious. (It's not going to happen though because the Python community has no taste.)

The only very minor issue I've had is that once or twice the package cache invalidation didn't work correctly and `uv pip install` installed an outdated package until I ran `uv cache clean`. Not a big deal, though, considering it solves so many Python clusterfucks.

javchz

Agree. I mostly do front end in my day job, and despite JavaScript being a bit of a mess of a language, dealing with npm is way better than juggling anaconda, miniforge, Poetry, pip, venv, etc. depending on the project.

UV is such a smooth UX that it makes you wonder why something like it wasn't part of Python from the start.

baq

+1

…but we did have to wait for cargo, npm (I include yarn and pnpm here) and maybe golang to blaze the ‘this is how it’s done’ trail. Obvious in hindsight.

dontlaugh

Ruby's bundler had already invented the correct model many years ago. It only took time for others to accept that.

Aeolun

More importantly, migrating between npm, pnpm, yarn, and bun is very nearly seamless. Migrating within the Python ecosystem? Not anywhere close.

woodrowbarlow

standardizing pyproject.toml helped but it didn't go quite far enough.

EdwardDiego

Feels like you're doing it wrong if you're dealing with all of those.

IshKebab

"depending on the project"

zelphirkalt

You mean off the job you have to juggle all those tools? On the job it would be kind of crazy to allow every project its own toolchain.

dilawar

True.

I had to give up on mypy and move to pyright, because mypy uses pip to install missing types and they refuse to support uv. In the CI pipeline where I use uv, I don't have pip installed, so mypy complains about missing pip.

Of course I could do it myself by adding the typing packages to a requirements.txt file, but then what's the point of dev tools? And I don't want a requirements.txt when I've already got pyproject.toml.
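(The uv-native workaround is to declare the stub packages as dev dependencies; the package names here are just examples:)

    uv add --dev types-requests types-PyYAML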

Once you get used to cargo from rust, you just can't tolerate shitty tooling anymore. I used to think pip was great (compared to C++ tooling).

WhyNotHugo

Mypy doesn't install anything by default; you're probably setting the `--install-types` flag somehow.

IshKebab

Pyright is waaay better than Mypy anyway so I'd say they did you a favour.

tacitusarc

I think the only big gap is the inability to alias general non-Python project scripts in uv. This forces you to use something like a justfile, and it would be much more ergonomic to keep it all in uv.

guappa

uv belongs to a startup. They will surely introduce some wacky monetisation scheme sooner or later.

I wouldn't get too used to it.

IshKebab

Maybe, but even if that is the case it's sooooo much better that even the worst case (fork when they try to monetise it) is way better than any alternatives.

Hackbraten

> fork when they try to monetise it

To maintain a successful fork, not only are you going to need people who volunteer to maintain a fork at that scale (with a large user base, given its popularity), you'll also need to find skilled Rust developers.

That’s going to be immensely difficult.

loeber

Strong agree. The respectful act for other package managers would be to consider themselves deprecated and point to uv instead.

baq

The risk is obviously uv losing funding. I kinda hope the PSF has thought about this and has a contingency plan for uv winning and dying/becoming enshittified soon after.

guappa

If they never made any plan for how modules are installed and there's no official way… I doubt they've made a plan about uv.

loeber

It's open source. If necessary, uv can be forked and maintained entirely as OSS.

tootie

Every time people debate the merits of languages, I always put developer environment at the top of my list. Build tools, IDE, readable stack traces: those things boost productivity far more than concise list comprehensions or any gimmicky syntax thing. It's why Python always felt stone-age to me despite having such lovely semantics.

albert_e

I am sold. Sign me up.

I have never used virtual environments well -- the learning curve, after dealing with Python installation, conda/pip setup, and environment variables, was exhausting enough. I gave up multiple times, or only used them when working through step-by-step workshops.

If anyone can recommend a good learning resource, I'd love to take another stab at it.

kyawzazaw

Which companies run heavily on Python (either solely or for huge parts)? They should take the initiative and start blogging.

kubav027

I am pretty happy with poetry for the near future. I prefer using Python interpreters installed by the Linux package manager; in the cloud I use the Python docker image. Poetry recently added an option to install Python too, if I change my mind.

I have already setup CI/CD pipelines for programs and python libraries. Using uv would probably save some time on dependency updates but it would require changing my workflow and CI/CD. I do not think it is worth the time right now.

But if you're using older environments without a proper lock file, I would recommend switching immediately. Poetry v2 supports a pyproject.toml close to the format used by uv, so I can switch anytime it looks more appealing.

Another thing to consider in the long term is how Astral's tooling will change when they need to make money.

js2

> I prefer using python interpreters installed by linux package manager.

uv will defer to any python it finds in PATH as long as it satisfies your version requirements (if any):

https://docs.astral.sh/uv/concepts/python-versions/

It also respects any virtual environment you've already created, so you can also do something like this:

    /usr/bin/python3 -m venv .venv
    .venv/bin/pip install uv
    .venv/bin/uv pip install -r requirements.txt # or
    .venv/bin/uv run script ...
It's a very flexible, well-thought-out tool, and somehow it manages to do what I think it ought to do. I rarely need to go to its documentation.

> Using uv would probably save some time on dependency updates but it would require changing my workflow and CI/CD.

I found it very straightforward to switch to uv. It accommodates most existing workflows.

irjustin

I'm pretty much with you and still trying to figure out why I want to switch away from pyenv+poetry.

I get that uv does both, but I'm very happy with pyenv+poetry combo.

Old baggage, but I came from the rvm world, which attempted to do exactly what uv does, and rvm was an absolute mess in 2013. rbenv+bundler solved so many problems for me, and the rvm experience was so bad that when I saw uv my gut reaction was "never again".

But this thread has so many praises for it that maybe one day I'll give it a try.

armanckeser

uv's dependency solving is light-years faster than poetry's. If you are working on actual projects with many dependencies, poetry is a liability.

kubav027

You're right that it's faster, but how often are you running dependency updates? Ensuring that the new dependencies didn't break anything will take more time than the upgrade itself.

WhyNotHugo

Yeah, using the package manager is the logical choice and usually the most likely one to work.

IIRC, uv downloads dynamically linked builds of Python, which may or may not work depending on your distribution and whether linked libraries are locally available or not. Not sure if things have changed in recent times.

kylecordes

UV is such a big improvement that it moves Python from my "would use again if I had to, but would really not look forward to it" pile to my "happy to use this as needed" pile. Without disparaging the hard work by many that came before, UV shows just how much previous tools left unsolved.

crabbone

It doesn't do anything differently besides the speed... Why do people keep praising it so much? It doesn't solve any of the real problems... come on. The problem was never the tools; the problem is the bad design of the imports and packaging systems, which cannot be addressed by an external tool: the language needs to change.

ptx

What are the design problems with the imports and packaging systems? How do they need to change?

TheIronYuppie

For scripting... HIGHLY recommend putting your dependencies inline.

E.g.:

  #!/usr/bin/env python3
  # /// script
  # requires-python = ">=3.11"
  # dependencies = [
  #     "psycopg2-binary",
  #     "pyyaml",
  # ]
  # ///
Then -

  uv run -s file.py

maleldil

How does this interact with your code editor or IDE? When you edit the file, where does the editor look for information about the imported third-party libraries?

AlphaSite

Usually the venv and the import lines are enough.

maleldil

How do you determine where the venv is? AFAIK, uv run in script mode creates the venv in some random temporary directory.

marcthe12

Do you need a wrapper script for scripts on PATH, or for execve? I would usually chmod +x the script, but I'm not sure here.

Manfred

If you want to make it work regardless of where uv is installed, you can use the following shebang line:

  #!/usr/bin/env uv run --script

JimDabell

Discussed here:

> Using uv as your shebang line

https://news.ycombinator.com/item?id=42855258

Since `env` doesn’t pass multiple arguments by default, the suggested line uses `-S`:

   #!/usr/bin/env -S uv run --script

tetha

Not at a laptop to try this right now, but shouldn't this be possible with the shebang? Something along the lines of:

    #!/home/tetha/Tools/uv run

dfinninger

Yes it is, I just converted my work scripts over this afternoon.

    #!/usr/bin/env uv run

runjake

For my use cases, uv is so frictionless it has effectively made Python tolerable for me. I primarily discovered it via Simon Willison's (@simonw) blog posts[1]. I recommend his blog highly.

1. https://simonwillison.net/tags/uv/

vslira

I'm using exclusively uv for personal projects - and small prototypes at work - and I can't recommend it enough.

Uv makes python go from "batteries included" to "attached to a nuclear reactor"

scratchyone

I've started slipping uv into production work projects along with an auto-generated requirements.txt for anyone who doesn't wanna use uv. Hoping I can drive adoption on my team while still leaving an alternative for people who don't wanna use it.

globular-toast

You mean `uv pip compile pyproject.toml > requirements.txt`?

selimnairb

I have been using Python for 20 years, and have been an intermediate-to-advanced user of it for the last 5-7 years. I use it mostly for scientific computing (so lots of Numpy, SciPy, etc.), IoT data processing, and also for some microservices that don't need to be super fast. I publish and maintain a few packages on PyPI and conda (though I almost never use conda myself), including a C++ library with Python bindings generated by SWIG (SWIG wouldn't be my first choice, but I inherited it).

In what I've done, I've never found things like pipenv, let alone uv, to be necessary. Am I missing something? What would uv get me?

crabbone

If you need to package for Anaconda, uv has nothing to offer you. It's a replacement for a number of PyPA tools, so it's not compatible with Anaconda tools.

The selling point of uv is that it does things faster than the tools it aims to replace, but on a conceptual level it doesn't add anything substantially new. The tools it aims to replace were born of the defects in Python's import and packaging systems (something that Anaconda also tried to address, but failed at). They are not good tools designed to do things the right way; they are band-aids designed to mitigate some of the more common problems stemming from the bad design choices in the imports and packaging systems.

My personal problem with tools like uv is that, just as Web browsers in the early days of the Web tried to win users by tolerating the mistakes of site authors, they allow the essential problems in Python's infrastructure to go unsolved by offering some pain relief to those using the band-aid tools.

mrbonner

And you can now install Python and set it as the default in your PATH with the --default flag. Big plus for me; I can finally replace pyenv.
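(Something like this; last I checked the flag still required opting into preview behavior:)

    uv python install 3.13 --default --preview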

thefreeman

Finally! This was the thing keeping me from switching every time I looked into it.

BiteCode_dev

Note that despite the title, the author is not switching from pyenv to uv, but from pip, pyenv, pipx, pip-tools, and pipdeptree to uv, because uv does much more than pyenv alone.

It replaces a whole stack, and does each feature better, faster, with fewer modes of failure.

rsyring

15-year Python dev here who usually adopts tooling slowly. Just do it; uv's absolutely worth it.

I also use mise with it, which is a great combination and gives you automatic venv activation among other things.

See, among other mise docs related to Python, https://mise.jdx.dev/mise-cookbook/python.html
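The relevant bit of mise config looks something like this (from memory; check the cookbook for the canonical version):

    # mise.toml
    [tools]
    python = "3.12"
    uv = "latest"

    [env]
    _.python.venv = { path = ".venv", create = true }  # auto-create & activate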

See also a Python project template I maintain built on mise + uv: https://github.com/level12/coppy

jdxcode

Ideally mise could be replaced entirely by uv, or at least be a thin wrapper around uv (in some ways that's already the case), but given that this article requires the custom uv-python-symlink utility, it seems uv isn't quite there yet.

rsyring

Mise does way more than uv; it has a much larger scope than just Python tooling.

I think the current status quo, that of mise utilizing uv for its Python integration support, makes sense, and I don't see that changing.

Also, FWIW, mise has other methods for Python integration support, e.g. pyenv, virtualenv, etc.

Edit:

Ha... didn't realize who I was replying to. You don't need me to tell you anything about mise. I apparently misinterpreted your comment.

jdxcode

The reality, which I'm sure you've heard me say many times, is that I'm just not a Python dev, and astral is likely always going to build a better solution around Python than I ever could. They've just focused a lot more on the package-manager side of things than on the runtime/venv-management side so far, but I suspect that will change, and given astral's velocity I doubt we'll be waiting long.

and btw mise's venv support isn't going anywhere probably ever, but I do hope that at some point we could either let uv do the heavy lifting internally or point users to uv as a better solution

NeutralForest

I used to install Python through mise but now I just use uv tbh.

rsyring

Similar. But we get other benefits through mise, like tasks and other tool installs (e.g. Terraform). So we still use them together.

NeutralForest

That's fair, it's also nice if you have a backend in Python and a frontend in JS since mise also handles node.

jillesvangurp

I dabble with Python occasionally, and I'm always fighting with tools and tool combinations that don't really work well together. The last time, I settled on using conda to get some isolation of Python versions, and then pipenv for some sane package management with a lock file. Not pretty, but it kind of worked. Except I had a hard time convincing VS Code and PyCharm of the correct environment with that combination (they couldn't resolve libraries I installed). I got it working eventually, but it wasn't a great experience.

It sounds like uv should replace that combination. Of course, there's the risk of this being another case of the Python community ritually moving the problem around every few years without properly solving it. But it sounds like uv is mostly doing the right thing, which is making global package installation the exception rather than the default: most stuff you install should be project-local unless you say otherwise.

Will give this a try next time I need to do some python stuff.

fnands

Do. I was sceptical at first - exactly because of the points you make: I mostly do ML, so conda was basically my go-to for getting PyTorch and CUDA etc. to play nice.

We use poetry at work, but getting it to play nice with PyTorch is always a bit of an art. I tried to get into Pixi, but have been a little annoyed, as it seems to have inherited conda's issues when mixing conda and PyPI.

uv so far has been relatively smooth sailing, and they even have an entire section on using it with PyTorch: https://docs.astral.sh/uv/guides/integration/pytorch/