Fun with uv and PEP 723
119 comments
June 24, 2025
traverseda
I don't think nix is that hard for this particular use case. Installing nix on other distros is pretty easy, and once it's installed you just do something like this
#! /usr/bin/env nix-shell
#! nix-shell -i bash -p imagemagick cowsay
# scale image by 50%
convert "$1" -scale 50% "$1.s50.jpg" &&
cowsay "done $1.q50.jpg"
Sure, all of NixOS and packaging for nix is a challenge, but just using it for a shell script is not too bad.
wpm
I simply do not write shell scripts that use or reference binaries/libraries that are not pre-installed on the target OS (which is the correct target; writing shell scripts for portability is silly).
There is no package manager that is going to make a shell script I write for macOS work on Linux if that script uses commands that only exist on macOS.
nothrabannosir
Nix is overkill for any of the things it can do. Writing a simple portable script is no exception.
But: it’s the same skill set for every one of those things. This is why it’s an investment worth making IMO. If you’re only going to ever use it for one single thing, it’s not worth it. But once you’ve learned it you’ll be able to leverage it everywhere.
Python scripts with or without dependencies, uv or no uv (through the excellent uv2nix which I can’t plug enough, no affiliation), bash scripts with any dependencies you want, etc. suddenly it’s your choice and you can actually choose the right tool for the job.
Not trying to derail the thread but it feels germane in this context. All these little packaging problems go away with Nix, and are replaced by one single giant problem XD
ndr
Why bother writing new shell scripts?
If you're allowed to install any deps go with uv, it'll do the rest.
I'm also kinda in love with https://babashka.org/ check it out if you like Clojure.
bigstrat2003
> Packaging, dependency management, and reproducibility in shell land are still stuck in the Stone Ages.
IMO it should stay that way, because any script that needs those things is way past the point where shell is a reasonable choice. Shell scripts should be small, 20 lines or so. The language just plain sucks too much to make it worth using for anything bigger.
pxc
When you solve the dependency management issue for shell scripts, you can also use newer language features because you can ship a newer interpreter the same way you ship whatever external dependencies you have. You don't have to limit yourself to what is POSIX, etc. Depending on how you solve it, you may even be able to switch to a newer shell with a nicer language. (And doing so may solve it for you; since PowerShell, newer shells often come with a dependency management layer.)
> any script that needs those things
It's not really a matter of needing those things, necessarily. Once you have them, you're welcome to write scripts in a cleaner, more convenient way. For instance, all of my shell scripts used by colleagues at work just use GNU coreutils regardless of what platform they're on. Instead of worrying about differences in how sed behaves with certain flags, on different platforms, I simply write everything for GNU sed and it Just Works™. Do those scripts need such a thing? Not necessarily. Is it nicer to write free of constraints like that? Yes!
Same thing for just choosing commands with nicer interfaces, or more unified syntax... Use p7zip for handling all your archives so there's only one interface to think about. Make heavy use of `jq` (a great language) for dealing with structured data. Don't worry about reading input from a file and then writing back to it in the same pipeline; just throw in `sponge` from moreutils.
> The language just plain sucks too much
There really isn't anything better for invoking external programs. Everything else is way clunkier. Maybe that's okay, but when I've rewritten large-ish shell scripts in other languages, I often found myself annoyed with the new language. What used to be a 20-line shell script can easily end up being 400 lines in a "real" language.
I kind of agree with you, of course. POSIX-ish shells have too much syntax and at the same time not enough power. But what I really want is a better shell language, not to use some interpreted non-shell language in their place.
m2f2
Nice, if only you could count on having it installed on your fleet, and your fleet is 100% Linux: no AIX, no HP-UX, no Solaris, no SUSE on IBM Power....
Been there, tried it, got a huge slap in the face.
MatmaRex
Broke: Dependency management used for shell scripts
Woke: Dependency management used for installing an interpreter for a better programming language to write your script in it
Bespoke: Dependency management used for installing your script
wazzaps
Check out mise: https://mise.jdx.dev/
We use it at $work to manage dev envs and it's much easier than Docker and Nix.
It also installs things in parallel, which is a huge bonus over plain Dockerfiles.
fouronnes3
Consider porting your shell scripts to Python? The language is vastly superior and subprocess.check_call is not so bad.
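For example, the imagemagick/cowsay script near the top of the thread ports over almost mechanically (a rough sketch, not a drop-in replacement; it still assumes convert and cowsay are on PATH):
#!/usr/bin/env python3
# Rough Python port of the nix-shell example above: scale an image by 50%, then announce it.
import subprocess
import sys
src = sys.argv[1]
dst = f"{src}.s50.jpg"
# check_call raises CalledProcessError on a non-zero exit, so the failure
# handling the shell version gets from `&&` comes for free.
subprocess.check_call(["convert", src, "-scale", "50%", dst])
subprocess.check_call(["cowsay", f"done {dst}"])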
password4321
I'm unable to resist responding that clearly the solution is to run Nix in Docker as your shell since packaging, dependency management, and reproducibility will be at theoretical maximum.
puika
Like the author, I find myself going more for cross-platform Python one-offs and personal scripts for both work and home and ditching Go. I just wish Python typechecking weren't the shitshow it is. Looking forward to ty, pyrefly, etc. to improve the situation a bit
SavioMak
Speed is one thing, the type system itself is another thing, you are basically guaranteed to hit like 5-10 issues with python's weird type system before you start grasping some of the oddities
epistasis
This is really great, and it seems that it's becoming more popular. I saw it first on simonw's blog:
https://simonwillison.net/2024/Dec/19/one-shot-python-tools/
And there was a March discussion of a different blog post:
https://news.ycombinator.com/item?id=43500124
I hope this stays on the front page for a while to help publicize it.
gerdesj
I've recently updated a Python script that I originally wrote about 10 years ago. I'm not a programmer - I just have to get stuff done - think sysops.
For me there used to be a clear delineation between scripting languages and compiled languages. Python has always seemed to want to be both and I'm not too sure it can really. I can live with being mildly wrong about a concept.
When Python first came out, our processors were 80486 at best and RAM was measured in MB at roughly £30/MB in the UK.
"For the longest time, ..." - all distros have had scripts that find the relevant Python or Java or whatevs so that's simply daft. They all have shebang incantations too.
So we now have uv written in Rust for Python. Obviously you should install it via a shell script directly from curl!
I love all of the components involved here, but please, as a nod to security, at least suggest that the script be downloaded first, looked over, and then run.
I recently came across a Github hosted repo with scripts that changed Debian repos to point somewhere else and install ... software. I'm sure that's all fine too.
curl | bash is cute and easy and very, very insecure.
mroche
My solution to this is:
1) Subscribe to the GitHub repo for tag/release updates.
2) When I get a notification of a new version, I run a shell function (meup-uv and meup-ruff) which grabs the latest tag via a GET request and runs an install. I don't remember the semantics off the top of my head, but it's something like:
cargo install --jobs $(( $(nproc) / 2 )) --tag ${TAG} --git ${REPO} [uv|ruff]
Of course this implies I'm willing to wait the ~5-10 minutes for these apps to compile, along with the storage costs of the registry and source caches. Build times for ruff aren't terrible, but uv is a straight up "kick off and take a coffee break" experience on my system (it gets 6-8 threads out of my 12 total depending on my mood).
jkingsman
uv has been fantastic to use for little side projects. Combining uv run with `uv tool run` AKA `uvx` means one can fetch, install within a venv, and execute Python scripts from GitHub super easily. No git clone, no venv creation + entry + pip install.
And uv is fast — I mean REALLY fast. Fast to the point of suspecting something went wrong and silently errored, when in fact it did just what I wanted but 10x faster than pip.
It (and especially its docs) is a little rough around the edges, but it's bold enough and good enough that I'm willing to use it nonetheless.
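For anyone who hasn't tried it yet, the inline metadata itself is tiny; a minimal PEP 723 script looks roughly like this (requests is just an illustrative dependency):
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
import requests
print(requests.get("https://example.com").status_code)
Make it executable and run it directly, or use `uv run --script script.py`; uv resolves the declared dependencies into a cached environment before executing the script.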
lxgr
Truly. uv somehow resolves and installs dependencies more quickly than pyenv manages to print its own --help output.
mikepurvis
I know there are real reasons for slow Python startup time, with every new import having to examine swaths of filesystem paths to resolve itself, but it really is a noticeable breath of fresh air working with tools implemented in Go or Rust that have sub-ms startup.
lxgr
The Python startup latency thing makes sense, but I really don't understand why it would take `pyenv` a long time to print each line of its "usage" output (the one that appears when invoking it with `--help`) once it's already clearly in the code branch that does only that.
It feels like it's doing heavy work between each line printed! I don't know any other CLI tool doing that either.
Spivak
Not to derail the Python speed hate train but pyenv is written in bash.
It's a tool for installing different versions of Python, it would be weird for it to assume it already had one available.
theshrike79
The "slowness" and the utter insanity of trying to make a "works on my computer" Python program work on another computer pushed me to just rewrite all my Python stuff in Go.
About 95% of my Python utilities are now Go binaries cross-compiled to whatever env they're running in. The few remaining ones use (API) libraries that aren't available for Go or aren't mature enough for me to trust them yet.
heavyset_go
Last time I looked, pyenv contributors were considering implementing a compiled launcher for that reason.
But that ship has sailed for me and I'm a uv convert.
satvikpendem
Very nice. I believe Rust is doing something similar too, which is where I initially learned of this idea of single-file shell-type scripts in other languages (with dependency management included, which is how it differs from existing ways of writing single-file scripts in, e.g., scripting languages) [0].
Hopefully more languages follow suit on this pattern as it can be extremely useful for many cases, such as passing gists around, writing small programs which might otherwise be written in shell scripts, etc.
js2
So far I've only run into one minor ergonomic issue when using `uv run --script` with embedded metadata, which is that sometimes I want to test changes to the script via the Python REPL, but that's a bit harder to do since you have to run something like:
$ uv run --python=3.13 --with-requirements <(uv export --script script.py) -- python
>>> from script import X
I'd love it if there were something more ergonomic, like: $ uv run --with-script script.py python
Edit: this is better: $ "$(uv python find --script script.py)"
>>> from script import X
That fires up the correct Python and venv for the script. You probably have to run the script once to create it.
dkdcio
I think you're looking for something like this (the important part being embedding a REPL call toward the end, after whatever setup): https://gist.github.com/lostmygithubaccount/77d12d03894953bc...
You can make `--interactive` or whatever you want a CLI flag from the script. I often make these small Typer CLIs with something like that (or in this case, in another dev script like this, I have `--sql` for entering a DuckDB SQL repl)
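The stdlib-only version of that trick, if you don't want to pull in Typer, is roughly this sketch (the `--interactive` flag name is just an example):
# at the bottom of script.py, after whatever setup the script does
import code
import sys
if "--interactive" in sys.argv:
    # drop into a REPL with everything the script defined still in scope
    code.interact(local=globals())
Then `uv run --script script.py --interactive` should give you the script's environment plus a prompt.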
kristianp
If you want to manually manage envs and you're using conda, you can activate the env in a shell wrapper for your Python script, like so:
#!/usr/bin/env bash
eval "$(conda shell.bash hook)"
conda activate myenv
python myscript
Admittedly this isn't self-contained like the PEP 723 solution.
sambaumann
Between yesterday's thread and this thread I decided to finally give uv a shot today - I'm impressed, both by the speed and how easy it is to manage dependencies for a project.
I think their docs could use a little bit of work; in particular, there should be a defined path to switch from a requirements.txt-based workflow to uv. I also felt it's a little confusing how to define a Python version for a specific project (it's defined in both .python-version and pyproject.toml).
tdhopper
I write an ebook on Python Developer tooling. I've attempted to address some of the weaknesses in the official documentation.
How to migrate from requirements.txt: https://pydevtools.com/handbook/how-to/migrate-requirements.... How to change the Python version of a uv project: https://pydevtools.com/handbook/how-to/how-to-change-the-pyt...
Let me know if there are other topics I can hit that would be helpful!
7thpower
This is wonderful. When I was learning I found the documentation inadequate and gpt4 ran in circles as I did not know what to ask (I did not realize “how do I use uv instead of conda/pip?” was a fundamentally flawed question).
wrboyce
This would’ve been really handy for me a few weeks ago when I ended up working this out for myself (not a huge job, but more effort than reading your documentation would’ve been). While I can’t think of anything missing off the top of my head, I do think a PR to uv to update the official docs would help a lot of folk!
Actually, I’ve thought of something! Migrating from poetry! It’s something I’ve been meaning to look at automating for a while now (I really don’t like poetry).
bormaj
This is a great resource, thank you for putting this together
oconnor663
> it's defined in both .python-version and pyproject.toml
The `requires-python` field in `pyproject.toml` defines a range of compatible versions, while `.python-version` defines the specific version you want to use for development. If you create a new project with uv init, they'll look similar (>=3.13 and 3.13 today), but over time `requires-python` usually lags behind `.python-version` and defines the minimum supported Python version for the project. `requires-python` also winds up in your package metadata and can affect your callers' dependency resolution, for example if your published v1 supports Python 3.[old] but your v2 does not.
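Concretely, a fresh uv init project today would contain something like this (versions illustrative):
In pyproject.toml:
[project]
name = "example"
requires-python = ">=3.13"
In .python-version:
3.13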
zahlman
> how to define a python version for a specific project (it's defined in both .python-version and pyproject.toml)
pyproject.toml is about allowing other developers, and end users, to use your code. When you share your code by packaging it for PyPI, a build backend (uv is not one, but they seem to be working on providing one - see https://github.com/astral-sh/uv/issues/3957 ) creates a distributable package, and pyproject.toml specifies what environment the user needs to have set up (dependencies and python version). It has nothing to do with uv in itself, and is an interoperable Python ecosystem standard. A range of versions is specified here, because other people should be able to use your code on multiple Python versions.
The .python-version file is used to tell uv specifically (i.e. nobody else) specifically (i.e., exact version) what to do when setting up your development environment.
(It's perfectly possible, of course, to just use an already-set-up environment.)
furyofantares
Same, although I think it doesn't support my idiosyncratic workflow. I have the same files synced (via Dropbox at the moment) on all my computers, macOS and Windows and WSL alike, and I just treat every computer like it's the same computer. I thought this might be a recipe for disaster when I started doing it years ago but I have never had problems.
Some stuff like npm or dotnet do need an npm update / dotnet restore when I switch platforms. At first attempt uv seems like it just doesn't really like this and takes a fair bit of work to clean it up when switching platforms, while using venvs was fine.
0cf8612b2e1e
I have never researched this, but I thought the .python-version file only exists to benefit other tools which may not have a full TOML parser.
zahlman
Read-only TOML support is in the standard library since Python 3.11, though. And it's based on an easily obtained third-party package (https://pypi.org/project/tomli/).
(If you want to write TOML, or do other advanced things such as preserving comments and exact structure from the original file, you'll want tomlkit instead. Note that it's much less performant.)
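The read-only case is only a few lines, e.g. this sketch (assumes pyproject.toml has a [project] table with requires-python set):
import sys
if sys.version_info >= (3, 11):
    import tomllib  # in the standard library since 3.11
else:
    import tomli as tomllib  # third-party backport for older Pythons
with open("pyproject.toml", "rb") as f:  # tomllib only accepts binary file objects
    data = tomllib.load(f)
print(data["project"]["requires-python"])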
gschizas
> there should be a defined path to switch from a requirements.txt based workflow to uv
Try `uvx migrate-to-uv` (see https://pypi.org/project/migrate-to-uv/)
tpoacher
What's going on? This whole thread reads like paid Amazon reviews.
indosauros
What's going on is "we have 14 standards so we need to create a 15th" actually worked this time
oblio
Occasionally the reviews match reality.
divbzero
If momentum for uv in the community continues, I’d love to see it distributed more broadly. uv can already be installed easily on macOS via Homebrew (like pyenv). uv can also be installed on Windows via WinGet (unlike pyenv). It would be nice to see it packaged for Linux as well.
mixmastamyk
$ dnf search --cacheonly uv
Matched fields: name (exact)
uv.x86_64: An extremely fast Python package installer and resolver, written in Rust
tyrion
Some years ago I thought it would be interesting to develop a tool to make a python script automatically install its own dependencies (like uvx in the article), but without requiring any other external tool, except python itself, to be installed.
The downside is that there are a bunch of seemingly weird lines you have to paste at the beginning of the script :D
If anyone is curious, it's on PyPI (pysolate).
dkdcio
Also this thing that never took off: https://github.com/fal-ai/isolate
Not quite the same but interesting!
It finally feels like Python scripts can Just Work™ without a virtualenv scavenger hunt.
Now if only someone could do the same for shell scripts. Packaging, dependency management, and reproducibility in shell land are still stuck in the Stone Ages. Right now it’s still curl | bash and hope for the best, or a README with 12 manual steps and three missing dependencies.
Sure, there’s Nix... if you’ve already transcended time, space, and the Nix manual. Docker? Great, if downloading a Linux distro to run sed sounds reasonable.
There's got to be a middle ground: simple, declarative, and built for humans.