Using uv and PEP 723 for Self-Contained Python Scripts

ratorx

Slightly off-topic, but the fact that this script even needs a package manager, in a language with a standard library as large as Python's, is pretty shocking. Making an HTTP request is pretty basic stuff for a scripting language; you shouldn't need or want a library for it.

And I'm not blaming the author; the standard library docs for the closest equivalent (urllib.request) even recommend using a third-party library (albeit not the one the author is using)!

> The Requests package is recommended for a higher-level HTTP client interface.

Especially for a language that has not cared too much about backwards compatibility historically, having an ergonomic HTTP client seems like table stakes.

diggan

> Making an HTTP request js pretty basic stuff for a scripting language, you shouldn’t need or want a library for it.

Sometimes languages/runtimes move slowly :) Speaking as a JS developer, this is how we made requests for a long time (before .fetch), inside the browser, which is basically made for making requests:

    var xhr = new XMLHttpRequest();
    xhr.open('POST', 'https://example.com', true);
    xhr.setRequestHeader('Content-type', 'application/x-www-form-urlencoded');
    xhr.onload = function () {
        console.log(this.responseText);
    };
    xhr.send('param=add_comment');
Of course, we quickly wanted a library for it; most of us ended up using jQuery.get() et al., since raw XHR wasn't comfortable, up until .fetch appeared (or various npm libraries, if you were an early nodejs adopter).
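
For comparison, a sketch of the modern equivalent with fetch (same hypothetical endpoint as the XHR example above):

    fetch('https://example.com', {
        method: 'POST',
        headers: { 'Content-type': 'application/x-www-form-urlencoded' },
        body: 'param=add_comment',
    })
        .then((response) => response.text()) // read the body as text
        .then((text) => console.log(text));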

d4mi3n

This takes me back. I'm glad `fetch` has become the canonical way to do this. XHR was a new capability at the time, but back then we were just starting to learn about all the nasty things people could do by maliciously issuing XHR requests and/or loading random executables onto the page. Clickjacking was all the rage and nothing equivalent to Content Security Policy existed at the time.

ratorx

I don’t think it’s just slowness or stability. The original release of requests was in 2011 and the standard library module (urllib.request) was added in Python 3.3 in 2012.

masklinn

It’s way older than that. It used to live in urllib2, which dates back to at least Python 2.1, released in April 2001.

gorgoiler

It has two! — http.client and urllib.request — and they are really usable.

Lots of people just like requests though as an alternative, or for historical reasons, or because of some particular aspect of its ergonomics, or to have a feature they’d rather have implemented for them than have to write in their own calling code.

At this stage it’s like using jQuery just to find an element by css selector (instead of just using document.querySelector.)
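
For a trivial GET, either of them is only a few lines. A sketch (the host is a stand-in):

    # urllib.request: the high-level one
    import json
    from urllib.request import urlopen

    with urlopen("https://api.example.com/data") as resp:  # hypothetical URL
        data = json.load(resp)

    # http.client: lower level, but perfectly serviceable
    import http.client

    conn = http.client.HTTPSConnection("api.example.com")
    conn.request("GET", "/data")
    data = json.loads(conn.getresponse().read())
    conn.close()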

WesolyKubeczek

Requests used to have good PR back in the day and ended up entrenched as a transitive dependency for a lot of things. Because it’s for humans, right?

But recently I had to do something using raw urllib3, and you know what? It’s just as ergonomic.

masklinn

That’s pretty much irrelevant given urllib3 is a third party dependency as well.

ratorx

Sure, historical popularity is a good reason for people who are already familiar with it to keep using it.

That doesn't really explain why the standard library docs for the clients you mentioned link to requests, though (especially if those clients were actually good, rather than just legacy). If they really were good, why would the standard library itself suggest something else?

imtringued

They could have used a database driver for MySQL, PostgreSQL, or MongoDB for a more realistic example (very common for sysadmin-type scripts that are used once and then thrown away), and your complaint would be invalid. But then you'd have to set up the database, and the example would no longer fit a quick blog post that gives you the opportunity to just copy-paste the code and run it for yourself.

ratorx

Well, the "this script needs a package manager" part, at least. The rest of my comment about the state of the HTTP client in Python would still be valid (though I probably wouldn't have discovered it).

mkesper

The standard library gives you no way to make async HTTP requests; that's what httpx provides. As Python now relies heavily on async, this is a real shortcoming.

masklinn

There’s absolutely no need for async http here. The script does one http request at the top of main. And a trivial one too (just a simple GET).

    response = urlopen(url)
    return json.load(response)
is what they’re saving themselves from.

fullstop

In that case, sure, but if you have an entire async framework you don't want that blocking call.

For as much as Python is embracing async / coroutines, I'm surprised that the standard library's HTTP functions don't support it yet.

gabrielsroka

I talked about that in my readme https://github.com/gabrielsroka/r

masklinn

requests is really useful for non-trivial http requests (especially as urllib has terrible defaults around REST style interactions).

But here all the script is doing is a trivial GET, that's

    urllib.request.urlopen(url)
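
The non-trivial case is where requests earns its keep. A sketch of a JSON POST (hypothetical endpoint), where urllib gets verbose:

    import json
    from urllib.request import Request, urlopen

    req = Request(
        "https://api.example.com/items",  # hypothetical endpoint
        data=json.dumps({"name": "widget"}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urlopen(req) as resp:
        created = json.load(resp)

    # vs. requests:
    # created = requests.post(url, json={"name": "widget"}).json()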

jgalt212

I agree. Requests is an embarrassment for, and an indictment of, the Python standard library. And so is dataclasses; they should just have subsumed attrs.

frfl

Anyone use PEP 723 + uv with an LSP-based editor? What's your workflow? I looked into it briefly; the only thing I found after a lot of digging was to use `uv sync --script <script file>`, get the venv path from the output of that command, and then activate that venv or specify it in your editor. Is there any other way? What I describe above seems a bit hacky, since `sync` isn't meant to provide the venv path specifically; it just happens to display it.

Edit: I posted this comment before reading the article. Having read it now, I see the author had a similar question but apparently didn't find the same workaround using the `sync` output that I mention above. If the author sees this, maybe they can update the article to mention it, if that's helpful.

JimDabell

uv v0.6.10 has just been released with a more convenient way of doing this:

    uv python find --script foo.py
https://github.com/astral-sh/uv/releases/tag/0.6.10

https://docs.astral.sh/uv/reference/cli/#uv-python-find--scr...

BoumTAC

How does it work? How does it find the environment?

Let's say I have a project in `/home/boumtac/dev/myproject` with the venv inside.

If I run `uv python find --script /home/boumtac/dev/myproject/my_task.py`, will it find the venv?

JimDabell

The philosophy of uv is that the venv is ephemeral; creating a new venv should be fast enough that you can do it on demand.

Do you have a standalone script or do you have a project? --script is for standalone scripts. You don’t use it with projects.

If you tell it to run a standalone script, it will construct the venv itself on the fly in $XDG_CACHE_HOME.

If you have a project, then it will look in the .venv/ subdirectory by default and you can change this with the $UV_PROJECT_ENVIRONMENT environment variable. If it doesn’t find an environment where it is expecting to, it will construct one.
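
Concretely, a sketch of the three cases (paths are illustrative):

    # standalone script: env built on the fly, cached under $XDG_CACHE_HOME
    uv run --script foo.py

    # project: uses ./.venv by default, creating it if missing
    uv run main.py

    # project with a relocated environment
    UV_PROJECT_ENVIRONMENT=/srv/envs/myproj uv run main.py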

lynx97

Thanks! Came here to ask how to make pyright work with uv scripts...

    pyright --pythonpath $(uv python find --script foo.py) foo.py

networked

My general solution to project management problems with PEP 723 scripts is to develop the script as a regular Python application that has `pyproject.toml`. It lets you use all of your normal tooling. While I don't use an LSP-based editor, it makes things easy with Ruff and Pyright. I run my standard Poe the Poet (https://poethepoet.natn.io/) tasks for formatting, linting, and type checking as in any other project.

One drawback of this workflow is that by default, you duplicate the dependencies: you have them both in the PEP 723 script itself and `pyproject.toml`. I just switched a small server application from shiv (https://github.com/linkedin/shiv) to inline script metadata after a binary dependency broke the zipapp. I experimented with having `pyproject.toml` as the single source of truth for metadata in this project. I wrote the following code to embed the metadata in the script before it was deployed on the server. In a project that didn't already have a build and deploy step, you'd probably want to modify the PEP 723 script in place.

  #! /usr/bin/env python3
  # License: https://dbohdan.mit-license.org/@2025/license.txt
  
  import re
  import tomllib
  from pathlib import Path
  from string import Template
  
  import tomli_w
  
  DEPENDENCIES = "dependencies"
  PROJECT = "project"
  REQUIRES_PYTHON = "requires-python"
  
  DST = Path("bundle.py")
  PYPROJECT = Path("pyproject.toml")
  SRC = Path("main.py")
  
  BUNDLE = Template(
      """
  #! /usr/bin/env -S uv run --quiet --script
  # /// script
  $toml
  # ///
  
  $code
  """.strip()
  )
  
  
  def main() -> None:
      with PYPROJECT.open("rb") as f:
          pyproject = tomllib.load(f)
  
      toml = tomli_w.dumps(
          {
              DEPENDENCIES: pyproject[PROJECT][DEPENDENCIES],
              REQUIRES_PYTHON: pyproject[PROJECT][REQUIRES_PYTHON],
          },
          indent=2,
      )
  
      code = SRC.read_text()
      code = re.sub(r"^#![^\n]+\n", "", code)
  
      bundle = BUNDLE.substitute(
          toml="\n".join(f"# {line}" for line in toml.splitlines()),
          code=code,
      )
  
      DST.write_text(bundle)
  
  
  if __name__ == "__main__":
      main()

zahlman

If you already have a pyproject.toml, and a "build and deploy step", why not just package normally? PEP 723 was developed for the part of the Python world that doesn't already live on PyPI (or a private package index).

networked

I probably should! My motivation early into the project was to try different ways to distribute a TUI app in Python and see how practical they were.

I started with the most self-contained, Nuitka. I quickly switched to building a zipapp with shiv because it was faster and cross-platform if you avoided binary dependencies. I wanted to be able to share a Python application with others easily, especially with technical users who weren't heavily into Python. PEP 723 added the ability for those hypothetical users to inspect the app and modify it lightly with minimum effort. But since I am still the sole user, I can just build a wheel and install it with `uv tool install` on the server.

skeledrew

I'm generally not a fan of the incremental rustification of the Python ecosystem, but I started using uv a few weeks ago just for this particular case and have been liking it, to the point where I'm considering migrating my full projects from their current conda+poetry flow as well. Just a couple of days ago I also modified a script I've been using for a few years to patch pylsp, so it can now see uv script envs using the "uv sync --dry-run --script <path>" hack.

ratorx

Out of curiosity, what are some problems with rustification? Is it an aversion to Rust specifically or a dislike of the ecosystem tools not being written in Python?

The former is subjective, but the latter doesn't seem like much of an issue, given that the language itself is implemented in C.

zahlman

Speaking for myself:

I have no aversion to Rust (I've read some of it, and while foreign, it comes across as much more pleasant than C or C++), but the way it's promoted often is grating. I'm getting really tired in particular of how the speed of Rust is universally described as "blazing", and how "written in Rust" has a sparkle emoji as mandatory punctuation. But maybe that's just because I'm, well, older than Python itself.

I don't really care that the reference implementation isn't self-hosting (although it's nice that PyPy exists). Using non-Python for support (other than IDEs - I don't care about those and don't see a need to make more of them at all) is a bit grating in that it suggests a lack of confidence in the language.

But much more importantly, when people praise uv, they seem to attribute everything they like about it to either a) the fact that it's written in Rust or b) the fact that it's not written in Python, and in a lot of cases it just doesn't stand up to scrutiny.

uv in particular is just being compared to a low bar. Consider: `pip install` without specifying a package to install (which will just report an error that you need to specify a package) on my machine takes almost half a second to complete. (And an additional .2 seconds with `--python`.) In the process, it imports more than 500 modules. Seriously. (On Linux you can test it yourself by hacking the wrapper script. You'll have to split the main() call onto a separate line to check in between that and sys.exit().)
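
A rough way to reproduce the measurement (a sketch; numbers will vary by machine):

    # time the no-op error path (pip prints an error and exits)
    time pip install

    # count imports on that same path; -X importtime logs each import to stderr
    python -X importtime -m pip install 2>&1 | grep -c 'import time:'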

skeledrew

It's more the latter, particularly when Rust is used in libraries (e.g. FastAPI) as opposed to tools, as it's destroying portability. For example, I use flet[0] in some of my projects, and I have to be increasingly careful about the other dependencies, as there is no support for the Rust toolchain within Dart/Flutter, and even if there was, it still sounds like it'd be a nightmare to maintain. The same applies to any other platforms/apps out there that support running Python for flexibility, where handling another language is just way out of scope (and I'm pretty sure there are quite a few). A key part of Python's existence is as glue between disparate system parts, and rustification is reducing its usefulness in an increasing number of niche cases where it once excelled.

[0] https://flet.dev

NeutralForest

I can understand the sentiment somewhat. It's another layer of complexity and it makes working on projects more difficult. The fact pip or mypy code is all Python makes it much easier to interact with and patch if needed.

You can also write Cython for more perf-oriented code, but I can totally understand the value Rust brings to the table. It's just now another language you'll need to know or learn, with more layers like maturin or PyO3, while cffi is just there.

All the tooling coming from Astral is amazing and I use it every day, but I can see the increasing complexity of our toolchains: not in ergonomics (those are much better now), but in the tools themselves.

biorach

> The fact pip or mypy code is all Python makes it much easier to interact with and patch if needed

But how often in your career have you actually done this?

null

[deleted]

WesolyKubeczek

A problem with rustification is that it puts a giant ecosystem on top of another giant ecosystem, with poorly matched tooling. C has a lot of home-ground advantage, and CPython is built on it.

Then you have PyPy which you’d have to accommodate somehow.

It doesn't help that when you have to build everything from source, the Rust build toolchain currently needs Python. That sure would make bootstrapping a bitch if Python and Rust became circular dependencies of one another.

masklinn

> Then you have PyPy which you’d have to accommodate somehow.

Adding pypy support to a pyo3 + maturin project was literally just a matter of telling maturin to build that wheel. And I added graal while at it.

Hopefully they eventually add stable ABI support too so I don’t have to add individual pypy/graal wheel targets.

Or pyo3 and maturin may support hpy once that’s stable.

htunnicliff

> I also modified a script I've been using for a few years to patch pylsp so it can now see uv script envs using the "uv sync --dry-run --script <path>" hack.

This sounds like a really useful modification to the LSP for Python. Would you be willing to share more about how you patched it and how you use it in an IDE?

skeledrew

I have a somewhat particular setup where I use conda to manage my envs, and autoenv[0] to ensure the env for a given project is active once I'm in the folder structure. So there's a .env file containing "conda activate <env_name>" in each. I also use Emacs as my sole IDE, but there are quite a few instances where its support falls short for modern workflows. I use the pylsp language server, and it's only able to provide completions, etc. for the standard library, since by default it doesn't know how to find the envs containing extra third-party packages.

And so I wrote a patcher[1] that searches the project folder and parents until it finds an appropriate .env file, and uses it to resolve the path to the project's env. With the latest changes to the patcher it now uses the output from "uv sync", which is the path to a standalone script's env, as well as the traditional "source venv_path/bin/activate" pattern to resolve envs for uv-managed projects.

[0] https://github.com/hyperupcall/autoenv [1] https://gitlab.com/-/snippets/2279333

oulipo

What's the --dry-run hack?

skeledrew

Using "--dry-run" makes the command a no-op, but still prints the env path.
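
i.e. something like this to resolve a script's env without mutating anything (a sketch; the grep assumes the path appears on a line mentioning the environment):

    uv sync --dry-run --script my_script.py 2>&1 | grep -i environment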

alkh

Bonus points for the "Bonus: where does uv install its virtual environments?" section! I'd been wondering the same thing for a long time but hadn't had a chance to dig in. It's great that the venv isn't recreated unless the dependencies or Python version change.

thisdavej

Thanks for the positive feedback! I was curious too and thought others would enjoy hearing what I learned.

sorenjan

You can also run `uv cache dir` to show the location.

stereo

I used to have a virtual environment for all my little scrappy scripts, which would contain libraries I use often like requests, rich, or pandas. I now exclusively use this type of shebang and dependency declaration. It also makes running throwaway ChatGPT scripts a lot easier, especially if you put PEP 723 instructions in your custom prompt.
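
For reference, the pattern in question (a sketch; the dependency list is just the libraries mentioned above):

    #!/usr/bin/env -S uv run --script
    # /// script
    # requires-python = ">=3.12"
    # dependencies = ["requests", "rich", "pandas"]
    # ///

    import requests  # uv resolves these into an ephemeral env at run time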

__float

This was discussed somewhat recently in https://news.ycombinator.com/item?id=42855258

sireat

This is a neat writeup on the use of uv, but it doesn't solve the "how do I give a self-contained script to grandma" problem.

Now anyone you give your script to has to install uv first.

the_mitsuhiko

> This is a neat writeup on the use of uv, but it doesn't solve the "how do I give a self-contained script to grandma" problem.

Not at the moment, but will your grandma run a script? There is an interesting thing you can already do today for larger applications, which is to install uv alongside your app. You can make a curl-to-bash thing or similar that first installs uv into a program-specific location and then uses it to bootstrap your program. Is it a good idea? I don't know, but you can do it.
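
A sketch of that bootstrap idea, assuming the standalone installer honors an install-dir override (the app paths are hypothetical):

    # install uv into an app-private location, then run the app through it
    export UV_INSTALL_DIR="$HOME/.myapp/bin"   # hypothetical location; assumes the installer reads this
    curl -LsSf https://astral.sh/uv/install.sh | sh
    "$UV_INSTALL_DIR/uv" run --script "$HOME/.myapp/myapp.py"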

alanfranz

For simple scripts (I never succeeded in using it on something really complex, but it's great when you don't want to use bash but need something like Python), I've used this approach; it still works today, has no uv dependency, and only requires pip to be installed in the same Python interpreter that you're using to run your script:

https://www.franzoni.eu/single-file-editable-python-scripts-...

renewiltord

You can write a bash shebang that curl into shell. Unfortunately when I did it and gave to grandma it has failed because grandma has substituted oil shell and linked it as sh which is not best practice. I think grandmother shell script is simply impossible. They have spent decades acquiring idiosyncratic unix environment

ElectricalUnion

Good thing that these days init is systemd instead of a series of de jure POSIX shell scripts, but de facto bash scripts, that would fail to boot if you swapped /bin/sh away from bash.

At least Ubuntu helped force the ecosystem to pretend to support /bin/dash too.

denzil

For this case, it might be easier to package the script using pyinstaller. That way, she can just run it. Packaging it that way is more work on your side though.

oezi

I think uv should become a default package for most operating systems.

globular-toast

It automatically downloads interpreters from the internet. It's a security nightmare. It can be configured not to do that, but that's not the default.
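
For reference, opting out looks something like this (a sketch, assuming uv's `python-downloads` setting):

    # uv.toml (or the [tool.uv] table in pyproject.toml)
    python-downloads = "never"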

benrutter

I'm not sure that's fair. It downloads standalone builds which Astral themselves maintain. I'd say they're pretty trustworthy.

If you're worried about installing code from internet sources, which I think is valid, then pip/uv/package managers in general open that can of worms anyway.

rented_mule

It's a package manager. The job of package managers is to download code that you then run. That certainly has security implications, but that doesn't differentiate uv from pip, Poetry, Cargo, CPAN, npm, RubyGems, ...

oulipo

Well, you can just give her the `./install_uv.sh && ./run_script.sh` command, e.g.

`( curl -LsSf https://astral.sh/uv/install.sh | sh ) && ./run_script.sh`

imtringued

Next up: uv competitor compiled with cosmopolitan libc.

sorenjan

You don't need to run the script as `py wordlookup.py` or make a batch file `wordlookup.cmd` in Windows.

The standard Python installation in Windows installs the py launcher and sets it as the default file handler for .py (and .pyw). So if you try to run `wordlookup.py` Windows will let the py launcher handle it. You can check this with `ftype | find "Python"` or look in the registry.

You can make it even easier than that, though. If you add .py to the PATHEXT environment variable, you can run .py files without typing the .py extension, just like .exe and .bat.
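
A sketch of checking and extending this in cmd (the PATHEXT change lasts for the current session only; set it in System Properties to persist):

    :: confirm the .py handler registration (ProgID may differ per install)
    assoc .py
    ftype Python.File

    :: add .py to PATHEXT for this session
    set PATHEXT=%PATHEXT%;.PY

    :: now this works without typing the extension
    wordlookup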

silvanocerza

While uv is amazing, this is not a unique feature of the project.

Hatch has had this feature for a year or so too. https://hatch.pypa.io/latest/how-to/run/python-scripts/

smitty1e

As mentioned in the article, along with PDM.

ivh

I have also been switching to uv recently, frequently with --script, and I love it. What I haven't figured out yet, though, is how to integrate it with VS Code's debugger to run the script with F5. It seems to insist on running what it thinks is the right Python, not respecting the shebang.

alanfranz

Shameless plug for an old approach I use for various scripts when I think bash is not enough:

https://www.franzoni.eu/single-file-editable-python-scripts-...

This doesn't require uv, just pip within the same interpreter, but I wouldn't use it for something big, and it still requires deps to be updated every now and then, of course (I never tried it with unpinned deps; I always pin dependencies).

networked

Oh hey, I have seen your post. Making a script download the dependencies on its own is an interesting challenge. I am a big fan of inline script metadata in Python, and I was an early adopter when pipx implemented PEP 722 (the precursor to PEP 723), but I made my version for fun.

https://pip.wtf/ was on HN not that long ago (https://news.ycombinator.com/item?id=38383635). I had my own take on it that used virtual environments, supported Windows, and was under a free license: https://github.com/dbohdan/pip-wtenv.

cjs_ac

Speaking as someone who writes Python code for a living, I like the language, but I consider the ecosystem dire. No one seems able to propose a solution to the problem of 'how do I call someone else's code?' that isn't yelling 'MOAR PACKAGE MANAGERS' in their best Jeremy Clarkson impression.

I have no idea how any of it works and I see no point in learning any of it because by the time I've worked it out, it'll all have changed anyway.

At work, there are plenty of nutjobs who seem to enjoy this bullshit, and as long as following the instructions in the documentation allows me to get the codebase running on my machine, I don't have to deal with any of it.

At home, I refuse to use any Python package that isn't in the Debian repositories. Sure, it's all 'out of date', but if your package pushes breaking changes every fortnight, I'm not interested in using it anyway.

If people are still talking about how great uv is in five years' time, maybe I'll give it a go then.

IshKebab

I totally agree, but uv is the real deal. It's not another Poetry, Pipenv, etc.

uv takes Python infra from "jesus this is awful" to "wow this is actually quite nice". It is a game changer.

You should really try it now. Waiting 5 years is just needless self-flagellation.

IMO the only risk is that Astral run out of money, but given how dire the situation without uv is, I'd say it's easily worth the risk.

imtringued

The Python ecosystem will catch up. Before Bambu Lab, a lot of 3D printer companies produced garbage printers; after Bambu Lab, every 3D printer company has copied their printers almost 1:1, implying that they were selling garbage all those years, because they had no trouble catching up with Bambu Lab the moment they had to (to stay relevant).

stickfu

Not worth trying to drag folks with this mindset into the future. The way I see his workflow (and I do get it; I'm stubborn with some financial stuff) is the same way he sees using uv and other new stuff. I agree uv is the real deal and will be around for a while. It has totally reignited my love of writing Python. I will say, the love of uv on Hacker News has surprised me; I was expecting a lot more replies like theirs.

IshKebab

Yeah, me too. HN tends to be quite stuck-in-the-mud heavy (e.g. you often see this in discussions around Rust).

Tbf I kind of understand his point of view: there have been many, many failed attempts to fix Python tooling, and it's easy to expect uv to be just another failed attempt.

I think it says a lot about just how bad the situation before uv was that even HN is positive about it.

Terretta

Speaking as someone who enjoys reading after dark, I like lanterns, but I consider the ecosystem dire. No one seems able to propose a solution to the problem of how I keep this lantern lit all night without soot and fuel on my hands. At home I refuse to try any fuel that I can't get from the meatpacker's leftovers anyway. If people are still talking about electricity and bulbs in five years' time, maybe I'll give it a go then.

VagabundoP

I've used plenty, but uv is basically a one-stop shop with a logical workflow.
It has sane defaults, so really I'd recommend most people just use it for everything, unless they have some very specific reason not to.
It has sane defaults so really I'd recommend most people just use it for everything, unless they some very specific reasons not to.