Switching Pip to Uv in a Dockerized Flask / Django App
135 comments
·June 24, 2025
j4mie
slau
uv and its flexibility are an absolute marvel. Where pip took 10 minutes, uv can handle it in 20-30s.
ljm
It’s an absolute godsend. I thought poetry was a nice improvement but it had its flaws as well (constant merge conflicts in the lock file in particular).
Uv works more or less the same as I’m used to with other tooling in Ruby, JS, Rust, etc.
smeeth
+1, this is the exact reason I started using uv. Extremely convenient.
For some reason uv pip has been very slow, however. Unsure why, might be my org doing weird network stuff.
greenavocado
Or very difficult package spec
politelemon
Doesn't it store the Python version in pyproject.toml though? Is the python version file needed?
JimDabell
It’s not:
> uv will respect Python requirements defined in requires-python in the pyproject.toml file during project command invocations. The first Python version that is compatible with the requirement will be used, unless a version is otherwise requested, e.g., via a .python-version file or the --python flag.
— https://docs.astral.sh/uv/concepts/python-versions/#project-...
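Concretely, the two places a version can come from might look like this (illustrative values, not from the article):

```toml
# pyproject.toml -- a *range* that any chosen interpreter must satisfy
[project]
requires-python = ">=3.11"
```

while a .python-version file containing just `3.12` would pin one exact interpreter within that range.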
politelemon
cheers
gchamonlive
# Ensure we always have an up to date lock file.
if ! test -f uv.lock || ! uv lock --check 2>/dev/null; then
uv lock
fi
Doesn't this defeat the purpose of having a lock file? If it doesn't exist or if it's invalid something catastrophic happened to the lock file and it should be handled by someone familiar with the project. Otherwise, why have a lock file at all? The CI will silently replace the lock file and cause potential confusion.
nickjj
Hi author here.
If you end up with an invalid lock file, it doesn't silently fail and move on with a generated lock file.
The --check flag ensures it's valid and up to date. If it's not then the command fails on the spot and since `set -o errexit` is set in the shell script, all execution stops.
For example, I made my lock file invalid by manually switching one of the dependencies to a version that doesn't match the expected SHA.
Then I ran the same script you partially quoted and it yields this error which blocks the build and gives a meaningful message that a human can react to:
1.712 Using CPython 3.13.3 interpreter at: /usr/local/bin/python3
1.716 error: Failed to parse `uv.lock`
1.716 Caused by: The entry for package `amqp` v5.3.4 has wheel `amqp-5.3.1-py3-none-any.whl` with inconsistent version: v5.3.1
------
failed to solve: process "/bin/sh -c chmod 0755 bin/* && bin/uv-install" did not complete successfully: exit code: 2
As for a missing lock file, yep it will generate one but we want that. The expectation there is we have nothing to base things off of, so let's generate a fresh one and use it moving forward. The human expectation in a majority of the cases is to generate one in this spot.
remram
Yes this is a major bug in the process. I came to the comments to say this as well.
They say this but do the exact opposite as you point out:
> The --frozen flag ensures the lock file doesn’t get updated. That’s exactly what we want because we expect the lock file to have a complete list of exact versions we want to use for all dependencies that get installed.
silvester23
This is actually covered by the --locked option that uv sync provides.
If you do `uv sync --locked` it will not succeed if the lock file does not exist or is out of date.
Edit: I slightly misread your comment. I strongly agree that having no lock file or a lockfile that does not match your specified dependencies is a case where a human should intervene. That's why I suggest you should always use the --locked option in your build.
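In a Docker build, that fail-fast behaviour might look like this sketch (base image, tags, and paths are illustrative assumptions, not taken from the article):

```dockerfile
FROM python:3.13-slim

# uv binary copied in from the official image, per uv's Docker guide.
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv

WORKDIR /app

# Copy only the dependency metadata so this layer stays cacheable.
COPY pyproject.toml uv.lock ./

# --locked aborts the build if uv.lock is missing or out of sync
# with pyproject.toml, instead of silently regenerating it.
RUN uv sync --locked --no-install-project
```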
freetonik
In the Python world, I often see lockfiles treated as one "weird step in the installation process", and not committed to version control.
slau
In my experience, this is fundamentally untrue. pip-tools has extensive support for recording the explicit version numbers, package hashes and whatnot directly in the requirements.txt based on requirements.in and constraints files.
There are many projects that use pip-compile to lock things down. You couldn’t use python in a regulated environment if you didn’t. I’ve written many Makefiles that explicitly forbid CI from ever creating or updating the actual requirements.txt. It has to be reviewed by a human, or more.
MrJohz
There are lots of tools that allow you to generate what are essentially lock files. But I think what the previous poster is saying is that most people either don't use these tools or don't use them correctly. That certainly matches my experience, where I've seen some quite complicated projects get put into production without any sort of dependency locking whatsoever - and where I've also seen the consequences of that where random dependencies have upgraded and broken everything and it's been almost impossible to figure out why.
To me, one of the big advantages of UV (and similar tools) is that they make locked dependencies the default, rather than something you need to learn about and opt into. These sorts of better defaults are sorely needed in the Python ecosystem.
Hasnep
They're not saying that's how it's supposed to be used, they're saying that's how it's often used by people who are unfamiliar with lock files
burnt-resistor
In almost every ecosystem, Ruby and elsewhere too, constraints in library package metadata are supposed to express the full range of supported versions, while lock files represent the current specific state. That's why they're not committed in that case, to allow greater flexibility/interoperability for downstream users.
For applications, it's recommended (but still optional) to commit lock files so that very specific and consistent dependencies are maintained to prevent arbitrary, unsupervised package upgrades leading to breakage.
MrJohz
I know Cargo recommended your approach for a while, but ended up recommending that all projects always check in a lock file. This is also the norm in most other ecosystems I've used including Javascript and other Python package managers.
When you're developing a library, you still want consistent, reproducible dependency installs. You don't want, for example, a random upgrade to a testing library to break your CI pipelines or cause delays while releasing. So you check in the lock file for the people working on the library.
But when someone installs the library via a package manager, that package manager will ignore the lock file and just use the constraints in the package metadata. This avoids any interoperability issues for downstream users.
I've heard of setups where there are even multiple lock files checked in so different combinations of dependency can be tested in CI, but I've not seen that in practice, and I imagine it's very much dependent on how the ecosystem as a whole operates.
bckr
This is kinda how I treat it. I figured that I have already set the requirements in the pyproject.toml file.
Should I be committing the lock file?
gcarvalho
If your pyproject.toml does not list all your dependencies (including dependencies of your dependencies) and a fixed version for each, you may get different versions of the dependencies in future installs.
A lock file ensures all installations resolve the same versions, and the environment doesn’t differ simply because installations were made on different dates. Which is usually what you want for an application running in production.
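Abridged sketch of what a lock file records per package (field names follow uv.lock's TOML layout from memory and may not match exactly; the point is that transitive dependencies and exact versions get pinned):

```toml
version = 1
requires-python = ">=3.11"

[[package]]
name = "flask"
version = "3.0.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "jinja2" },
    { name = "werkzeug" },
]
```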
oceansky
It's what I used to do with package-lock.json when I had little production experience.
9dev
What are the possible remediation steps, however? If there is no lock file at all, this is likely the first run, or it will be overwritten from a git upstream later on anyway; if it's broken, chances are high someone messed up a package installation and creating a fresh lock file seems like the only sensible thing to do.
I also feel like this handles rare edge cases, but it seems like a pretty straightforward way to do so.
stavros
If there's no lock file at all, you haven't locked your dependencies, and you should just install whatever is current (don't create a lockfile). If it's broken, you have problems, and you need to abort the deploy.
There is never a reason for an automated system to create a lockfile.
ealexhudson
The reason is simple: it allows you to do the install using "sync" in all cases, whether the lockfile exists or not.
Where the lockfile doesn't exist, it creates it from whatever current is, and the lockfile then gets thrown away later. So it's equivalent to what you're saying, it just avoids having two completely separate install paths. I think it's the correct approach.
ufmace
IMO, this is the process for building an application image for deployment to production. If the lock file is not present, then the developer has done something wrong and the deployment should fail catastrophically because only manual intervention by the developer can fix it correctly.
JimDabell
If the lock file is missing the only sensible thing to do is require human intervention. Either it’s the unusual case of somebody initialising a project but never syncing it, or something has gone seriously wrong – with potential security implications. The upside to automating this is negligible and the downside is large.
guappa
? It has always been the case that if you don't specify a version, the latest is implied.
globular-toast
The fix is to generate the lockfile and commit it to the repository. Every build should be based on the untouched lockfile from the repo. It's the entire point of it.
ericfrederich
I am totally against Python tooling being written in a language other than Python. I get that C extensions exist and for the most part Python is synonymous with CPython.
I think 2 languages are enough, we don't need a 3rd one that nobody asked for.
I have nothing against Rust. If you want a new tool, go for it. If you want a re-write of an existing tool, go for it. I'm against it creeping into an existing eco-system for no reason.
A popular Python package called Pendulum went over 7 months without support for 3.13. I have to imagine this is because nobody in the Python community knew enough Rust to fix it. Had the native portion of Pendulum been written in C I would have fixed it myself.
https://github.com/python-pendulum/pendulum/issues/844
In my ideal world if someone wanted fast datetimes written in Rust (or any other language other than C) they'd write a proper library suitable for any language to consume over FFI.
So far this Rust stuff has left a bad taste in my mouth and I don't blame the Linux community for being resistant.
ufmace
I appreciate this perspective, but I think building a tool like uv in Rust is a good idea because it's a tool for managing Python stuff, not a tool to be called from within Python code.
Having your python management tools also be written in python creates a chicken-and-egg situation. Now you have to have a working python install before you can start your python management tool, which you are presumably using because it's superior to managing python stuff any other way.

Then you get a bunch of extra complex questions: what python version and specific executable is this management tool using? Is the actual code you're running using the same or a different one? How about the dependency tree? What's managing the required python packages for the installation that the management tool is running in? How do you know that the code you're running is using its own completely independent package environment? What happens if it isn't, and there's a conflict between a package or version your app needs and what the management tool needs? How do you debug and fix it if any of this stuff isn't actually working quite how you expected?
Having the management tool be a compiled binary you can just download and use, regardless of what language it was written in, blows up all of those tricky questions. Now the tool actually does manage everything about python usage on your system and you don't have to worry about using some separate toolchain to manage the tool itself and whether that tool potentially has any conflicts with the tool you actually wanted to use.
sgarland
Python is my favorite language, but I have fully embraced uv. It’s so easy, and so fast, that there is nothing else remotely close.
Need modern Python on an ancient server running with EOL’d distro that no one will touch for fear of breaking everything? uv.
Need a dependency or two for a small script, and don’t want to hassle with packaging to share it? uv.
That said, I do somewhat agree with your take on extensions. I have a side project I’ve been working on for some years, which started as pure Python. I used it as a way to teach myself Python’s slow spots, and how to work around them. Then I started writing the more intensive parts in C, and used ctypes to interface. Then I rewrote them using the Python API. I eventually wrote so much of it in C that I asked myself why I didn’t just write all of it in C, to which my answer was “because I’m not good enough at C to trust myself to not blow it up,” so now I’m slowly rewriting it in Rust, mostly to learn Rust. That was a long-winded way to say that I think if your external library functions start eclipsing the core Python code, that’s probably a sign you should write the entire thing in the other language.
moolcool
> I am totally against Python tooling being written in a language other than Python
I will be out enjoying the sunshine while you are waiting for your Pylint execution to finish
throwawaysleep
Linting is the new "compiling!"
carlhjerpe
Linting and type checking are very CPU intensive tasks so I would excuse anyone implementing those types of tools in $LANG where using all CPU juice matters.
I can't help but think uv is fast not because it's written in Rust but because it's a fast reimplementation. Dependency solving in the average Python project is hardly computationally expensive, it's just downloading and unpacking packages with a "global" package cache. I don't see why uv couldn't have been implemented in Python and be 95% as fast.
Edit: Except implementing uv in Python requires shipping a Python interpreter, kinda defeating some of its purpose of being a package manager able to install Python as well.
nonethewiser
You also have to factor in startup time and concurrency. Caching and SAT solvers can't get Python to 95% of uv.
nonethewiser
>I am totally against Python tooling being written in a language other than Python. I get that C extensions exist and for the most part Python is synonymous with CPython.
>I think 2 languages are enough, we don't need a 3rd one that nobody asked for.
Enough for what? The uv users don't have to deal with that. Most ecosystems use a mix of languages for tooling. It's not a detail the user of the tool has to worry about.
>I'm against it creeping into an existing eco-system for no reason.
It's much faster, because it's not written in Python.
The tooling is for the user. The language of the tooling is for the developer of the tooling. These don't need to be the same people.
The important thing is whether the tool solves a real problem in the ecosystem (it does). Do people like it?
kzrdude
> I think 2 languages are enough, we don't need a 3rd one that nobody asked for.
Look at the number of stars ruff and uv got on GitHub. That's a meteoric rise. They were validated with ruff and continued with uv; this we can call "was asked for".
> I'm against it creeping into an existing eco-system for no reason.
It's not for no reason. A lot of other things have been tried. It's for big reasons: good performance, and secondly, independence from Python is a feature. When your Python-managing tool does not depend on Python itself, it simplifies some things.
Gabrys1
I, on the other hand, don't care what language the tools are written in.
I do get the sentiment that a user of these tools, being a Python developer could in theory contribute to them.
But, if a tool does its job, I don't care if it's not "in Python". Moreover, I imagine there is a class of problems with the Python environment setup that would break the very tool that could help you fix them, if that tool itself is written in Python.
HelloNurse
It is well known, and not Python-specific, that using a different language/interpreter for development tools eliminates large classes of bootstrapping complications and conflicts.
If there are two versions of X, it becomes possible to use the wrong one.
If a tool to manage X depends on X, some of the changes that we would like the tool to perform are more difficult, imperfect or practically impossible.
greener_grass
Rust offers a feature-set that neither Python nor C has. If Rust is the right tool for the job, I would rather the code be written in Rust. Support has more to do with incentive structures than implementation language.
bodge5000
In theory, I can get behind what you're saying, but in practice I just haven't found any package manager written in Python to be as good as uv, and I'm not even talking about speed. uv as I like it could be written in Python, but it hasn't been.
RamblingCTO
I really dig rye, have you tried that?
gschizas
rye is also written in Rust and it's being replaced by uv.
From its homepage: https://rye.astral.sh/
> If you're getting started with Rye, consider uv, the successor project from the same maintainers.
> While Rye is actively maintained, uv offers a more stable and feature-complete experience, and is the recommended choice for new projects.
> Having trouble migrating? Let us know what's missing.
Kwpolska
It's also Rust.
ndr
PSA careful replacing `pip` with `uv` thinking it's a drop-in replacement.
By default `uv` won't generate `pyc` files which might make your service much slower to start.
See https://docs.astral.sh/uv/reference/settings/#pip_compile-by...
elyall
uv's guide for use in containers is a better reference for this: https://docs.astral.sh/uv/guides/integration/docker/#compili...
b0a04gl
been using uv on a flask container and honestly the diff in build times is just boringly huge. not even the speed tho, it's how predictable things get. no stupid “why did pip install this version” moments. you write a pyproject.toml, freeze with uv lock, done.
>In docker you can just raw COPY pyproject.toml uv.lock* . then run uv sync --frozen --no-install-project. this skips your own app so your install layer stays cacheable. real ones know how painful it is to rebuild entire layers just cuz one package changed.
>UV_PROJECT_ENVIRONMENT=/home/python/.local bypasses venv. which means base images can be pre-warmed or shared across builds. saves infra cost silently. just flip UV_COMPILE_BYTECODE=1 and get .pyc at build.
> It kills off mutable environments. forces you to respect reproducibility. if your build is broken, it's your lockfile's fault now. accountability becomes visible
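Put together, those points might look like this Dockerfile sketch (base image, user layout, and paths are illustrative assumptions, not taken from the article):

```dockerfile
FROM python:3.13-slim

# uv binary from the official image, per uv's Docker guide.
COPY --from=ghcr.io/astral-sh/uv:latest /uv /usr/local/bin/uv

# Compile .pyc at build time; install outside a per-project venv.
ENV UV_COMPILE_BYTECODE=1 \
    UV_PROJECT_ENVIRONMENT=/home/python/.local

WORKDIR /app

# Dependency layer: rebuilt only when pyproject.toml or uv.lock change.
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen --no-install-project

# App layer: code edits no longer invalidate the dependency layer above.
COPY . .
RUN uv sync --frozen
```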
ing33k
UV just works. Easily one of the best things to happen to Python packaging in years.
cmiller1
I have software people use up on pypi and I'd love to switch to uv for personal use to benefit from the improved speed but I'd need some guarantees that things work EXACTLY how they work on pip or that I could run them concurrently. If I put instructions up for users to "just run pip install xxx" I need to know that if they see any errors I can see those too for debugging/troubleshooting.
oblvious-earth
It does not work EXACTLY how pip works, big differences are covered here: https://docs.astral.sh/uv/pip/compatibility/
Some of these are uv following the standards while pip is still migrating away from legacy behavior, some of these are design choices that uv has made, because the standard is underdefined, it's a tool specific choice, or uv decided not to follow the standards for whatever reason.
0xblinq
2025 and python packaging and dependencies management is still a mess.
incognito124
It's only a mess because not everyone has adopted uv yet (IMO)
0xblinq
It’s a mess because there’s no first party, properly thought out and working solution.
So every year we get a new “new way” to do it. Like that xkcd… this time this is the standard that will work!
greener_grass
UV looks very promising but I can assure you, if everyone adopted it tomorrow we would see a long tail of scenarios that UV does not work well for.
tempest_
I am perfectly content with a 90% solution; we should not let the perfect be the enemy of the good.
bckr
I would love to see that happen and see how astral responds. I would love to see uv get built into Python 4
soulofmischief
Python package management is a classic example of xkcd #927, and the community cannot be blamed for developing a Pavlovian response when it comes to yet-another-package-manager that promises to be the final solution.
db48x
Yep. The lesson is to get this right early in your language design. Do not punt until version 2.0. Think twice before putting the package/module/whatever metadata in an executable script. If you do decide to do that, think a third time. It works out better for some languages (like Common Lisp) than others (like Python).
ikrenji
never had a problem with dependencies. how is it a mess? you have requirements.txt and venv per project. doesn't get easier than that
ericfrederich
Yes, it's a mess (New: now with Rust!)
jdboyd
What originally convinced me to try uv was the promise of faster container builds, and it certainly delivered on that.
As someone who usually used platform Pythons, despite advice against that, uv is now what got me to finally stop doing so.
bsenftner
I'd like to see a security breakdown of uv versus pip versus conda versus whatever fashionable package manager I've not heard of yet.
Speed is okay, but security of a package manager is far more important.
Bengalilol
uv is generally more secure than pip. It resolves dependencies without executing arbitrary code, verifies package hashes by default, and avoids common risks like typosquatting and code execution during install. It's also faster and more reproducible.
https://chaitalks.tech/uv-a-modern-python-package-manager-in...
glaucon
I'd be interested to know under what circumstances pip executes arbitrary code while resolving dependencies ... how does that work ?
And while I'm here ... how does uv go about mitigating typosquatting risks? I could imagine how it might issue warnings if it notices you requesting "dlango", which would work OK for the top 10%, but are you suggesting there's some more general solution built into uv?
I did a quick search but 'typosquatting' is not an easy string to cut through.
db48x
To install a package and its dependencies, you need the list of dependencies. This metadata is not always statically available!
Python packages are often just a zip file full of py files, with one of them called 'setup.py'. Running this file installs the package (originally using [distutils](https://docs.python.org/3.9/install/index.html#install-index)). This installation may fail if dependencies are not present, but there’s no method provided for installing those dependencies. You’re supposed to read the error message, go download the source for the missing dependencies, then run their setup.py scripts to install them.
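A stripped-down sketch of why that's a code-execution problem (hypothetical package; the setuptools `setup()` call is omitted so the snippet runs standalone):

```python
# setup.py -- everything at module top level runs the moment an
# installer decides to build this source distribution.
import sys

def install_time_side_effect() -> str:
    # Could just as easily be a network call or a file write.
    return f"ran at install time on Python {sys.version_info.major}"

print(install_time_side_effect())
```

An installer resolving dependencies from a package like this has no choice but to execute the file just to learn what the dependencies are.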
un_ess
a)"Thanks to backwards compatibility, a package offered only as a source distribution and with the legacy setup.py file for configuration and metadata specification will run the code in setup.py as part of the installation." https://blog.phylum.io/python-package-installation-attacks/
b) pip now has an option _not_ to run arbitrary code by disallowing source distributions, by passing --only-binary :all:
"By default, pip does not perform any checks to protect against remote tampering and involves running arbitrary code from distributions. It is, however, possible to use pip in a manner that changes these behaviours, to provide a more secure installation mechanism." https://pip.pypa.io/en/stable/topics/secure-installs/
alexchamberlain
For a source package based on setup tools, setup.py is executed with a minimal environment and can run arbitrary code.
diggan
> security breakdown of uv versus pip versus conda versus whatever fashionable package manager
In the end, every package manager (so far at least) download and runs untrusted (unless you've verified it manually) 3rd party code. Whatever the security difference is between uv and pip implementation-wise is dwarfed compared to if you haven't found a way of handling untrusted 3rd party code yet.
kh_hk
Just generate a requirements.txt with uv, ship that with docker, and then there's no need for all this dance
It's worth noting that uv also supports a workflow that directly replaces pyenv, virtualenv and pip without mandating a change to a lockfile/pyproject.toml approach.
uv python pin <version> will create a .python-version file in the current directory.
uv virtualenv will download the version of Python specified in your .python-version file (like pyenv install) and create a virtualenv in the current directory called .venv using that version of Python (like pyenv exec python -m venv .venv)
uv pip install -r requirements.txt will behave the same as .venv/bin/pip install -r requirements.txt.
uv run <command> will run the command in the virtualenv and will also expose any env vars specified in a .env file (although be careful of precedence issues: https://github.com/astral-sh/uv/issues/9465)