Author: Nejat Hakan
Contact: nejat.hakan@outlook.de
PayPal Me: https://paypal.me/nejathakan
Package Manager uv
Introduction
Welcome to this comprehensive guide on `uv`, a relatively new and extremely fast Python package installer and virtual environment manager developed by Astral, the same organization behind the popular linter Ruff. Designed as a potential successor or alternative to tools like `pip`, `venv`, `pip-tools`, and `virtualenv`, `uv` aims to provide a significantly faster and more cohesive experience for Python developers.
Built in Rust, `uv` leverages modern asynchronous I/O and advanced caching strategies to achieve dramatic speed improvements, often being orders of magnitude faster than traditional tools, especially on projects with many dependencies or when operating on cached packages. It serves as a single binary that can both manage your project's virtual environments and handle the installation, updating, and removal of packages.
For Linux users, who often work in environments where performance and efficient resource utilization are paramount, `uv` offers a compelling proposition. Its speed can drastically reduce waiting times during development, testing, and deployment cycles. Furthermore, its compatibility with existing `requirements.txt` files and the standard `venv` structure (while providing its own creator) ensures a relatively smooth transition for those familiar with the established Python tooling ecosystem.
This guide will take you from the absolute basics of installing and using `uv` to more advanced concepts like its caching mechanisms, configuration options, and integration with modern Python project standards like `pyproject.toml`. Each section builds upon the previous one, and practical workshops are included to help solidify your understanding through hands-on experience in a Linux environment. We assume you have a basic understanding of Python development and the Linux command line. Prepare to explore how `uv` can streamline your Python workflow!
1. Installation and Setup
Before we can harness the speed and efficiency of `uv`, we need to install it on our Linux system. `uv` offers several installation methods, catering to different preferences and system configurations. Its installation process is designed to be straightforward.
Installation Methods
There are primarily three recommended ways to install `uv` on Linux:
- Using the Official Installer Script (curl/sh): This is often the simplest method for a quick setup. It downloads and executes a shell script that installs the pre-compiled `uv` binary for your architecture. The full command is `curl -LsSf https://astral.sh/uv/install.sh | sh`.
  - Explanation:
    - `curl`: A command-line tool to transfer data using various protocols (HTTP, in this case).
    - `-L`: Follow redirects. If the initial URL points to another location, `curl` will follow it.
    - `-s`: Silent mode. Suppresses the progress meter and error messages (but still shows the result).
    - `-S`: Show errors. If silent mode (`-s`) is used, this flag makes `curl` show an error message if it fails.
    - `-f`: Fail fast. Exit without outputting HTML error pages on server errors (like a 404).
    - `https://astral.sh/uv/install.sh`: The URL of the official installation script.
    - `|`: Pipe symbol. Takes the standard output of the `curl` command (the script content) and sends it as standard input to the next command.
    - `sh`: The Bourne shell (or a compatible shell like bash). Executes the script received via the pipe.
  - Security Note: Piping `curl` to `sh` executes code downloaded from the internet directly. While this is a common practice, always ensure you trust the source (`astral.sh` is the official domain). You can inspect the script first by downloading it separately with `curl -LsSf https://astral.sh/uv/install.sh -o uv_install.sh`, reviewing `uv_install.sh`, and then running `sh uv_install.sh`.
  - Installation Location: By default, this script typically installs `uv` to `$HOME/.cargo/bin` (if you have Rust/Cargo installed) or `$HOME/.local/bin`. It will usually instruct you to add this directory to your system's `PATH` environment variable if it isn't already included. You might need to restart your shell or source your profile file (e.g., `source ~/.bashrc` or `source ~/.zshrc`) for the `uv` command to become available.
- Using `pipx`: `pipx` is a tool specifically designed to install and run Python applications in isolated environments. This is an excellent way to keep `uv` and its potential (though currently minimal) dependencies separate from your global or project-specific Python environments.
  - First, ensure `pipx` is installed. You can usually install it using your system's package manager or with `python3 -m pip install --user pipx`, followed by `pipx ensurepath`. (You might need to restart your shell after running `ensurepath`.)
  - Then, install `uv` with `pipx install uv`.
  - Explanation: `pipx install uv` creates a dedicated virtual environment just for `uv`, installs `uv` into it, and adds the `uv` executable to a directory on your `PATH` (usually `~/.local/bin`), making it globally accessible without polluting your main Python installations.
- Using Cargo (Rust's Package Manager): If you are a Rust developer and already have the Rust toolchain (including `cargo`) installed, you can build and install `uv` from source via crates.io (the Rust package registry).
  - Ensure you have Rust installed. If not, visit rustup.rs.
  - Install `uv` with `cargo install uv`.
  - Explanation: `cargo install uv` downloads the `uv` source code from crates.io, compiles it locally, and places the resulting binary in `$HOME/.cargo/bin`. Make sure `$HOME/.cargo/bin` is in your `PATH`. This method ensures you get the latest released version but involves a compilation step, which might take some time initially.
- Using System Package Managers (Less Common Currently): As `uv` is relatively new, it might not be available in the official repositories of all Linux distributions yet. However, it's worth checking:
  - Arch Linux (AUR): `uv` is often available in the Arch User Repository (AUR). Use an AUR helper like `yay` or `paru`: `yay -S uv`.
  - Other Distributions: Check community repositories or third-party package sources. Availability will likely increase over time.

Choose the method that best suits your setup. For most users, the `curl | sh` script or `pipx` are the recommended starting points.
Verifying the Installation
Once installed, verify that `uv` is accessible and reports its version. Open a new terminal window (or source your shell profile) and run `uv --version`. You should see the installed version printed (the exact number will vary with releases).
If you get a "command not found" error, double-check that the installation directory (`$HOME/.local/bin` or `$HOME/.cargo/bin`) is included in your `PATH` environment variable (`echo $PATH`) and that you've restarted your shell or sourced the appropriate profile file (`.bashrc`, `.zshrc`, `.profile`, etc.).
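A guarded version of the check, which prints a hint instead of failing when `uv` is absent:

```bash
# Verify uv is installed and reachable via PATH.
if command -v uv >/dev/null 2>&1; then
    uv --version   # prints the installed version string
    which uv       # shows which binary will be executed
else
    echo "uv not found: is ~/.local/bin (or ~/.cargo/bin) in PATH?"
fi
```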
Initial Configuration
One of `uv`'s design goals is to require minimal configuration. It generally works out-of-the-box. Key aspects like the cache directory are automatically determined:
- Cache Directory: `uv` uses a central cache to store downloaded packages (wheels), build artifacts, and package metadata. On Linux, this typically defaults to `$HOME/.cache/uv`. We will explore the cache in more detail later.
- Python Interpreter Discovery: When creating virtual environments, `uv` will attempt to find available Python interpreters on your system. You can explicitly specify a Python interpreter if needed.
For now, no specific configuration steps are required after a successful installation and verification.
Workshop: Installing and Verifying uv
Goal: Install `uv` using two different methods on your Linux system and confirm the installation.
Scenario: You are setting up your development environment and want to install `uv`. You'll first try the recommended script and then, for practice, install it via `pipx` (after removing the first installation).
Steps:
- Method 1: Install using `curl | sh`
  - Open your Linux terminal.
  - Execute `curl -LsSf https://astral.sh/uv/install.sh | sh`.
  - Observe the output. It might tell you which directory `uv` was installed into and suggest adding it to your `PATH`.
  - If prompted, add the directory to your `PATH`. For example, if it suggests adding `$HOME/.local/bin`, edit your `~/.bashrc` or `~/.zshrc` file and add the line `export PATH="/home/your_user/.local/bin:$PATH"`. (Remember to replace `/home/your_user/.local/bin` with the actual path if it differs.)
  - Reload your shell configuration with `source ~/.bashrc` (or `source ~/.zshrc`). Alternatively, close and reopen your terminal.
  - Verify the installation with `uv --version` and `which uv`.
  - Note the reported version and the path returned by `which uv`.
- Cleanup (Optional but recommended for practice):
  - Find where `uv` was installed (using the output of `which uv` from the previous step).
  - Remove the `uv` binary. For example, if it was in `$HOME/.local/bin/uv`, run `rm $HOME/.local/bin/uv`.
  - Verify it's removed: `uv --version` should now fail with "command not found".
- Method 2: Install using `pipx`
  - Ensure `pipx` is installed. If not, install it with your distribution's package manager or `python3 -m pip install --user pipx`, then run `pipx ensurepath`.
  - Verify the `pipx` installation with `pipx --version`.
  - Install `uv` with `pipx install uv`.
  - Observe the output. `pipx` will confirm the installation and mention that `uv` is now available.
  - Verify the `uv` installation again with `uv --version` and `which uv`.
  - Compare the path returned by `which uv` now. It should point to the `pipx` binaries directory (often `~/.local/bin`). Compare the `uv` version with the one installed previously (they might be the same or slightly different depending on release timing).
Outcome: You have successfully installed `uv` using two common methods, understand how to verify the installation, and know how to ensure the `uv` command is available in your shell's `PATH`. You are now ready to explore `uv`'s core concepts.
2. Core Concepts
Understanding the fundamental ideas behind `uv` is crucial for using it effectively. `uv` isn't just a faster `pip`; it integrates package installation and virtual environment management into a single, cohesive tool.
Dual Role: Installer and Environment Manager
Traditionally, Python developers use separate tools for these tasks:
- Package Installation: `pip` is the standard tool for installing packages from the Python Package Index (PyPI) or other sources.
- Virtual Environment Management: `venv` (built into Python 3.3+) or `virtualenv` (a third-party package) are used to create isolated Python environments.

`uv` combines these roles under one command-line interface:
- `uv pip ...`: This subcommand mirrors the functionality of `pip`. You use `uv pip install`, `uv pip uninstall`, `uv pip list`, `uv pip freeze`, etc., just as you would with standalone `pip`. The key difference is the underlying implementation, which is significantly faster.
- `uv venv ...`: This subcommand handles the creation of virtual environments, similar to `python -m venv` or `virtualenv`. It creates an isolated directory containing a specific Python interpreter and a site-packages directory.

This integration simplifies the workflow by requiring knowledge of only one tool and command structure for these common operations.
Comparison with `pip` and `venv`
Let's see how common commands map:

| Task | `pip` / `venv` Command | `uv` Command | Notes |
|---|---|---|---|
| Install a package | `pip install <package>` | `uv pip install <package>` | `uv` is generally much faster. |
| Install from file | `pip install -r requirements.txt` | `uv pip install -r reqs.txt` | `uv` excels at resolving complex dependencies quickly. |
| List installed packages | `pip list` | `uv pip list` | Output format is similar. |
| Show package details | `pip show <package>` | `uv pip show <package>` | Output format is similar. |
| Uninstall a package | `pip uninstall <package>` | `uv pip uninstall <package>` | Functionally identical. |
| Freeze dependencies | `pip freeze > requirements.txt` | `uv pip freeze > reqs.txt` | Captures installed packages. |
| Create virtual env | `python -m venv .venv` | `uv venv .venv` | `uv` is significantly faster. |
| Create env with specific Py | `pythonX.Y -m venv .venv` | `uv venv .venv --python X.Y` | `uv` offers convenient discovery. |
| Activate virtual env | `source .venv/bin/activate` | `source .venv/bin/activate` | Activation mechanism remains the same. |
The primary advantages `uv` brings are:
- Speed: Dramatically faster installation and environment creation due to Rust, async I/O, and aggressive caching.
- Unified Interface: One tool for both core tasks.
- Advanced Resolver: A modern, fast dependency resolver capable of handling complex scenarios efficiently.

The `uv` Cache
Performance is a cornerstone of `uv`, and its caching mechanism is key to achieving this. When you install packages, `uv` downloads and potentially builds them. To avoid redundant work, it stores various artifacts in a central cache directory.
- Location: On Linux, the default cache location is `~/.cache/uv`. You can find the exact location by running `uv cache dir`.
- Contents: The cache stores:
  - Downloaded package files (wheels `.whl` and source distributions `.tar.gz`).
  - Metadata about packages fetched from indexes like PyPI.
  - Built wheels (if a package had to be built from a source distribution).
  - Registry index information.
- Benefits:
  - Faster Re-installs: If you install the same version of a package in a different virtual environment (or reinstall it in the same one), `uv` can often use the cached artifact directly instead of downloading or rebuilding it.
  - Offline Installs: If all required packages and their dependencies are already cached, installations can potentially work offline (though metadata checks might still occur).
  - Reduced Network Usage: Avoids re-downloading large packages frequently.

Understanding the cache helps explain `uv`'s speed, especially after the first time you install a particular package. We will delve deeper into cache management in the Advanced section.
Workshop: Exploring Environments and the Cache
Goal: Create a virtual environment using `uv`, activate it, compare its structure to a standard `venv`, and inspect the `uv` cache.
Scenario: You're starting a new small utility project and want to set up its environment using `uv`.
Steps:
- Create a Project Directory: Make a new directory for the project and `cd` into it.
- Create a Virtual Environment with `uv`: Run `uv venv .venv`.
  - Observe the output. Note how quickly the environment is created compared to `python -m venv .venv`.
  - The command creates a directory named `.venv` (a common convention) in your current project directory.
- Inspect the `uv` Environment Structure: Look inside the `.venv` directory (e.g., with `ls -la .venv`). You'll find familiar subdirectories:
  - `bin/`: Contains activation scripts (`activate`, `activate.csh`, `activate.fish`) and executables linked to the environment's Python interpreter (including `python`, `pip`, and potentially `uv` itself if specified or found).
  - `lib/`: Contains a directory named like `pythonX.Y` (e.g., `python3.11`) which holds the `site-packages` directory where installed packages will reside.
  - `pyvenv.cfg`: A configuration file specifying details about the environment, like the path to the base Python interpreter.
  - Question: Does this structure look significantly different from one created by `python -m venv`? (Answer: No, `uv venv` aims for compatibility with the standard `venv` structure.)
- Activate the Environment: Run `source .venv/bin/activate`.
  - Your shell prompt should change, likely prepending `(.venv)`, indicating the environment is active.
  - Verify which Python interpreter is being used with `which python`. (This should point to the Python interpreter inside `.venv/bin/python`.)
- Check the `uv` Cache Location: Run `uv cache dir`.
  - Note the path displayed. This is where `uv` stores its cached data.
- Explore the Cache Directory (Before Installing Anything): List its contents, e.g., with `ls ~/.cache/uv`.
  - You might see subdirectories like `archive`, `git`, `index`, `registry`, `simple`. At this stage, they might be empty or contain minimal index information.
- Install a Package (covered in detail next, but useful for cache inspection): Run `uv pip install requests`.
- Re-explore the Cache Directory: List its contents again.
  - Now you should see more content, particularly in the `archive` directory, related to `requests` and its dependencies (like `charset-normalizer`, `idna`, `urllib3`, `certifi`).
- Deactivate the Environment: Run `deactivate`.
  - Your shell prompt should return to normal.
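The whole workshop, condensed into a guarded sketch (the project name `uv-cache-demo` is made up; the script is a no-op if `uv` is not installed):

```bash
if ! command -v uv >/dev/null 2>&1; then
    echo "uv not installed; see Section 1"
else
    mkdir -p uv-cache-demo && cd uv-cache-demo
    uv venv .venv                 # create the environment
    ls .venv/bin                  # activation scripts, python link
    . .venv/bin/activate          # activate
    which python                  # resolves inside .venv/bin
    uv cache dir                  # cache location
    uv pip install requests       # populates the cache on first install
    deactivate
    cd ..
fi
```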
Outcome: You have successfully created a virtual environment using `uv venv`, activated it, confirmed its standard structure, located the `uv` cache directory, and observed how the cache starts getting populated when packages are installed. You now understand the basic operational model of `uv`.
3. Basic Package Management
Now that we have `uv` installed and understand its core concepts, let's dive into the fundamental package management tasks using the `uv pip` subcommand family. These commands are designed to be familiar to users of `pip`, but executed with `uv`'s speed and efficiency. Remember to always activate your virtual environment (`source .venv/bin/activate`) before managing packages for a specific project.
Installing Packages
The most common task is installing packages from the Python Package Index (PyPI).
- Install the latest version:
  ```bash
  # Ensure your virtual environment is active
  source .venv/bin/activate
  # Install the 'requests' library
  uv pip install requests
  ```
  `uv` will resolve the dependencies for `requests`, download the necessary files (checking the cache first), and install them into your virtual environment's `site-packages` directory. You'll notice this process is significantly faster than standard `pip`, especially for packages with many dependencies or when artifacts are already cached.
- Install multiple packages at once: Simply list the packages you want to install, separated by spaces (e.g., `uv pip install requests rich`). `uv` will resolve the dependencies for all requested packages together.
Installing Specific Package Versions
Often, you need to install a specific version of a package for compatibility reasons or reproducible builds.
- Exact version: Use `==` (e.g., `"package==1.2.3"`).
- Minimum version: Use `>=` (e.g., `"package>=1.2"`).
- Compatible version (PEP 440): Use `~=` (e.g., `~=1.2.3` means `>=1.2.3` and `<1.3.0`).
- Version range: Use multiple comma-separated specifiers (e.g., `"package>=1.2,<2.0"`).
(Note: Quotes are recommended, especially when using comparison operators, to prevent the shell from interpreting them.)
`uv`'s resolver will find versions that satisfy these constraints.
Installing from `requirements.txt`
Managing dependencies for a project typically involves listing them in a requirements file (commonly `requirements.txt`). `uv` fully supports installing dependencies from these files.
- Create a `requirements.txt` file in your project directory.
- Install all packages listed in the file with `uv pip install -r requirements.txt`. `uv` reads the file, resolves all specified dependencies, and installs them. This is the standard way to set up a project's environment based on its declared dependencies. You can use the `-r` flag multiple times to install from several files.
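For example, a small `requirements.txt` might contain (versions are illustrative):

```
requests==2.31.0
rich>=13.0
```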
Listing Installed Packages
To see which packages (and their versions) are currently installed in your active virtual environment, run `uv pip list`. This outputs a list similar to `pip list`, showing each package name and its installed version.
Showing Package Details
To get more information about a specific installed package, including its dependencies, location, and author, run `uv pip show <package>`. This is helpful for understanding a package's requirements or finding its installation path within the virtual environment.
Uninstalling Packages
To remove a package from your virtual environment, run `uv pip uninstall <package>`. `uv` will ask for confirmation (unless you use the `-y` flag) and then remove the specified package. Note that `uv` (like `pip`) typically does not automatically uninstall dependencies that are no longer needed by other packages. Managing unused dependencies often requires tools like `pip-autoremove` or careful use of `uv pip sync` (covered later).
You can also uninstall packages listed in a requirements file with `uv pip uninstall -r requirements.txt`.
Workshop: Setting Up a Simple Project
Goal: Create a small project, initialize its environment with `uv`, install some packages, pin dependencies to a `requirements.txt`, and practice basic package operations.
Scenario: You are creating a simple command-line tool that fetches data from an API using `requests` and displays it nicely using `rich`.
Steps:
- Setup Project Directory and Environment: Create a project directory, `cd` into it, create an environment with `uv venv .venv`, and activate it with `source .venv/bin/activate`.
- Install Initial Packages:
  - Install the latest versions of `requests` and `rich`: `uv pip install requests rich`.
  - Observe the installation process. Note the speed.
- List Installed Packages:
  - Check what was installed (including dependencies) with `uv pip list`.
  - You should see `requests`, `rich`, and their dependencies (like `urllib3`, `certifi`, `markdown-it-py`, `pygments`, etc.).
- Inspect a Package:
  - Get more details about the `requests` package with `uv pip show requests`.
  - Note its version, location (`.../.venv/lib/pythonX.Y/site-packages`), and its listed requirements.
- Pin Dependencies to `requirements.txt`:
  - Use `uv pip freeze > requirements.txt` to generate a list of all installed packages with their exact versions and save it to `requirements.txt`.
  - Inspect the contents of the file with `cat requirements.txt`.
  - You'll see lines like `requests==X.Y.Z`, `rich==A.B.C`, and entries for all dependencies. This file precisely records the state of your environment, making it reproducible.
- Simulate Recreating the Environment:
  - First, uninstall one of the main packages, for example `rich`: `uv pip uninstall rich`.
  - Verify it's gone with `uv pip list`.
  - Now, reinstall everything exactly as specified in `requirements.txt`: `uv pip install -r requirements.txt`.
  - Verify `rich` is back with `uv pip list`.
  - Notice that this install should be extremely fast, likely using cached artifacts for all packages.
- Cleanup: Deactivate the environment (`deactivate`) and remove the project directory if you no longer need it.
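The full workshop as one guarded script (the project name `fetch-tool` is invented for illustration):

```bash
if ! command -v uv >/dev/null 2>&1; then
    echo "uv not installed; see Section 1"
else
    mkdir -p fetch-tool && cd fetch-tool
    uv venv .venv && . .venv/bin/activate   # setup
    uv pip install requests rich            # install
    uv pip list                             # list
    uv pip show requests                    # inspect
    uv pip freeze > requirements.txt        # pin
    uv pip uninstall -y rich                # remove one package...
    uv pip install -r requirements.txt      # ...and restore it from the pins
    deactivate && cd ..                     # cleanup
fi
```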
Outcome: You have successfully used `uv pip` commands to install packages by name and from a requirements file, list installed packages, inspect package details, generate a pinned dependency file (`requirements.txt`) using `freeze`, and uninstall packages. You've experienced how `uv` handles these fundamental tasks efficiently.
4. Virtual Environment Management
While `uv pip` handles package installation, `uv venv` is its counterpart for creating and managing the isolated Python environments where those packages live. As we saw earlier, `uv venv` provides a significantly faster alternative to Python's built-in `venv` module.
Creating Environments
The basic command, `uv venv .venv`, creates a virtual environment in the specified directory (conventionally `.venv`).
If the `.venv` directory doesn't exist, `uv` will create it. If it already exists, `uv` might refuse to overwrite it unless specific flags are used (this is a safety measure).
- Specifying a Path: You can provide any path for the environment.
- Speed: The creation process is noticeably faster than `python -m venv` because `uv` avoids copying the standard library, uses symlinks or other efficient methods where possible, and leverages its Rust implementation.
Specifying Python Interpreters
By default, `uv venv` attempts to create the environment using the same Python interpreter that `uv` itself was installed with, or the `python`/`python3` found first in your `PATH`. However, you often need to create environments using a specific Python version installed on your system.
The `--python` (or `-p`) flag allows you to specify which Python interpreter to use:
- By Version: e.g., `--python 3.11`. `uv` will search common locations on your system (`PATH`, and potentially locations managed by tools like `pyenv` or `asdf`) for an interpreter matching `python3.11`.
- By Full Path: e.g., `--python /usr/bin/python3.11`.
- By Name: e.g., `--python python3.11`.
If `uv` cannot find a matching interpreter, it will report an error. This feature is incredibly useful for ensuring your project runs against the intended Python version. You can see which Pythons `uv` finds using `uv python find`.
Activating and Deactivating Environments
`uv venv` creates environments with the standard activation scripts. Activating and deactivating works exactly the same way as with environments created by `venv` or `virtualenv`.
- Activate (Bash/Zsh): `source .venv/bin/activate`
- Activate (Fish): `source .venv/bin/activate.fish`
- Activate (Csh/Tcsh): `source .venv/bin/activate.csh`
- Deactivate (any shell): `deactivate`
Activation modifies your shell's `PATH` and other environment variables so that commands like `python` and `pip` (or `uv pip`) refer to the versions inside the virtual environment.
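Activation behaves identically for `venv`- and `uv`-created environments; this self-contained sketch uses the stdlib creator so it runs anywhere (swap in `uv venv demo-env` freely):

```bash
python3 -m venv --without-pip demo-env   # stand-in for: uv venv demo-env
. demo-env/bin/activate                  # Bash/Zsh ('source' works too)
echo "$VIRTUAL_ENV"                      # absolute path of the active env
command -v python                        # now resolves inside demo-env/bin
deactivate                               # restores the previous PATH
rm -rf demo-env                          # clean up
```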
Differences from `python -m venv`
While `uv venv` creates environments with a compatible structure, there are key differences:
- Speed: As mentioned, `uv venv` is significantly faster.
- Seeding: By default, `uv venv` does not install `pip` or `setuptools` into the new environment. It assumes you will use `uv pip` to manage packages within that environment, which makes environment creation even faster and lighter. You can explicitly request seeding using flags like `--seed`. If you create an environment without seeding and then activate it, the standard `pip` command might not be available initially (though `uv pip install ...` will still work perfectly).
- Python Discovery: `uv venv` has more sophisticated built-in logic (the `--python` flag) for finding different Python versions than the basic `pythonX.Y -m venv` approach.
Managing Multiple Environments
You can easily manage multiple virtual environments for different projects or even for the same project (e.g., testing against different Python versions).
- Different Projects: Simply create a `.venv` inside each project directory.
- Same Project, Different Pythons: Use distinct names or locations, such as `.venv-310` and `.venv-311`.
Tools like `direnv` can automate the activation/deactivation process when you `cd` into different project directories.
Workshop: Multi-Python Environment Setup
Goal: Create two different virtual environments for the same project, targeting different Python versions available on your system. Install different package versions in each and practice switching between them.
Scenario: You need to ensure your project works correctly with both Python 3.10 and Python 3.11 (assuming both are installed on your Linux system).
Prerequisites: You need at least two different minor versions of Python 3 installed (e.g., 3.10 and 3.11). You can check available versions with commands like `ls /usr/bin/python*` or using version managers like `pyenv`. If you only have one, you can try creating environments with and without the `--seed` flag to observe the difference in installed tools. For this workshop, we'll assume you have `python3.10` and `python3.11`.
Steps:
- Create Project Directory: Make a directory for the project and `cd` into it.
- Create Environment for Python 3.10:
  - Use `uv venv` with the `--python` flag: `uv venv .venv-310 --python 3.10`.
  - If `uv` cannot find Python 3.10 automatically, you might need to provide the full path (e.g., `uv venv .venv-310 --python /usr/bin/python3.10`). Check `uv python find` for discovered interpreters.
- Create Environment for Python 3.11:
  - Similarly, create an environment for Python 3.11: `uv venv .venv-311 --python 3.11`. (Again, adjust the `--python` value if needed based on your system.)
- Inspect Environments (Optional):
  - Check the Python versions linked within each environment, e.g., `.venv-310/bin/python --version` and `.venv-311/bin/python --version`.
  - Confirm they point to the correct base interpreters.
- Work in the Python 3.10 Environment:
  - Activate the 3.10 environment: `source .venv-310/bin/activate`.
  - Verify the active Python version: `python --version`.
  - Install a specific, older version of a package (an older Flask release, for instance).
  - Check installed packages: `uv pip list`.
- Switch to the Python 3.11 Environment:
  - Deactivate the current environment: `deactivate`.
  - Activate the 3.11 environment: `source .venv-311/bin/activate`.
  - Verify the active Python version: `python --version`.
  - Install a newer version of the same package.
  - Check installed packages in this environment: `uv pip list`.
  - Notice that the package list (and Flask version) is different from the `.venv-310` environment, demonstrating their isolation.
- Cleanup: Deactivate, then remove the project directory (or just the `.venv-*` directories) when done.
Outcome: You have successfully used `uv venv` to create multiple, isolated virtual environments targeted at specific Python interpreters. You practiced activating, deactivating, installing different package versions in each, and switching between environments, reinforcing the concept of environment isolation and `uv`'s ability to manage specific Python versions.
5. Dependency Resolution
One of the most significant advancements `uv` offers over traditional `pip` (especially older versions) is its extremely fast and robust dependency resolver. Understanding how `uv` handles dependencies is key to appreciating its power, particularly in complex projects.
The Challenge of Dependency Resolution
Python packages often depend on other packages, which in turn may have their own dependencies, forming a potentially complex graph. Dependency resolution is the process of finding a set of specific package versions that satisfies all the requirements of your project and all its dependencies, without conflicts.
Conflicts arise when different packages require incompatible versions of the same shared dependency. For example:
- `package-A` requires `common-lib>=1.0,<2.0`
- `package-B` requires `common-lib>=1.5,<1.8`
- `package-C` requires `common-lib==1.9`
A resolver needs to find a version of `common-lib` that satisfies all constraints, or report an error if no solution exists (as here: `package-C`'s pin of `1.9` falls outside `package-B`'s `<1.8` bound). Older versions of `pip` used a simple backtracking algorithm that could be very slow and sometimes chose sub-optimal or inconsistent solutions, especially in large environments.
`uv`'s High-Performance Resolver
`uv` employs a modern dependency resolver, likely inspired by algorithms used in tools like Cargo (Rust) or PubGrub (Dart). Key characteristics include:
- Speed: Written in Rust and optimized for performance, it resolves complex dependency graphs much faster than `pip`'s legacy resolver, and in many cases faster than its newer resolver as well. This is achieved through efficient data structures and algorithms, and by leveraging cached package metadata.
- Accuracy: It aims to find a valid set of dependencies whenever one exists according to the specified constraints.
- Conflict Reporting: When conflicts are detected (i.e., no single version set can satisfy all requirements), `uv` provides clearer and more informative error messages, helping you pinpoint the source of the incompatibility much faster than traditional tools often did.
Understanding Resolution Output
When you run `uv pip install ...`, `uv` performs resolution implicitly. You can see this in action:

```bash
# Activate environment first
# source .venv/bin/activate

# Example: Install 'apache-airflow', known for many dependencies
uv pip install apache-airflow
```

Observe the output. `uv` will typically show steps like:
- `Resolved N packages in Xs` (where N is the number of packages and X is the time taken): the core resolution step.
- `Downloaded N packages in Xs`: fetching required packages (potentially hitting the cache).
- `Installed N packages in Xs`: placing packages into the environment.
The speed of the "Resolved" step is where `uv` truly shines compared to `pip`. If there are conflicts, the output will detail which packages have incompatible requirements.
Lock Files vs. Pinned Requirements (`uv pip compile`)
While `uv` excels at installing dependencies based on abstract requirements (like `requests>=2.0`), ensuring reproducible builds often requires pinning down the exact versions of all packages (including indirect dependencies).
- Traditional Lock Files: Tools like Poetry and PDM use dedicated `poetry.lock` or `pdm.lock` files. These files record the exact versions, hashes, and dependency tree determined by the resolver. Installing from a lock file guarantees the same environment every time. `pip-tools` provides `pip-compile`, which generates a fully pinned `requirements.txt` from an abstract `requirements.in`.
- `uv`'s Approach (`uv pip compile`): `uv` adopts an approach similar to `pip-tools`. It provides the `uv pip compile` command to take abstract requirements (e.g., from a `requirements.in` or `pyproject.toml` file) and generate a fully pinned `requirements.txt` file.
How uv pip compile
Works:
- Input: Takes one or more input files containing abstract requirements (e.g.,
requirements.in
,pyproject.toml
). - Resolution: Uses its fast resolver to find a compatible set of specific versions for all direct and indirect dependencies.
- Output: Generates a
requirements.txt
file (by default, or use-o <output_file>
) containing pinned versions of all packages, often including comments indicating dependencies and hashes for security/integrity. The resultingrequirements.txt
might look like:# # This file was generated by uv pip compile # from requirements in requirements.in # click==8.1.7 # via flask colorama==0.4.6 # via flask flask==3.0.0 # via requirements.in itsdangerous==2.1.2 # via flask jinja2==3.1.2 # via flask markupsafe==2.1.3 # via jinja2 # ... (requests and its dependencies) ... requests==2.31.0 # via # requirements.in # ... etc ...
- Installation: You then install from this generated, fully pinned `requirements.txt` for reproducible environments.

This `compile` step separates the process of finding compatible versions from the process of installing them, providing control and reproducibility.
Workshop: Resolving Dependencies with `uv pip compile`

Goal: Use `uv pip compile` to generate a pinned requirements file from abstract requirements, demonstrating `uv`'s resolution capabilities.

Scenario: Your project requires `fastapi` and an older version of `sqlalchemy`, which might have overlapping dependencies. You want to generate a reliable, pinned `requirements.txt`.
Steps:
1. Setup Project Directory and Environment:
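The setup commands were lost in this copy of the guide; a minimal sketch (the directory name is illustrative) would be:

```shell
mkdir compile_workshop && cd compile_workshop
uv venv .venv
source .venv/bin/activate
```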
2. Create `requirements.in`:
   - Create a file named `requirements.in` with the following abstract dependencies. These packages have their own complex dependency trees.
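The file's contents are missing in this copy. Based on the scenario (`fastapi` plus an older `sqlalchemy`, which the later inspection step pins as `sqlalchemy==1.4.x`), a plausible `requirements.in` would be:

```
fastapi
sqlalchemy<2.0
```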
3. Run `uv pip compile`:
   - Use `uv pip compile` to resolve these dependencies and generate `requirements.txt`. Add the `--generate-hashes` flag for extra security.
   - Observe the output. `uv` will show the resolution process. Note the speed.
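The command itself was lost in extraction; given the flags named above, it would be:

```shell
uv pip compile requirements.in -o requirements.txt --generate-hashes
```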
4. Examine the Generated `requirements.txt`:
   - Open and inspect the `requirements.txt` file.
   - Notice several things:
     - It contains many more packages than just `fastapi` and `sqlalchemy`. These are the transitive dependencies.
     - Every package has an exact version specified (e.g., `sqlalchemy==1.4.x`).
     - There are comments indicating which top-level requirement caused each package to be included.
     - Each package entry includes hashes (`--hash=sha256:...`). Installing with hashes ensures the downloaded package file hasn't been tampered with.
5. Install from the Pinned File:
   - Now, install the dependencies using the generated file.
   - This installation should be very fast and deterministic because `uv` knows exactly which versions and files to get (potentially from cache).
6. Simulate a Conflict (Optional):
   - Modify `requirements.in` to introduce a conflict.
   - Try compiling again: `uv` should now fail and produce an error message explaining the conflict (e.g., `fastapi` requires `pydantic>=1.10` but you specified `pydantic==1.9.0`). Analyze the error message; `uv` usually provides good context on conflicting requirements.
   - Revert `requirements.in` to the working version.
7. Cleanup:
Outcome: You have successfully used `uv pip compile` to resolve dependencies from an abstract `requirements.in` file into a fully pinned `requirements.txt`, complete with hashes. You've seen how this provides reproducible environments and how `uv` reports resolution conflicts.
6. Working with Requirements Files

While `uv pip install -r requirements.txt` and `uv pip compile` are fundamental, `uv` offers more advanced ways to interact with requirements files, enhancing reproducibility and development workflows, particularly through the `sync` command and support for various requirement specifiers.

Advanced `requirements.txt` Syntax

`uv` supports the standard `pip` requirements file format, including several useful features:
- Hashes: As seen with `uv pip compile --generate-hashes`, you can include expected hashes for packages. If the downloaded file's hash doesn't match any of the provided ones, `uv` will raise an error. This prevents installing compromised or unexpected package versions. `uv` strongly encourages using hashes.
Environment Markers: You can specify that a requirement should only be installed under certain conditions (Python version, OS, etc.):
# Only install 'pywin32' on Windows pywin32 >= 1.0 ; sys_platform == 'win32' # Install 'typing-extensions' only on Python versions less than 3.8 typing-extensions >= 4.0 ; python_version < '3.8'
uv
correctly evaluates these markers during resolution and installation. -
Editable Installs (
-e
): This is crucial for developing local packages. It installs a package directly from your project's source code directory in a way that changes you make to the code are immediately reflected in the environment (without needing reinstallation).- Typically used with a
.
to refer to the current directory, assuming it contains apyproject.toml
orsetup.py
: - When you run
uv pip install -r requirements.txt
(orsync
),uv
will set up a link to your project's source, allowing development.
- Typically used with a
-
Version Control System (VCS) Links: Install directly from Git, Mercurial, etc.:
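The example lines for these two specifiers were lost here; in a requirements file they typically look like the following (the repository URL and tag are illustrative):

```
-e .
flask @ git+https://github.com/pallets/flask.git@3.0.0
```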
Synchronizing Environments with `uv pip sync`

While `uv pip install -r requirements.txt` ensures that all packages listed in the file are present, it does not remove packages that are already in the environment but not listed in the file. This can lead to environment drift, where unused or outdated packages accumulate.
The `uv pip sync` command solves this: it makes the environment exactly match the contents of one or more requirements files.

- Usage:
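The usage example was lost in this copy; the basic invocation is:

```shell
uv pip sync requirements.txt
```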
- Behavior:
  - `uv` reads the specified requirements file(s).
  - It compares the list of required packages (and their exact versions if pinned) against the currently installed packages in the active virtual environment.
  - It installs any missing packages.
  - It upgrades or downgrades any packages that are installed but at the wrong version.
  - Crucially, it uninstalls any packages currently in the environment that are not required by the specifications in the requirements file(s).
- Benefit: `sync` guarantees that your environment contains only what's specified, providing strict reproducibility and preventing issues caused by leftover packages. It's often the preferred command for setting up environments in CI/CD pipelines or for ensuring consistency across developer machines, especially when used with a fully pinned `requirements.txt` generated by `uv pip compile`.
Comparing `install -r`, `sync`, and `freeze`

- `uv pip install -r file.txt`: Ensures the packages in `file.txt` are installed. Doesn't touch unrelated packages already present. Good for adding dependencies or ensuring minimum versions.
- `uv pip sync file.txt`: Makes the environment exactly match `file.txt`. Installs missing packages, updates incorrect versions, and removes extraneous packages. Best for reproducibility with pinned requirements.
- `uv pip freeze`: Outputs a list of currently installed packages in the environment, usually with pinned versions. Useful for capturing the current state, often piped into a `requirements.txt` file (though `uv pip compile` is generally preferred for generating pinned files from abstract requirements).
Workshop: Editable Installs and Environment Synchronization

Goal: Create a simple local Python package, install it in editable mode along with other dependencies using `uv pip compile` and `uv pip sync`, and observe how `sync` manages the environment.

Scenario: You are developing a small utility library (`mylib`) and an application (`myapp`) that uses it. You want to manage dependencies using `requirements.in`, generate a pinned `requirements.txt`, install `mylib` editably, and ensure the environment is kept clean using `sync`.
Steps:
1. Project Setup:
   - Create the main project directory and the structure for your library:

   ```shell
   mkdir sync_project
   cd sync_project
   mkdir mylib
   touch mylib/__init__.py
   touch mylib/utils.py

   # Create a basic pyproject.toml for the library
   cat << EOF > mylib/pyproject.toml
   [project]
   name = "mylib"
   version = "0.1.0"

   [build-system]
   requires = ["setuptools>=61.0"]
   build-backend = "setuptools.build_meta"
   EOF

   # Add a simple function to the library
   echo "def hello(): return 'Hello from mylib!'" > mylib/utils.py
   ```
2. Create Application Code (Optional but illustrative):
   - Create a simple app file that uses the library:
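The file body is missing in this copy; a minimal `app.py` consistent with the scenario (it uses `mylib` plus the `requests` dependency listed in the requirements below) could be:

```shell
cat << 'EOF' > app.py
from mylib.utils import hello
import requests

print(hello())
print("requests version:", requests.__version__)
EOF
```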
3. Set up Virtual Environment:
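The commands were lost here; the standard setup used throughout this guide is:

```shell
uv venv .venv
source .venv/bin/activate
```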
4. Create `requirements.in`:
   - List the application's direct dependencies, including the editable local library:
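The listing itself is missing; given that the compiled output discussed below contains pinned `requests` plus the preserved `-e ./mylib` line, the file would be:

```
requests
-e ./mylib
```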
5. Compile Requirements:
   - Generate the pinned `requirements.txt`, including hashes:
   - Examine `requirements.txt`. Notice that it includes `requests` and its dependencies pinned, but the line for `mylib` remains `-e ./mylib`; `compile` preserves editable requirements.
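The compile command for this step would be:

```shell
uv pip compile requirements.in -o requirements.txt --generate-hashes
```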
6. Synchronize the Environment:
   - Use `uv pip sync` to install exactly what's specified:
   - Verify the installation:
   - Test the application:
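The commands for these three sub-steps were lost; they would be along these lines:

```shell
uv pip sync requirements.txt

# Verify the installation
uv pip list

# Test the application
python app.py
```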
7. Modify the Library and Observe Editable Install:
   - Edit the library code:
   - Run the application again without reinstalling:
8. Simulate Environment Drift and Correct with `sync`:
   - Manually install an extra package not in `requirements.txt`:
   - Now, run `sync` again. This command ensures the environment matches only what's in `requirements.txt`.
   - Observe the output. `uv` should report that it's uninstalling `rich`.
   - Verify that `rich` is gone:
9. Cleanup:
Outcome: You have successfully set up a project with a local editable package, managed dependencies using `requirements.in` and `uv pip compile`, installed them reliably using `uv pip sync`, observed the effect of editable installs, and used `sync` to remove extraneous packages, ensuring a clean and reproducible environment.
7. Caching Mechanisms

As highlighted earlier, `uv`'s remarkable speed is heavily influenced by its sophisticated caching system. Understanding how this cache works, what it stores, and how to manage it can help you maximize `uv`'s benefits and troubleshoot potential issues.

Detailed Caching Strategy

`uv` doesn't just cache downloaded files; it caches various artifacts generated throughout the package resolution and installation process. The goal is to minimize redundant work, network access, and computation.
- Cache Directory Structure: Located by default at `~/.cache/uv` on Linux (confirm with `uv cache dir`), the cache typically contains subdirectories like:
  - `registry`: Stores cached responses from package index servers (like PyPI). This speeds up metadata fetching for dependency resolution.
  - `index`: Caches the package index itself, reducing the need to query PyPI repeatedly for available package versions.
  - `archive`: Stores downloaded package files, primarily wheels (`.whl`) and source distributions (`.tar.gz`, `.zip`). This is the most direct saving on network bandwidth.
  - `build`: If a package only has a source distribution available and needs to be built locally into a wheel, `uv` caches the resulting built wheel here. Subsequent installs of the same version (even in different virtual environments) can reuse this built wheel, avoiding potentially lengthy compilation steps (e.g., for packages with C extensions like `numpy` or `cryptography`).
  - `git`: Caches clones or checkouts of repositories specified via VCS requirements (e.g., `git+https://...`).
- Cache Keys: `uv` uses robust hashing and content addressing to determine when cached items are valid. This generally involves hashing package files, URLs, build contexts, and relevant environment factors (like the Python version) to ensure that a cached artifact is appropriate for the current request.
How Caching Speeds Up Installs and Builds

- Metadata Resolution: Cached index and registry data allow `uv` to perform dependency resolution much faster, often without needing extensive communication with PyPI, especially for frequently requested packages.
- Downloads: If a specific version of a wheel or sdist is already in the `archive` cache, `uv` uses the local copy directly, skipping the download entirely.
- Builds: This is a major performance win. If `uv` needs `numpy-1.26.4` for Python 3.11 on x86_64 Linux, and it has previously built this specific combination from source, it retrieves the wheel from the `build` cache. This avoids recompiling complex C/Fortran code, saving significant time (potentially minutes).

The benefits accumulate significantly in CI/CD environments or when developers frequently create or update virtual environments.
Managing the Cache

`uv` provides commands to inspect and manage its cache:

- `uv cache dir`: Prints the path to the cache directory.
- `uv cache clean`: Removes cached data. You can specify package names to clear their specific entries, or run it without arguments to clear the entire cache (use with caution).

  ```shell
  # Clear cache entries related to the 'requests' package
  uv cache clean requests

  # Clear cache entries for 'requests' and 'flask'
  uv cache clean requests flask

  # Clear the ENTIRE uv cache (downloads, builds, metadata, etc.)
  # This forces uv to re-download and potentially rebuild everything on next install
  uv cache clean
  ```

  Clearing the cache can be useful for troubleshooting issues that might be caused by corrupted cache entries, or to reclaim disk space. However, a full clean will negate `uv`'s speed benefits until the cache is repopulated.
Cache Invalidation

`uv` automatically handles cache invalidation in most cases. It knows when a cached artifact is no longer suitable based on changes in requirements (e.g., requesting a different version), the Python interpreter version, or other context.

However, edge cases might exist, especially related to system-level changes that `uv` might not be aware of (e.g., updating a system C library that a previously built wheel depends on). If you suspect a stale cache entry is causing problems, clearing the cache for the problematic package(s) or performing a full `uv cache clean` is a reasonable troubleshooting step.

Forcing Re-downloads or Rebuilds (`--no-cache`, `--refresh`)
Sometimes, you might want to bypass the cache intentionally, perhaps to ensure you're getting the absolute latest metadata from an index or to force a rebuild.
- `--no-cache`: This flag, used with commands like `uv pip install`, tells `uv` to avoid using the cache for downloads. It will always fetch packages from the index, but it might still use cached builds if available.
- `--refresh` / `--refresh-package <PACKAGE>`: These flags, used with `uv pip install` or `uv pip sync`, force `uv` to re-validate cached metadata for packages against the remote index. This can be useful if the index has changed since the metadata was last cached.
- `--no-build-isolation` / `--no-build` (use with care): These flags influence the build process and might indirectly affect caching, but their primary purpose is different. Forcing a rebuild specifically, without other side effects, is often best achieved by clearing the relevant cache entry (`uv cache clean <package>`) and reinstalling.
Workshop: Observing Cache Performance and Management

Goal: Observe the performance difference `uv`'s cache makes, explore the cache directory, and practice cleaning the cache.

Scenario: You will install a large package known for having compiled components (`numpy`), first with a cold cache, then with a warm cache. You will then inspect the cache and clean it.

Steps:
1. Clear the `uv` Cache (Start Fresh):
   - To ensure a fair comparison, completely clear any existing `uv` cache data.
   - Verify the cache directory is now mostly empty (some structure might remain):
2. Create First Environment and Install `numpy` (Cold Cache):
   - Set up a new environment:
   - Time the installation of `numpy`. Use the `time` command (a shell built-in or `/usr/bin/time`).
   - Note the time taken (particularly the "real" time). This installation involves downloading `numpy` (and potentially building it if no pre-built wheel is available for your exact platform/Python version) and populating the cache.
3. Inspect the Cache Contents:
   - Deactivate the environment for now.
   - Explore the cache directory again. Pay attention to the `archive` and `build` subdirectories.
4. Create Second Environment and Install `numpy` (Warm Cache):
   - Create a completely separate environment:
   - Time the installation of the same version of `numpy` again:
   - Crucially, compare the time taken now to the first installation. It should be significantly faster. `uv` should have found the necessary `numpy` wheel (either downloaded or built) in its cache (`~/.cache/uv`) and reused it directly.
5. Clean Cache for a Specific Package:
   - Deactivate the environment.
   - Clean only the cache entries related to `numpy`:
   - Verify that `numpy`-related files are gone from the cache (e.g., check `archive` and `build` again). Other cached items should remain.
6. Re-install (Should Be Slower Again):
   - Activate one of the environments again (e.g., `cache_test_1`):
   - Uninstall `numpy` first (to force re-installation):
   - Time the installation one more time:
   - The time should be closer to the initial "cold cache" time, as `uv` needs to re-download or potentially rebuild `numpy` since its cache entry was removed.
7. Cleanup:

Outcome: You have directly experienced the significant performance impact of `uv`'s caching. You observed how installations are much faster when artifacts are cached, explored the cache directory structure, and learned how to manage the cache using `uv cache clean` for specific packages or for the entire cache.
8. Configuration and Customization

While `uv` aims for sensible defaults, you can customize its behavior through environment variables and command-line flags. This is especially useful for specific project needs, CI/CD pipelines, or working with private package indexes.

Environment Variables

Several environment variables can influence `uv`'s operation:
- `UV_CACHE_DIR`: Overrides the default cache directory location (`~/.cache/uv` on Linux). This is useful if your home directory has limited space or if you want to place the cache on a faster storage device.

  ```shell
  # Run uv using a custom cache location for this command only
  UV_CACHE_DIR=/tmp/uv_cache_alt uv pip install requests

  # Set it for the current shell session
  export UV_CACHE_DIR=/mnt/fast_ssd/uv_cache
  uv pip install ...  # Installs will now use /mnt/fast_ssd/uv_cache
  unset UV_CACHE_DIR  # Return to default
  ```
- `UV_INDEX_URL`: Sets the primary Python package index URL, overriding the default (PyPI). Equivalent to the `--index-url` flag.
- `UV_EXTRA_INDEX_URL`: Specifies additional index URLs to consult, separated by spaces. Equivalent to the `--extra-index-url` flag.
- `UV_NO_CACHE`: If set to a non-empty value (e.g., `1`), acts like the `--no-cache` flag for all `uv` invocations.
- `UV_NATIVE_TLS` / `UV_CUSTOM_CA`: Control TLS (HTTPS) settings. `UV_NATIVE_TLS=1` forces use of the operating system's native TLS implementation instead of `rustls`. `UV_CUSTOM_CA=/path/to/ca.pem` specifies a custom certificate authority bundle, useful for corporate proxies or private indexes using self-signed certificates. Equivalent to the `--native-tls` and `--custom-ca` flags.
- `VIRTUAL_ENV`: While not specific to `uv`, `uv` respects this standard environment variable. If `VIRTUAL_ENV` is set (typically done automatically by `source .venv/bin/activate`), `uv pip install` and other commands will operate on that environment by default.
Using environment variables is often convenient for setting persistent configurations within a specific shell session, or for configuring `uv` within CI/CD pipeline definitions.

Command-Line Flags for Customization

Many configuration options are available as flags for individual command invocations:
- Verbosity:
  - `-v`, `--verbose`: Increase output detail. Can be repeated (`-vv`, `-vvv`) for more verbosity, useful for debugging.
  - `-q`, `--quiet`: Decrease output detail. Can be repeated (`-qq`) for near silence (only errors). Useful for scripts.
- Caching:
  - `--no-cache`: Disables the download cache (still uses the build cache).
  - `--cache-dir <DIR>`: Specifies a cache directory for this run.
- Networking & Indexes:
  - `--index-url <URL>`: Sets the primary package index URL.
  - `--extra-index-url <URL>`: Adds an extra index URL (can be used multiple times).
  - `--find-links <PATH>`: Looks for packages in a local directory or HTML file instead of an index.
  - `--native-tls`: Use the OS's native TLS implementation.
  - `--custom-ca <FILE>`: Path to a custom CA bundle file.
  - `--offline`: Run in offline mode. Fails if packages need to be downloaded; relies entirely on the cache.
- Dependency Handling:
  - `--no-deps`: Install packages without installing their dependencies.
  - `--require-hashes`: Abort installation if any requirement lacks a `--hash` (used with `install` and `sync`).
- Environment Creation (`uv venv`):
  - `--python <PYTHON>`: Specify a Python interpreter version or path.
  - `--seed`: Install `pip`, `setuptools`, and `wheel` into the new environment.
  - `--system-site-packages`: Give the virtual environment access to the system's site-packages (generally discouraged).
Consult `uv --help`, `uv pip --help`, `uv venv --help`, etc., for a complete list of flags for each subcommand.
Integration with Other Tools

- `pre-commit`: You can use `uv` within your `pre-commit` hooks for tasks like linting or formatting, potentially speeding up hook execution if dependencies need installation. Define a hook that uses `uv pip install` in its `entry`.
- CI/CD Pipelines (e.g., GitHub Actions, GitLab CI): `uv` is ideal for CI.
  - Install `uv`: Use the `curl | sh` script, or download a specific release binary.
  - Cache `uv`'s cache: Use the CI platform's caching mechanism to persist `UV_CACHE_DIR` between runs. This dramatically speeds up dependency installation steps after the first run.
  - Install Dependencies: Use `uv pip sync requirements.txt` (with a compiled, pinned file) for fast and reproducible environment setup.
Using Private Package Indexes

Many organizations host their own Python packages on private index servers (like `devpi`, `Nexus`, `Artifactory`, or `pypiserver`). `uv` supports these seamlessly:

- Using only a private index: Set the primary index URL.
- Using PyPI and a private index: Set the primary index to PyPI (or leave it as the default) and add the private index as an extra. `uv` will check the primary first, then the extras.
- Authentication: If your private index requires authentication, `uv` currently relies on standard methods like embedding credentials in the URL (`https://user:password@...`) or using tools like `keyring`. Support for authentication mechanisms might evolve; check `uv`'s documentation for the latest recommendations. Use secure methods for handling credentials, especially in CI/CD.
- Custom CA Certificates: If your private index uses TLS/SSL certificates signed by an internal Certificate Authority, use the `--custom-ca` flag or the `UV_CUSTOM_CA` environment variable to point `uv` to the appropriate CA bundle file.
Workshop: CI/CD Simulation and Private Index

Goal: Simulate a CI pipeline step using `uv` with specific flags, and configure `uv` to install a package from a local mock PyPI server.

Scenario: You'll create a basic shell script mimicking a CI job that installs dependencies quietly and uses a specific cache directory. Then you'll set up a simple local PyPI server, upload a dummy package, and install it using `uv` configured for that local index.

Part 1: CI Simulation
1. Create Project Files:

2. Create CI Script (`ci_step.sh`):

   ```shell
   #!/bin/bash
   set -e  # Exit immediately if a command exits with a non-zero status.

   echo "--- Setting up Environment ---"
   # Define a temporary cache directory for this 'run'
   export UV_CACHE_DIR=$(pwd)/.uv_cache_ci
   mkdir -p $UV_CACHE_DIR
   echo "Using cache directory: $UV_CACHE_DIR"

   # Create virtual environment
   uv venv .venv --quiet  # Use quiet flag

   # Activate (needed for subsequent uv commands to target the env implicitly)
   source .venv/bin/activate

   echo "--- Installing Dependencies ---"
   # Install using sync for reproducibility, quietly
   # Use -vv for debugging if needed
   time uv pip sync requirements.txt --quiet

   echo "--- Running Checks (Placeholder) ---"
   # In a real CI, you'd run tests, linters etc.
   uv pip list --quiet
   echo "Dependencies installed successfully."

   echo "--- Cleaning Up ---"
   deactivate
   # In a real CI, cache $UV_CACHE_DIR here based on requirements.txt hash
   echo "CI step finished."
   ```

   - Make the script executable: `chmod +x ci_step.sh`
3. Run the CI Script:
   - Observe the output. It should be less verbose due to `--quiet`. Note the timing for the `sync` command. Run it a second time: the `sync` step should be much faster, because the `.uv_cache_ci` directory acts as the cache for this simulated run.
Part 2: Private Index Simulation

1. Install `pypiserver`: You'll need a simple PyPI server. Install it globally or in a dedicated environment (using `pip` or `uv pip`):

2. Create a Dummy Package:

   ```shell
   mkdir dummy_package
   cd dummy_package
   mkdir my_dummy_pkg
   touch my_dummy_pkg/__init__.py
   echo "VERSION = '0.1.0'" > my_dummy_pkg/version.py
   echo "def greet(): return 'Hello from Dummy Package'" >> my_dummy_pkg/__init__.py

   cat << EOF > setup.py
   from setuptools import setup, find_packages

   setup(
       name='my-dummy-pkg',
       version='0.1.0',
       packages=find_packages(),
       description='A simple dummy package',
   )
   EOF

   # Build the package (wheel) - requires build tools
   # Ensure 'build' is installed in your active env or globally
   # uv pip install build
   python -m build --wheel
   # This creates a 'dist' directory with the .whl file
   cd ..
   ```
3. Set up Package Directory for `pypiserver`:

4. Run `pypiserver`:
   - Open a new terminal window.
   - Navigate to the directory containing `local_pypi_packages`.
   - Run the server.
   - It will typically start serving on `http://localhost:8080`. Keep this terminal running.

5. Install from Local Index using `uv`:
   - Go back to your original terminal (the `ci_simulation` directory, or a new one).
   - Create/activate a test environment.
   - Attempt to install the dummy package; this should fail, as it's not on PyPI.
   - Now, install specifying your local index URL and instructing `uv` to trust the HTTP connection, since `pypiserver` runs on HTTP by default. (Note: the `--trusted-host` flag is pip's way; `uv` might handle this implicitly or require different handling for non-HTTPS indexes depending on the version. If `--trusted-host` isn't a `uv` flag, try setting `UV_INDEX_URL=http://localhost:8080/simple` and running `uv pip install my-dummy-pkg`. Check the `uv` documentation if needed.)
   - Verify installation.
6. Cleanup:
   - Stop the `pypiserver` (Ctrl+C in its terminal).
   - Deactivate the environment in the main terminal and `cd ../..`.
   - Optional: `rm -rf ci_simulation local_index_test dummy_package local_pypi_packages`
   - If you installed `pypiserver` via `pipx`: `pipx uninstall pypiserver`. If via `uv`, remove the `pypiserver_env` directory.
Outcome: You've simulated a basic CI dependency installation step using `uv` with customization flags (`--quiet`, `UV_CACHE_DIR`). You also successfully set up a local PyPI server, built and served a package, and configured `uv` using `--index-url` to install from that private source. This demonstrates `uv`'s flexibility in different environments and workflows.
9. Integration with Build Backends and pyproject.toml

Modern Python packaging heavily relies on the `pyproject.toml` file (introduced in PEP 518 and expanded by subsequent PEPs like 517 and 621). This file serves as a central configuration point for build system requirements, the build backend itself, and project metadata. `uv` is designed to work seamlessly within this ecosystem.

The Role of `pyproject.toml`

This TOML file standardizes how Python projects declare their build dependencies and choose their build backend.
- `[build-system]` Table (PEP 518):
  - `requires`: A list of packages needed to build your project (e.g., `setuptools`, `wheel`, `hatchling`, `flit_core`).
  - `build-backend`: The Python object (usually provided by a package listed in `requires`) that build frontends (like `uv`, `pip`, `build`) should call to perform the build (e.g., build a wheel or sdist). Common backends include `setuptools.build_meta`, `hatchling.build`, and `flit_core.buildapi`.
  - `backend-path`: An optional list of directories to add to `sys.path` when calling the backend.
- `[project]` Table (PEP 621):
  - Standardizes how project metadata (name, version, description, dependencies, authors, license, etc.) is specified directly in `pyproject.toml`, reducing reliance on `setup.py` or `setup.cfg` for metadata.
  - Example:

    ```toml
    [project]
    name = "my-cool-package"
    version = "1.0.0"
    description = "A brief description."
    readme = "README.md"
    requires-python = ">=3.8"
    license = { file = "LICENSE" }
    authors = [
        { name = "Ada Lovelace", email = "ada@example.com" }
    ]
    dependencies = [
        "requests>=2.20",
        "rich",
        "importlib-metadata; python_version<'3.10'",  # Environment marker example
    ]

    [project.optional-dependencies]
    dev = ["pytest", "ruff"]
    ```
How `uv` Interacts with Build Backends

When `uv` needs to install a package that doesn't have a pre-built wheel available on the index for your platform/Python version, or when you install a local project in editable mode (`uv pip install -e .`), it needs to build the package from source:

1. Identify Build System: `uv` reads the `pyproject.toml` file of the package being installed.
2. Isolate Build Environment: It creates a temporary, isolated build environment.
3. Install Build Dependencies: It installs the packages listed in `[build-system].requires` (e.g., `setuptools`, `hatchling`) into this isolated environment using its own fast installer.
4. Invoke Build Backend: It calls the specified `build-backend` functions (defined by PEP 517) within the isolated environment to perform actions like:
   - `build_wheel()`: Builds a `.whl` file.
   - `build_sdist()`: Builds a source distribution (`.tar.gz`).
   - `prepare_metadata_for_build_wheel()`: Generates metadata without a full build (used for dependency resolution).
5. Install Result: `uv` takes the built artifact (usually a wheel) and installs it into the target virtual environment.

Because `uv` uses its own fast mechanisms for steps 3 and 5, even builds requiring backend invocation can be faster than with traditional `pip`, especially if the build dependencies themselves are complex or numerous. `uv` also caches built wheels (in `UV_CACHE_DIR/build`), significantly speeding up subsequent installations of the same source package.
Using `uv` in `pyproject.toml`-based Projects

For projects defined primarily by `pyproject.toml`:

- Environment Creation: Use `uv venv .venv` as usual.
- Installing Dependencies:
  - If your project's runtime dependencies are listed in `[project].dependencies` in `pyproject.toml`, you can often install the project itself (which includes its dependencies) directly.
  - For editable installs during development, `uv` uses the build backend specified in `pyproject.toml` to set up the editable install.
  - Optional dependencies (e.g., for development or testing) can be installed via extras.
- Generating Pinned Requirements: You can use `uv pip compile` directly with `pyproject.toml` as input (support may depend on the `uv` version; check the documentation):

  ```shell
  # Generate requirements.txt from pyproject.toml dependencies
  uv pip compile pyproject.toml -o requirements.txt

  # Include optional dependencies (e.g., 'dev' group)
  uv pip compile pyproject.toml --extra dev -o dev-requirements.txt
  ```

  Then use `uv pip sync requirements.txt` for installation. This approach is often preferred for locking application dependencies.
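The install commands for the bullets above were lost in this copy; they would be along these lines (the `dev` extra name matches the earlier `[project.optional-dependencies]` example):

```shell
# Install the project and its runtime dependencies
uv pip install .

# Editable install for development
uv pip install -e .

# Include optional dependencies
uv pip install -e ".[dev]"
```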
Building Wheels and Sdists with `uv` (`uv wheel`)

Beyond just installing, `uv` provides a command to directly build wheels and source distributions for your own project, leveraging the information in `pyproject.toml`.

- Building a Wheel: This command performs steps similar to the build process during installation: it reads `pyproject.toml`, sets up an isolated build environment with the build dependencies, invokes the `build_wheel` function of the specified backend, and places the resulting `.whl` file in the `dist` directory (or the default location if `--out-dir` is omitted).
- Building an Sdist:

This provides a fast and convenient way to build your package artifacts for distribution or testing, using the same consistent `uv` tooling.
Future `uv` Project Management Features (Speculative)

Astral has indicated ambitions for `uv` to evolve into a more comprehensive project and workflow management tool, possibly incorporating features currently found in tools like Poetry or PDM directly into `uv`. This might include more integrated commands for adding/removing dependencies directly in `pyproject.toml`, managing project versions, publishing packages, and running scripts, all under the unified, high-performance `uv` umbrella. However, as of early 2024, `uv` primarily focuses on being an extremely fast installer, environment manager, and `pip-tools` replacement. Keep an eye on Astral's announcements for future developments.
### Workshop: Building a `pyproject.toml`-based Project with `uv`

**Goal:** Create a simple Python project using `pyproject.toml` with the `hatchling` build backend, install it editably using `uv`, and build a wheel using `uv wheel`.

**Scenario:** You are starting a new project and want to use modern packaging standards (`pyproject.toml` and `hatchling`) and manage it with `uv`.

Steps:

1. Project Setup:

   ```bash
   mkdir uv_project_build
   cd uv_project_build
   # Create source directory
   mkdir -p src/my_proj
   touch src/my_proj/__init__.py
   # Add a simple function
   echo "def main(): print('Hello from uv_project_build!')" > src/my_proj/main.py
   echo "# My UV Project" > README.md
   echo "MIT License..." > LICENSE
   ```
2. Create `pyproject.toml`:
   - Define build system requirements (using `hatchling`) and project metadata:

     ```toml
     # pyproject.toml
     [build-system]
     requires = ["hatchling"]
     build-backend = "hatchling.build"

     [project]
     name = "my-proj"
     version = "0.1.0"
     description = "A sample project built with uv and hatch."
     readme = "README.md"
     requires-python = ">=3.8"
     license = { file = "LICENSE" }
     authors = [
         { name = "Your Name", email = "your@email.com" }
     ]
     dependencies = [
         "rich>=13.0"  # Add a simple dependency
     ]

     # Optional: Define scripts/entry points if needed
     # [project.scripts]
     # my-proj-cli = "my_proj.main:main"

     # Optional: Configure hatch settings if needed
     # [tool.hatch.version]
     # path = "src/my_proj/__init__.py"  # Example: manage version in __init__.py
     ```
3. Set up Virtual Environment:
   - Create a fresh virtual environment in the project directory and activate it.
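For example (Bash; the `.venv` name is the common convention, not a requirement):

```shell
# Create and activate a fresh environment in the project root
uv venv .venv
source .venv/bin/activate
```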
4. Install in Editable Mode:
   - Use `uv pip install -e .` to install the current project editably.
   - Observe the output. `uv` should:
     - Recognize the `pyproject.toml`.
     - Install build dependencies (`hatchling`).
     - Invoke `hatchling`'s backend to set up the editable install.
     - Install runtime dependencies (`rich`).
   - Verify installation: use `uv pip list` and confirm that `my-proj` appears (as an editable install) alongside `rich`.
5. Build the Wheel:
   - Use the `uv wheel` command to build a distributable wheel file.
   - Observe the output. Again, `uv` will likely use `hatchling` under the hood.
   - Check the contents of the `dist` directory (`ls dist/`).
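A sketch of the build-and-inspect commands (the `uv wheel` form follows the text above; flags may vary by `uv` version, and newer releases use `uv build --wheel` instead):

```shell
# Build the wheel (exact flags may vary by uv version)
uv wheel --out-dir dist
# or, on newer releases:
# uv build --wheel

ls dist/
# Expect something like: my_proj-0.1.0-py3-none-any.whl
```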
6. Inspect the Wheel (Optional):
   - You can unzip the wheel file (it's just a zip archive) to see its contents.
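Alternatively, since wheels are plain zip archives, Python's standard `zipfile` module can list the contents without extracting anything. A small sketch (the wheel path is an assumption based on the metadata above):

```python
import zipfile

def list_wheel_contents(wheel_path: str) -> list:
    """Return the member names of a wheel; wheels are ordinary zip archives."""
    with zipfile.ZipFile(wheel_path) as whl:
        return whl.namelist()

# Example, using the wheel built above (path assumed):
# for name in list_wheel_contents("dist/my_proj-0.1.0-py3-none-any.whl"):
#     print(name)  # e.g., my_proj/__init__.py, my_proj-0.1.0.dist-info/METADATA
```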
7. Cleanup:
   - Deactivate the environment and remove the `uv_project_build` directory when you are done.
**Outcome:** You have successfully created a Python project using `pyproject.toml` and the `hatchling` backend. You used `uv` to install the project in editable mode (demonstrating integration with the build backend) and used `uv wheel` to build a distributable wheel package, showcasing `uv`'s capabilities beyond just consuming packages.
## 10. Troubleshooting Common Issues

While `uv` is generally robust and fast, like any tool, you might occasionally encounter issues. Understanding common problems and how to diagnose them using `uv`'s features can save you significant time.
### Installation Errors

- **Problem:** `uv: command not found` after installation.
  - Cause: The directory where `uv` was installed (`~/.local/bin`, `~/.cargo/bin`) is not in your shell's `PATH` environment variable, or the shell session hasn't been updated.
  - Solution:
    - Verify the installation directory mentioned during the install process.
    - Check your `PATH`: `echo $PATH`.
    - If the directory is missing, add it to your shell configuration file (`~/.bashrc`, `~/.zshrc`, `~/.profile`, etc.). Example for `~/.bashrc`: `export PATH="$HOME/.local/bin:$PATH"`.
    - Reload your shell configuration (`source ~/.bashrc`) or open a new terminal window.
    - Verify again with `uv --version`.
- **Problem:** Errors during installation via `curl | sh` (e.g., network issues, permissions).
  - Cause: Network connectivity problems, restrictive firewalls, or insufficient permissions in the target installation directory.
  - Solution:
    - Check your network connection.
    - Try downloading the script first (`curl ... -o install.sh`) and inspecting it before running it (`sh install.sh`).
    - Ensure you have write permissions in the target directory (e.g., `~/.local/bin`).
    - Consider using `pipx` or `cargo` if the script method fails consistently.
### Package Installation / Dependency Resolution Failures

- **Problem:** `uv pip install` fails with errors related to missing compilers or system libraries (e.g., `gcc: command not found`, `error: <libxyz.h> not found`, `Failed to build <package>`).
  - Cause: The package being installed (or one of its dependencies) contains C/C++/Rust extensions that need to be compiled, but the required compiler or development headers/libraries are missing on your Linux system.
  - Solution:
    - Read the error message carefully. It usually indicates which compiler or library is missing.
    - Install the necessary system packages. Common ones include:
      - For C/C++ extensions: `build-essential`, `gcc`, `g++`, `make` (Debian/Ubuntu); `base-devel`, `gcc` (Arch Linux); or the `@development-tools` group (Fedora/CentOS).
      - For specific libraries: `python3-dev` (Debian/Ubuntu) or `python-devel` (Fedora/CentOS) is almost always needed for Python C extensions. Look for `-dev` or `-devel` packages corresponding to the library mentioned in the error (e.g., `libssl-dev`, `libffi-dev`, `libxml2-dev`).
    - Retry the `uv pip install` command.
- **Problem:** `uv pip install` or `uv pip compile` fails with a dependency conflict message (e.g., `ResolutionImpossible`, `package X requires Y>=1.0 but package Z requires Y<1.0`).
  - Cause: Your direct or indirect requirements specify incompatible versions of the same package.
  - Solution:
    - Analyze `uv`'s error output. It's usually quite good at explaining which packages have conflicting requirements for which dependency.
    - Examine your requirements files (`requirements.in`, `pyproject.toml`).
    - Adjust your requirements:
      - Can you relax the constraints on one of the conflicting packages (e.g., change `Y<1.0` to `Y<1.1` if `X` needs `Y==1.0`)?
      - Can you upgrade/downgrade the top-level package that's bringing in the problematic constraint?
      - Is there a newer version of one of the top-level packages that resolves the conflict?
    - Use `uv pip compile` iteratively: make changes to `requirements.in` and re-run `compile` until the resolution succeeds.
- **Problem:** Network errors during package download (`Could not connect to host`, `TLS error`).
  - Cause: Internet connectivity issues, firewalls blocking access to PyPI, VPN problems, incorrect proxy settings, or issues with TLS certificate verification (especially with private indexes or corporate proxies).
  - Solution:
    - Check basic network connectivity (`ping pypi.org`).
    - Check firewall/proxy settings. Configure `uv` with proxy details if necessary (often via standard env vars like `HTTP_PROXY`, `HTTPS_PROXY`).
    - For TLS errors with private indexes: point `uv` at your custom CA bundle via the `SSL_CERT_FILE` environment variable.
    - If using a corporate network with TLS interception/inspection, you might need to pass `--native-tls` so `uv` uses the system certificate store, or configure the custom CA as above.
    - Try using the `--verbose` (`-v`) flag with `uv` for more detailed network/TLS error messages.
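For instance, in a corporate-proxy environment the relevant settings might look like this (all addresses and paths below are placeholders, not real values):

```shell
# Route uv's traffic through a proxy (placeholder address)
export HTTPS_PROXY="http://proxy.example.com:8080"
# Trust a custom CA bundle (for TLS-inspecting proxies; placeholder path)
export SSL_CERT_FILE="/etc/ssl/certs/corporate-ca.pem"
# Or fall back to the operating system's trust store, with verbose output
uv pip install -v --native-tls requests
```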
### Virtual Environment Issues

- **Problem:** `activate` script not found or doesn't work.
  - Cause: Environment created incorrectly, wrong path, or using the wrong activation command for your shell.
  - Solution:
    - Ensure the virtual environment directory (e.g., `.venv`) exists and looks correct (`ls .venv`).
    - Ensure you are in the correct directory relative to the environment.
    - Use the correct activation command: `source .venv/bin/activate` (Bash/Zsh), `source .venv/bin/activate.fish` (Fish), etc. Do not just run `.venv/bin/activate` directly.
- **Problem:** Packages installed outside the active virtual environment.
  - Cause: The virtual environment was not actually active when `uv pip install` was run.
  - Solution:
    - Always check that your shell prompt indicates the environment is active (e.g., `(.venv)`) before running install commands.
    - Run `which python` and `which uv` (or `which pip`) to confirm they point inside the `.venv/bin` directory.
    - If inactive, activate it: `source .venv/bin/activate`.
### Cache-Related Problems

- **Problem:** Installation succeeds, but the installed package behaves strangely or crashes, potentially due to a corrupted artifact.
  - Cause: A rare cache corruption issue (e.g., incomplete download, disk error).
  - Solution:
    - Clear the cache for the specific problematic package: `uv cache clean <package_name>`.
    - Retry the installation (`uv pip install <package_name>` or `uv pip sync ...`).
    - If problems persist, try clearing the entire cache (`uv cache clean`), then reinstall dependencies.
- **Problem:** Outdated package metadata seems to be used, preventing installation of a newly released version.
  - Cause: The cached index/registry metadata hasn't been updated.
  - Solution:
    - Try forcing a refresh during installation: `uv pip install --refresh <package_name>` or `uv pip install --refresh-package <package_name>`.
    - As a more forceful step, clear the cache (`uv cache clean`) and try again.
### Using Verbose Output for Diagnostics

When troubleshooting, `uv`'s verbose flags are invaluable:

- `uv -v pip ...`: Provides more detailed output about the steps being taken (e.g., checking the cache, downloading, building).
- `uv -vv pip ...`: Even more detail, often including information about dependency resolution steps, build environment setup, and backend calls.
- `uv -vvv pip ...`: Maximum verbosity, potentially including debug-level information.

Pipe the output to a file (`uv -vv pip install ... > output.log 2>&1`) for easier analysis of complex issues.
### Workshop: Troubleshooting Simulated Problems

**Goal:** Practice diagnosing and fixing common issues like missing build dependencies and dependency conflicts using `uv`'s output and commands.

#### Scenario 1: Missing Build Dependency

1. Setup:
   - Create a new environment and activate it.
   - Ensure you do not have the primary C compiler and Python development headers installed system-wide (this might be tricky to simulate perfectly if they are already present, but proceed anyway). Common packages to uninstall temporarily (use with caution, ideally in a VM): `sudo apt-get remove build-essential python3-dev` (Debian/Ubuntu) or `sudo pacman -R base-devel python` (Arch - be careful!). Or simply proceed and observe the error `uv` gives.
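For example (the environment name here is arbitrary):

```shell
# Create and activate a throwaway environment for the experiment
uv venv troubleshoot-env
source troubleshoot-env/bin/activate
```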
2. Attempt Installation:
   - Try installing a package known to require compilation, such as `cryptography` or `lxml` (e.g., `uv pip install lxml`).
3. Analyze Error:
   - The command will likely fail with errors mentioning `gcc` or `<Python.h>` not found.
   - Read the error messages carefully. Note keywords like `error: command 'gcc' failed` and `fatal error: Python.h: No such file or directory`.
4. Fix:
   - Install the missing system dependencies.
   - Retry the installation; it should now succeed (though it might take time to compile).
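On Debian/Ubuntu, the fix-and-retry might look like this (package names follow the typical error messages; `lxml` is assumed as the failing package):

```shell
# Install compiler toolchain and Python headers
sudo apt-get update
sudo apt-get install -y build-essential python3-dev
# lxml additionally needs the libxml2/libxslt headers
sudo apt-get install -y libxml2-dev libxslt1-dev
# Retry the failing install
uv pip install lxml
```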
5. Cleanup:
   - Deactivate and remove the test environment when finished.
#### Scenario 2: Dependency Conflict

1. Setup:
   - Create a new environment and activate it.
   - Create a `requirements.txt` with conflicting constraints:

     ```text
     # requirements.txt
     requests==2.25.0
     # A library hypothetical==1.0 requires requests>=2.30.0
     # We simulate this by adding another direct dependency
     # that requires a newer requests
     requests-oauthlib>=1.3.0  # This usually needs newer requests
     ```

     (Note: Finding real-world conflicts can be tricky; this simulates the idea. `requests-oauthlib` might not strictly conflict with `requests==2.25.0` in all versions, but it illustrates the point.) Let's try a more direct conflict:
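For instance, Flask 1.1.0 declares `Werkzeug>=0.15,<2.0`, so the following pair of pins cannot be satisfied together:

```text
# requirements.txt
flask==1.1.0
werkzeug==2.0.0
```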
2. Attempt Installation / Compilation:
   - Try installing or compiling from this file, e.g., `uv pip compile requirements.txt -o resolved.txt`.
3. Analyze Error:
   - `uv` should fail and report a `ResolutionImpossible` error.
   - Examine the output. It should state something like:
     - `flask==1.1.0` depends on `Werkzeug>=0.15,<2.0`
     - You requested `werkzeug==2.0.0`
     - These constraints are incompatible.
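To see mechanically why resolution must fail, here is a stdlib-only toy check of the pinned Werkzeug version against Flask 1.1.0's declared range (a simplified model for illustration; real resolvers implement full PEP 440 version semantics, not plain numeric tuples):

```python
def satisfies(version: str, lower_incl: str, upper_excl: str) -> bool:
    """Check lower_incl <= version < upper_excl on numeric dotted versions."""
    parse = lambda s: tuple(int(part) for part in s.split("."))
    return parse(lower_incl) <= parse(version) < parse(upper_excl)

# Flask 1.1.0 requires Werkzeug>=0.15,<2.0; we pinned werkzeug==2.0.0
print(satisfies("2.0.0", "0.15", "2.0"))  # False: no version satisfies both
```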
4. Fix:
   - Edit `requirements.txt` to resolve the conflict. You could either:
     - Upgrade Flask: `flask>=2.1` (which is compatible with Werkzeug 2.0.0), or
     - Downgrade Werkzeug: `werkzeug<2.0` (and potentially pin Flask to `flask==1.1.0`).
   - Let's try upgrading Flask: change the `flask==1.1.0` line to `flask>=2.1`.
   - Retry the command; it should now succeed. Examine `resolved.txt`.
5. Cleanup:
   - Deactivate and remove the environment and requirements files when done.
**Outcome:** You have practiced identifying and resolving common `uv` issues: missing system build dependencies (by installing required packages) and dependency conflicts (by analyzing `uv`'s error messages and adjusting requirements files). You also learned the importance of verbose output and cache cleaning in troubleshooting.
## Conclusion

Throughout this guide, we've explored `uv`, Astral's high-performance Python package installer and virtual environment manager. From basic installation on Linux to advanced concepts like caching, `pyproject.toml` integration, and troubleshooting, you've seen how `uv` aims to significantly streamline and accelerate common Python development workflows.

Key Takeaways:

- **Speed:** `uv`'s Rust implementation, asynchronous operations, and intelligent caching make it orders of magnitude faster than traditional tools like `pip` and `venv` for many operations, especially dependency resolution and installation with a warm cache.
- **Unified Tooling:** It combines package installation (`uv pip ...`) and virtual environment management (`uv venv ...`) into a single, cohesive command-line interface.
- **Compatibility:** `uv` works seamlessly with existing standards like `requirements.txt` files and `pyproject.toml`, ensuring smooth adoption into current projects.
- **Modern Features:** It incorporates best practices like robust dependency resolution (similar to `pip-tools`, Poetry, and PDM), hash checking, and efficient environment synchronization (`uv pip sync`).
- **Active Development:** Backed by Astral (developers of Ruff), `uv` is under active development, continuously improving and potentially expanding its scope towards more comprehensive project management in the future.

For Linux users, particularly those working on large projects, in CI/CD pipelines, or simply valuing development efficiency, `uv` presents a compelling alternative to the standard Python tooling. Its performance benefits can lead to substantial time savings, while its compatibility ensures it fits well within the established ecosystem.

While `uv` is still relatively young compared to `pip`, its performance, design, and the reputation of its developers suggest it is a tool worth adopting and watching closely. By mastering its commands and understanding its core concepts like caching and dependency resolution, you can significantly enhance your Python development experience on Linux. We encourage you to integrate `uv` into your projects and experience the speed boost firsthand.