This guide explains one of the most important pieces of practical Python knowledge: how Python packages are installed, where they go, why virtual environments matter, and why this course teaches the standard model first before recommending uv for ongoing work.
It is not enough to memorize commands. You should understand the model underneath them.
In everyday Python usage, people often use "package" to mean two slightly different things:
- an import package: code you import in Python, such as `requests`, `pytest`, or `flask`
- a distribution package: something installed into an environment from a package index such as PyPI
These are related, but not identical.
For example:
```python
import requests
```

You import `requests` as Python code. But to make that possible, you first install a distribution package into an environment.
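One way to see the import-package side of this distinction is to ask Python whether a module can be imported at all. This is a minimal sketch using the standard library's `importlib.util.find_spec`, which returns `None` when no import package by that name is visible to the current environment:

```python
import importlib.util

def is_importable(module_name: str) -> bool:
    """Return True if an import package with this name can be found."""
    return importlib.util.find_spec(module_name) is not None

# The standard library is always available:
print(is_importable("json"))  # True
# A third-party import package is only found once a distribution
# package providing it has been installed into this environment:
print(is_importable("requests"))
```

The second call prints `True` or `False` depending on whether a distribution package providing `requests` has been installed where this interpreter looks.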
PyPI stands for the Python Package Index. It is the main public repository of Python packages.
When you run a command like:
```
python -m pip install requests
```

or:

```
uv add requests
```

your tooling downloads package metadata and files from package indexes, usually PyPI unless you configure something else.
Installed third-party packages usually end up inside an environment's site-packages directory.
That means:
- if you install globally, packages go into your system or user Python installation
- if you install inside a virtual environment, packages go into that virtual environment instead
That is why the same computer can have different projects using different versions of the same package without conflict.
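You can ask the interpreter itself where installed packages would go. A small sketch using the standard library's `sysconfig` module; `"purelib"` is the install location for pure-Python packages, and inside a virtual environment it points into that environment's own directory rather than the system-wide one:

```python
import sys
import sysconfig

# The directory where pure-Python third-party packages are installed.
purelib = sysconfig.get_paths()["purelib"]
print(purelib)

# In most setups this path lives under the active interpreter's prefix,
# which is what makes per-environment isolation work.
print(purelib.startswith(sys.prefix))
```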
A virtual environment is an isolated Python installation for one project.
It solves real problems:
- Project A can use one package version while Project B uses another.
- You avoid polluting your system Python with project-specific dependencies.
- You can recreate a project environment from dependency files instead of guessing what was installed.
- It becomes much easier to debug "it works on my machine" problems.
Without a virtual environment, installing packages into your base Python can eventually create version conflicts and confusion.
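A quick way to check whether the interpreter you are running belongs to a virtual environment: inside a venv, `sys.prefix` points into the environment while `sys.base_prefix` still points at the base installation. A minimal sketch:

```python
import sys

def in_virtual_env() -> bool:
    """True when this interpreter was started from a virtual environment."""
    # In a venv the two prefixes differ; in a base installation they match.
    return sys.prefix != sys.base_prefix

print(in_virtual_env())
```

This is handy at the top of setup scripts that should refuse to install into a base Python.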
The standard Python model is still important to learn first.
Create a virtual environment:
```
python -m venv .venv
```

Activate it on Windows:

```
.venv\Scripts\activate
```

Activate it on macOS/Linux:

```
source .venv/bin/activate
```

Install a package with pip:

```
python -m pip install requests
```

Check what is installed:

```
python -m pip list
```

Save dependencies:

```
python -m pip freeze > requirements.txt
```

This model remains foundational because many tools, tutorials, and deployment workflows still build on it. You should understand it even if you later prefer a smoother tool.
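The `python -m venv .venv` command is backed by the standard library's `venv` module, so an environment can also be created programmatically. This sketch builds a throwaway environment in a temporary directory and checks for the `pyvenv.cfg` marker file that every venv contains at its root:

```python
import tempfile
import venv
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    env_dir = Path(tmp) / ".venv"
    # with_pip=False keeps creation fast here; `python -m venv`
    # installs pip into the new environment by default.
    venv.create(env_dir, with_pip=False)
    # pyvenv.cfg records which base interpreter the venv came from.
    print((env_dir / "pyvenv.cfg").exists())  # True
```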
For current Python work in 2025/2026, uv is a strong default recommendation once you already understand the standard model.
Why:
- very fast dependency resolution and installation
- cleaner project workflow
- built-in support for virtual environments, running commands, and dependency management
- works well with modern `pyproject.toml`-based projects
Typical uv workflow inside a project directory:
```
uv venv
uv add requests
uv run python script.py
uv add --dev pytest
```

uv still relies on the same underlying Python packaging concepts. It does not remove the need to understand environments or dependencies. It gives you a better workflow on top of them.
Useful extra command:
```
uv sync
```

That recreates the environment from the project's declared dependencies.
Think about the course in two layers:
| What you are learning | Standard Python model | Recommended day-to-day workflow |
|---|---|---|
| Create an environment | `python -m venv .venv` | `uv venv` |
| Install a runtime dependency | `python -m pip install requests` | `uv add requests` |
| Install a dev dependency | `python -m pip install pytest` | `uv add --dev pytest` |
| Run a script | activate env, then `python script.py` | `uv run python script.py` |
| Recreate dependencies | `python -m pip install -r requirements.txt` | `uv sync` |
The concepts are the same in both columns. The difference is workflow quality.
Older Python projects often use requirements.txt:
```
requests==2.33.1
pytest==9.0.2
```

This is still common and still useful, especially in simpler projects.
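Each line of a pinned `requirements.txt` like the one above is just `name==version`. Here is a deliberately simplified, hypothetical parser sketch; real requirement syntax also allows extras, version ranges, environment markers, and comments, which this ignores:

```python
def parse_pinned(line: str) -> tuple[str, str]:
    """Split a 'name==version' requirement line into its two parts."""
    name, _, version = line.strip().partition("==")
    return name, version

print(parse_pinned("requests==2.33.1"))  # ('requests', '2.33.1')
```

For anything beyond a teaching sketch, use a real parser rather than string splitting.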
Modern Python projects increasingly use pyproject.toml as the main project configuration file. It can define:
- project metadata
- runtime dependencies
- development dependencies
- build system settings
- tool configuration
Example:
```toml
[project]
name = "example-project"
version = "0.1.0"
requires-python = ">=3.14"
dependencies = [
    "requests",
]

[dependency-groups]
dev = [
    "pytest",
    "ruff",
]
```

This is a more structured and modern approach than scattering configuration across many separate files.
Not every package is needed when your program runs in production.
Typical split:
- runtime dependencies: packages your application actually needs to run, such as `requests` or `flask`
- development dependencies: packages used only for development, testing, linting, formatting, or type checking, such as `pytest` or `ruff`
This distinction matters because it keeps production environments smaller and clearer.
Use one whenever:
- you install third-party packages for a project
- you are following a tutorial that uses external libraries
- you are building a script, project, or API that should be reproducible
- you want clean separation between projects
You can skip it temporarily when:
- you are only running tiny single-file exercises using the Python standard library
- you are still on the very first modules and not installing anything
But once packages enter the picture, a virtual environment should become the default habit.
Usually commit:
- `pyproject.toml`
- `requirements.txt` if your project uses it
- lock files if your workflow expects them
- your application code and tests
Usually do not commit:
- `.venv/`
- installed packages
- caches and machine-specific environment folders
Your environment should be recreated, not copied.
Think of Python project setup like this:
- Python gives you an interpreter.
- A virtual environment gives one project its own isolated Python space.
- A package manager installs dependencies into that space.
- A project file such as `pyproject.toml` records what that project depends on.
If you understand that model, the individual tools become much easier to learn.
For this course:
- early modules: focus on Python itself
- when you first learn packaging: use `venv` + `python -m pip` so you understand the model
- once you start installing packages and running projects regularly: use `uv` as the normal workflow
- keep the underlying model in your head even when the commands change
The key is understanding why the environment exists and keeping project dependencies isolated. uv is the recommended convenience layer, not a replacement for understanding.