
Python Environments and Packages

This guide explains one of the most important pieces of practical Python knowledge: how Python packages are installed, where they go, why virtual environments matter, and why this course teaches the standard model first before recommending uv for ongoing work.

It is not enough to memorize commands. You should understand the model underneath them.


What Is a Package?

In everyday Python usage, people often use "package" to mean two slightly different things:

  • an import package: code you import in Python, such as requests, pytest, or flask
  • a distribution package: something installed into an environment from a package index such as PyPI

These are related, but not identical.

For example:

import requests

You import requests as Python code. But to make that possible, you first install a distribution package into an environment.
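This distinction matters in practice because the two names can differ: for example, you run pip install beautifulsoup4 (the distribution package) but write import bs4 (the import package). A small stdlib-only sketch of how Python resolves import names:

```python
import importlib.util

# importlib looks up an *import package* by the name you would write in an
# `import` statement. For third-party code that name can differ from the
# *distribution package* you install (e.g. install beautifulsoup4, import bs4).
print(importlib.util.find_spec("json") is not None)         # stdlib module: True
print(importlib.util.find_spec("no_such_package") is None)  # not installed: True
```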

What Is PyPI?

PyPI stands for the Python Package Index. It is the main public repository of Python packages.

When you run a command like:

python -m pip install requests

or:

uv add requests

your tooling downloads package metadata and files from package indexes, usually PyPI unless you configure something else.

Where Do Installed Packages Actually Go?

Installed third-party packages usually end up inside an environment's site-packages directory.

That means:

  • if you install globally, packages go into your system or user Python installation
  • if you install inside a virtual environment, packages go into that virtual environment instead

That is why the same computer can have different projects using different versions of the same package without conflict.
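You can see this directory for whichever interpreter is currently running; a quick stdlib check:

```python
import sysconfig

# "purelib" is the directory where pure-Python third-party packages land.
# Inside an activated virtual environment this path points into .venv/;
# outside one it points at the system or user installation.
print(sysconfig.get_paths()["purelib"])
```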

Why Virtual Environments Matter

A virtual environment is an isolated Python installation for one project.

It solves real problems:

  • Project A can use one package version while Project B uses another.
  • You avoid polluting your system Python with project-specific dependencies.
  • You can recreate a project environment from dependency files instead of guessing what was installed.
  • It becomes much easier to debug "it works on my machine" problems.

Without a virtual environment, installing packages into your base Python can eventually create version conflicts and confusion.
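You can also check from inside Python whether the current interpreter belongs to a virtual environment:

```python
import sys

# In a virtual environment, sys.prefix points at the venv itself, while
# sys.base_prefix points at the interpreter the venv was created from.
# Outside any venv, the two are equal.
in_venv = sys.prefix != sys.base_prefix
print(f"running inside a virtual environment: {in_venv}")
```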

Standard Foundation: venv and pip

The standard Python model is still important to learn first.

Create a virtual environment:

python -m venv .venv

Activate it on Windows:

.venv\Scripts\activate

Activate it on macOS/Linux:

source .venv/bin/activate

Install a package with pip:

python -m pip install requests

Check what is installed:

python -m pip list

Save dependencies:

python -m pip freeze > requirements.txt

Recreate them later with:

python -m pip install -r requirements.txt

This model remains foundational because many tools, tutorials, and deployment workflows still build on it. You should understand it even if you later prefer a smoother tool.

Course Recommendation: Learn the Model Once, Then Use uv

For current Python work in 2025/2026, uv is a strong default recommendation once you already understand the standard model.

Why:

  • very fast dependency resolution and installation
  • cleaner project workflow
  • built-in support for virtual environments, running commands, and dependency management
  • works well with modern pyproject.toml-based projects

Typical uv workflow inside a project directory:

uv venv
uv add requests
uv run python script.py
uv add --dev pytest

uv still relies on the same underlying Python packaging concepts. It does not remove the need to understand environments or dependencies. It gives you a better workflow on top of them.

Useful extra command:

uv sync

That recreates the environment from the project's declared dependencies.

Standard Model vs Recommended Workflow

Think about the course in two layers:

| What you are learning | Standard Python model | Recommended day-to-day workflow |
| --- | --- | --- |
| Create an environment | python -m venv .venv | uv venv |
| Install a runtime dependency | python -m pip install requests | uv add requests |
| Install a dev dependency | python -m pip install pytest | uv add --dev pytest |
| Run a script | activate env, then python script.py | uv run python script.py |
| Recreate dependencies | python -m pip install -r requirements.txt | uv sync |

The concepts are the same in both columns. The difference is workflow quality.

requirements.txt vs pyproject.toml

Older Python projects often use requirements.txt:

requests==2.33.1
pytest==9.0.2

This is still common and still useful, especially in simpler projects.

Modern Python projects increasingly use pyproject.toml as the main project configuration file. It can define:

  • project metadata
  • runtime dependencies
  • development dependencies
  • build system settings
  • tool configuration

Example:

[project]
name = "example-project"
version = "0.1.0"
requires-python = ">=3.14"
dependencies = [
    "requests",
]

[dependency-groups]
dev = [
    "pytest",
    "ruff",
]

This is a more structured and modern approach than scattering configuration across many separate files.

Runtime Dependencies vs Development Dependencies

Not every package is needed when your program runs in production.

Typical split:

  • runtime dependencies: packages your application actually needs to run, such as requests or flask
  • development dependencies: packages used only for development, testing, linting, formatting, or type checking, such as pytest or ruff

This distinction matters because it keeps production environments smaller and clearer.

When Should You Create a Virtual Environment?

Use one whenever:

  • you install third-party packages for a project
  • you are following a tutorial that uses external libraries
  • you are building a script, project, or API that should be reproducible
  • you want clean separation between projects

You can skip it temporarily when:

  • you are only running tiny single-file exercises using the Python standard library
  • you are still on the very first modules and not installing anything

But once packages enter the picture, a virtual environment should become the default habit.

What Should Be Committed to Git?

Usually commit:

  • pyproject.toml
  • requirements.txt if your project uses it
  • lock files if your workflow expects them
  • your application code and tests

Usually do not commit:

  • .venv/
  • installed packages
  • caches and machine-specific environment folders

Your environment should be recreated, not copied.
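A minimal .gitignore sketch for a typical Python project (the entries beyond .venv/ are common conventions, not requirements):

```
# virtual environment (recreate it, don't commit it)
.venv/

# Python bytecode caches
__pycache__/
*.pyc

# tool caches
.pytest_cache/
.ruff_cache/
```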

A Simple Mental Model

Think of Python project setup like this:

  1. Python gives you an interpreter.
  2. A virtual environment gives one project its own isolated Python space.
  3. A package manager installs dependencies into that space.
  4. A project file such as pyproject.toml records what that project depends on.

If you understand that model, the individual tools become much easier to learn.

Recommended Course Habit

For this course:

  • early modules: focus on Python itself
  • when you first learn packaging: use venv + python -m pip so you understand the model
  • once you start installing packages and running projects regularly: use uv as the normal workflow
  • keep the underlying model in your head even when the commands change

The key is understanding why the environment exists and keeping project dependencies isolated. uv is the recommended convenience layer, not a replacement for understanding.

Further Reading