diohabara/pychd

PyChD


LLM-powered Python bytecode decompiler. Uses litellm to support OpenAI, Anthropic, Google, Ollama, and other providers. Handles .pyc files from any Python version via xdis.

Prerequisites

- Docker (all development tasks run inside a container)
- just (the task runner used for the commands below)

Quick start

just build          # build the Docker image
just lint           # run linters (ruff, pyrefly)
just test           # run tests
just ci             # lint + test
just shell          # interactive shell inside the container

Install (pip)

pip install pychd

Usage

Compile

pychd compile <directory | file>
just compile example/python/01_example_variables.py
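Under the hood, compiling a file or directory to bytecode only needs the standard library. A minimal sketch of what such a command can do (illustrative, not PyChD's actual implementation; `compile_path` is a hypothetical helper name):

```python
import compileall
import py_compile
from pathlib import Path

def compile_path(target: str) -> list[Path]:
    """Compile a single .py file, or every .py file under a directory."""
    path = Path(target)
    if path.is_dir():
        # Walks the tree and writes .pyc files into __pycache__/ dirs
        compileall.compile_dir(str(path), quiet=1)
        return sorted(path.rglob("__pycache__/*.pyc"))
    py_compile.compile(str(path), doraise=True)
    return sorted(path.parent.glob("__pycache__/*.pyc"))
```

The compiled files land in `__pycache__/` next to each source file, tagged with the interpreter version (e.g. `01_example_variables.cpython-314.pyc`).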

Decompile

pychd decompile <pyc-file> [-m MODEL] [-o OUTPUT]
# Default model (ollama/deepseek-r1)
just decompile example/__pycache__/01_example_variables.cpython-314.pyc

# Specify a model
just decompile example/__pycache__/01_example_variables.cpython-314.pyc gpt-4o
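Conceptually, decompilation disassembles the bytecode and asks a model to reconstruct the source. A hedged sketch of that pipeline (function names are illustrative; note that `marshal` only reads `.pyc` files from the running interpreter, which is why PyChD uses xdis for cross-version support):

```python
import dis
import io
import marshal

PYC_HEADER_SIZE = 16  # magic, flags, source timestamp/hash, size (Python 3.7+)

def disassemble_pyc(pyc_path: str) -> str:
    """Disassemble a .pyc produced by the *running* interpreter."""
    with open(pyc_path, "rb") as f:
        f.seek(PYC_HEADER_SIZE)   # skip the .pyc header
        code = marshal.load(f)    # the module's top-level code object
    buf = io.StringIO()
    dis.dis(code, file=buf)       # recurses into nested code objects
    return buf.getvalue()

def decompile(pyc_path: str, model: str = "ollama/deepseek-r1") -> str:
    """Ask any litellm-supported model to reconstruct the source."""
    import litellm  # imported lazily; pip install litellm
    prompt = ("Reconstruct the original Python source from this CPython "
              "disassembly. Reply with code only.\n\n"
              + disassemble_pyc(pyc_path))
    resp = litellm.completion(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
```

Because litellm exposes one `completion()` call for every provider, swapping models is just a string change.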

The -m flag accepts any model identifier supported by litellm, for example:

Provider         Example
OpenAI           gpt-4o
Anthropic        claude-sonnet-4-20250514
Google           gemini/gemini-2.0-flash
Ollama (local)   ollama/deepseek-r1, ollama/llama3

Large disassemblies are automatically split into token-safe chunks when they exceed the model's context window.
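The chunking step can be sketched as greedy line packing under a token budget. This version uses the common ~4-characters-per-token heuristic; PyChD's actual splitter may count tokens with the model's own tokenizer instead:

```python
def split_into_chunks(disassembly: str, max_tokens: int,
                      chars_per_token: int = 4) -> list[str]:
    """Pack whole lines into chunks that stay under a rough token budget."""
    budget = max_tokens * chars_per_token  # budget expressed in characters
    chunks: list[str] = []
    current: list[str] = []
    size = 0
    for line in disassembly.splitlines(keepends=True):
        # Start a new chunk when adding this line would blow the budget
        if size + len(line) > budget and current:
            chunks.append("".join(current))
            current, size = [], 0
        current.append(line)
        size += len(line)
    if current:
        chunks.append("".join(current))
    return chunks
```

Splitting on line boundaries keeps each bytecode instruction intact, so every chunk remains valid disassembly the model can reason about.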

Validate

Compare original source against decompiled output using AST comparison:

pychd validate <original> <decompiled> [-v]
just validate example/python/ example/decompiled/
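AST comparison means the original and decompiled sources are parsed and their trees compared, so formatting, comments, and quoting differences are ignored. A minimal sketch of the idea (not PyChD's exact implementation):

```python
import ast

def ast_equivalent(original_src: str, decompiled_src: str) -> bool:
    """Compare two Python sources structurally rather than textually.

    ast.dump() canonicalizes the parse tree, so whitespace and comment
    differences vanish; renamed variables still count as mismatches.
    """
    try:
        a = ast.parse(original_src)
        b = ast.parse(decompiled_src)
    except SyntaxError:
        # Unparseable decompiler output can never validate
        return False
    return ast.dump(a) == ast.dump(b)
```

This is a stricter check than running both programs, but a weaker one than byte-for-byte equality, which is rarely achievable for LLM output.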

Development

All development tasks run inside Docker via just. No local Python installation is required.

just fix            # auto-fix lint issues
just test           # run pytest
just shell          # drop into the container

Examples

Example Python source files are in example/python/. Pre-generated decompiled output is in example/decompiled/.
