fasterai-mojo

Mixed Python/Mojo project for experimenting with Mojo kernels in a fast.ai / PyTorch training loop, roughly following the "faster-ai-mojo" specification.

Environment

This project uses pixi to manage both Python and Mojo (via the Modular "max-nightly" channel).

  1. Install pixi.

  2. From this directory, create the environment:

    pixi install

  3. Open a shell in the environment:

    pixi shell

  4. (Optional) Check Mojo is available:

    pixi run mojo-version
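
For reference, the pixi configuration looks roughly like the sketch below. The channel URL, package names, platforms, and version pins are illustrative assumptions; the pixi.toml checked into the repository is authoritative. The task names match the pixi run commands used throughout this README.

    # Illustrative sketch only -- check the repo's actual pixi.toml.
    [project]
    name = "fasterai-mojo"
    channels = ["conda-forge", "https://conda.modular.com/max-nightly"]  # assumed channel URL
    platforms = ["osx-arm64", "linux-64"]

    [dependencies]
    max = "*"            # assumed package providing the Mojo toolchain
    python = ">=3.11"
    pytorch = "*"
    fastai = "*"

    [tasks]
    mojo-version = "mojo --version"
    gradient-check = "python python/experiments/gradient_check.py"
    performance = "python python/experiments/performance_compare.py"
    train = "python python/experiments/train_compare.py"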

If you have built an operations.mojopkg containing the Mojo dense kernels exported for Python, make sure it can be imported as operations inside the pixi environment (for example by installing it or adding its location to PYTHONPATH).
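
As a quick sanity check, something along these lines verifies that the package is visible from Python; the sys.path entry is a placeholder for wherever your operations.mojopkg actually lives.

    # Minimal import check; the "mojo" path below is an assumption, standing in
    # for wherever operations.mojopkg is located if it is not installed.
    import sys
    sys.path.insert(0, "mojo")

    try:
        import operations
        print("Mojo dense kernels available")
    except ImportError:
        print("operations not importable; the MLP will fall back to pure PyTorch")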

Layout

  • mojo/dense.mojo – Mojo implementation of dense forward/backward kernels.
  • python/simple_mlp.py – Baseline PyTorch MLP.
  • python/mojo_mlp.py – MLP that is intended to call Mojo kernels via MojoDense.
  • python/data.py – fast.ai MNIST DataLoaders helper.
  • python/experiments/gradient_check.py – Autograd gradient check.
  • python/experiments/performance_compare.py – torch.profiler performance comparison.
  • python/experiments/train_compare.py – Full training + accuracy comparison.

Running experiments

All commands below assume you are inside a pixi shell.

  • Gradient check:

    pixi run gradient-check

  • Performance comparison (emits mojo_trace.json and simple_trace.json; a profiling sketch follows this list):

    pixi run performance

  • Training + accuracy comparison on MNIST:

    pixi run train
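
The trace files are standard Chrome-trace JSON. A minimal sketch of how they could be produced with torch.profiler is below; the actual script may structure this differently, and the model/input names are placeholders.

    # Sketch: profile one forward pass and dump a Chrome trace.
    # `model` and `batch` are placeholders for the MLP under test and an MNIST batch.
    from torch.profiler import profile, ProfilerActivity

    def trace_forward(model, batch, trace_path):
        with profile(activities=[ProfilerActivity.CPU], record_shapes=True) as prof:
            model(batch)
        prof.export_chrome_trace(trace_path)  # open in chrome://tracing or Perfetto

    # trace_forward(simple_mlp, batch, "simple_trace.json")
    # trace_forward(mojo_mlp, batch, "mojo_trace.json")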

At present, the Python side falls back to pure PyTorch math if the Mojo operations module is not available. To connect the real Mojo kernels, wire up operations.dense_forward and operations.dense_backward inside python/mojo_mlp.py using the appropriate MAX tensor bridge API for your installation.
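
One plausible shape for that wiring is a custom torch.autograd.Function, sketched below. The dense_forward/dense_backward signatures and return values are assumptions, and the tensor conversion through the MAX bridge is elided; treat this as a starting point, not the repository's actual implementation.

    # Sketch of wiring the Mojo kernels into autograd, assuming
    #   operations.dense_forward(x, w, b) -> y
    #   operations.dense_backward(grad_y, x, w) -> (grad_x, grad_w, grad_b)
    # Tensor <-> Mojo conversion via the MAX bridge is omitted here.
    import torch

    try:
        import operations
        HAS_MOJO = True
    except ImportError:
        HAS_MOJO = False

    class MojoDenseFn(torch.autograd.Function):
        @staticmethod
        def forward(ctx, x, weight, bias):
            ctx.save_for_backward(x, weight)
            if HAS_MOJO:
                return operations.dense_forward(x, weight, bias)
            return x @ weight.t() + bias  # pure-PyTorch fallback

        @staticmethod
        def backward(ctx, grad_out):
            x, weight = ctx.saved_tensors
            if HAS_MOJO:
                return operations.dense_backward(grad_out, x, weight)
            grad_x = grad_out @ weight
            grad_w = grad_out.t() @ x
            grad_b = grad_out.sum(dim=0)
            return grad_x, grad_w, grad_b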
