
Model Hosting

This repository does not contain the model weight files.

The 2-bit MLX quantized weights for GPT-OSS-120B are hosted on Hugging Face and can be accessed here:

https://huggingface.co/Open4bits/gpt-oss-120b-mlx-2Bit

Model size: ~61 GB. Please ensure sufficient storage and bandwidth before downloading.
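
The weights can typically be used directly from Hugging Face with the mlx-lm package. The snippet below is a minimal sketch, assuming mlx-lm is installed (pip install mlx-lm) and that your installed version supports this 2-bit quantization format; check the model card on Hugging Face for the authoritative instructions.

```python
# Minimal sketch: load the 2-bit MLX weights and run a short generation.
# Assumes the mlx-lm package is installed and compatible with this quantization.
from mlx_lm import load, generate

# First call downloads the ~61 GB of weights from Hugging Face into the local cache.
model, tokenizer = load("Open4bits/gpt-oss-120b-mlx-2Bit")

prompt = "Explain 2-bit quantization in one paragraph."
text = generate(model, tokenizer, prompt=prompt, max_tokens=256)
print(text)
```

Alternatively, the files can be fetched ahead of time with the Hugging Face CLI (huggingface-cli download Open4bits/gpt-oss-120b-mlx-2Bit) and loaded from the local cache afterwards.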
