This repository does not contain the model weight files.
The 2-bit MLX quantized weights for DeepSeek-R1 are hosted on Hugging Face and can be accessed here:
https://huggingface.co/Open4bits/DeepSeek-R1-mlx-2Bit
Model size: ~210 GB.
Please ensure you have sufficient storage space and a stable, high-bandwidth connection before downloading.
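
One way to fetch the full weight snapshot is with the `huggingface_hub` Python package. The sketch below is a minimal example, assuming `huggingface_hub` is installed (`pip install huggingface_hub`); the `local_dir` value is illustrative and can be changed to any path with enough free space.

```python
from huggingface_hub import snapshot_download

# Download all files from the Hugging Face repo into a local directory.
# Interrupted downloads can be re-run; already-downloaded files are reused.
local_path = snapshot_download(
    repo_id="Open4bits/DeepSeek-R1-mlx-2Bit",
    local_dir="DeepSeek-R1-mlx-2Bit",  # illustrative target directory (~210 GB required)
)

print(f"Weights available at: {local_path}")
```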