Support mx post-training quantization for emerging models#306

Open
jiaeenie wants to merge 6 commits into DeepWok:main from jiaeenie:jiaeenie/mx-quant-ptq
Conversation

@jiaeenie jiaeenie commented Feb 10, 2026

TODOs:

  • Quantizers: Add MXFP, MXINT, and Minifloat fake quantizers
  • GPTQ: Integrate GPTQ into the quantization pass
  • QuaRot: Integrate QuaRot into the quantization pass
  • Quantized Models: Add MX quantized modules (Llama only, for now)
  • Documentation: Add documentation for the new quantizers and passes
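To make the first TODO concrete, below is a minimal sketch of what an MXINT-style fake quantizer could look like: each block of elements shares a single power-of-two scale, mantissas are rounded to a signed integer grid, and the result is immediately dequantized so it can be dropped into a PTQ pass. The function name, signature, and block size are illustrative assumptions, not the actual API proposed in this PR.

```python
import torch


def mxint_fake_quant(x: torch.Tensor, block_size: int = 32, bits: int = 8) -> torch.Tensor:
    """Hypothetical MXINT fake quantizer (quantize-then-dequantize).

    Each contiguous block of `block_size` elements shares one power-of-two
    scale derived from the block's max magnitude; mantissas are rounded to a
    signed `bits`-bit integer grid. Names and defaults are illustrative.
    """
    orig_shape = x.shape
    n = x.numel()
    flat = x.reshape(-1)

    # Pad so the tensor splits evenly into blocks.
    pad = (-n) % block_size
    if pad:
        flat = torch.cat([flat, flat.new_zeros(pad)])
    blocks = flat.reshape(-1, block_size)

    qmax = 2 ** (bits - 1) - 1  # e.g. 127 for 8-bit

    # Shared exponent per block: smallest power-of-two scale such that the
    # block's max magnitude fits inside the integer range.
    max_abs = blocks.abs().amax(dim=1, keepdim=True).clamp(min=1e-12)
    exponent = torch.ceil(torch.log2(max_abs / qmax))
    scale = torch.pow(torch.tensor(2.0, dtype=x.dtype), exponent)

    # Round to the integer grid, then dequantize (fake quantization).
    q = torch.clamp(torch.round(blocks / scale), -qmax - 1, qmax)
    deq = q * scale

    return deq.reshape(-1)[:n].reshape(orig_shape)
```

Because the scale is constrained to a power of two, the per-element rounding error is bounded by half a quantization step, i.e. at most `max_abs / qmax` within each block, which is the behavior a PTQ calibration loop would rely on.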
