Mixture of Quantized Experts (MoQE): Complementary Effect of Low-bit Quantization and Robustness.

Young Jin Kim, Raffy Fahim, Hany Hassan Awadalla
Published in: CoRR (2023)
Keyphrases
  • uniform quantization
  • adaptive quantization
  • image segmentation
  • mixture model
  • human experts
  • quantization error
  • transform coefficients
  • vocabulary tree
  • successive approximation