Reading list for quantization (mostly post-training and low-bit quantization of Vision Transformers)
- (NeurIPS 2021) Post-Training Quantization for Vision Transformer
- (IJCAI 2022) FQ-ViT: Post-Training Quantization for Fully Quantized Vision Transformer
- (ECCV 2022) PTQ4ViT: Post-Training Quantization for Vision Transformers with Twin Uniform Quantization
- (ACM MM 2022) Towards Accurate Post-Training Quantization for Vision Transformer
- (ICASSP 2023) TSPTQ-ViT: Two-Scaled Post-Training Quantization for Vision Transformer
- (ICML 2020) Up or Down? Adaptive Rounding for Post-Training Quantization
- (ICLR 2021) BRECQ: Pushing the Limit of Post-Training Quantization by Block Reconstruction
- (NeurIPS 2022) Q-ViT: Accurate and Fully Quantized Low-bit Vision Transformer
- Q-ViT: Fully Differentiable Quantization for Vision Transformer
- (TPAMI 2023) Quantformer: Learning Extremely Low-Precision Vision Transformers
- (AAAI 2023) Quantized Feature Distillation for Network Quantization
- (ICCV 2023) I-ViT: Integer-only Quantization for Efficient Vision Transformer Inference
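
The papers above refine different parts of the same basic primitive: mapping floating-point weights or activations to low-bit integers with a scale (and optionally a zero-point), then calibrating or learning that mapping. As a reference point only, here is a minimal sketch of per-tensor asymmetric uniform quantization in NumPy; the function names and the simple min/max calibration are illustrative assumptions, not the method of any listed paper (AdaRound, BRECQ, PTQ4ViT, etc. improve on the rounding, scale search, or reconstruction steps on top of this).

```python
import numpy as np

def uniform_quantize(x, num_bits=8):
    """Per-tensor asymmetric uniform quantization (sketch).

    Uses naive min/max calibration; real PTQ methods in the list above
    replace this with learned rounding, scale search, or block-wise
    reconstruction. Function name is illustrative, not from any paper.
    """
    qmin, qmax = 0, 2 ** num_bits - 1
    x_min, x_max = float(x.min()), float(x.max())
    scale = max(x_max - x_min, 1e-8) / (qmax - qmin)
    zero_point = int(np.clip(round(qmin - x_min / scale), qmin, qmax))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.uint8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Map quantized integers back to approximate float values."""
    return (q.astype(np.float32) - zero_point) * scale

if __name__ == "__main__":
    # Stand-in for a ViT weight block; any float tensor works.
    x = np.random.randn(4, 8).astype(np.float32)
    q, s, z = uniform_quantize(x, num_bits=8)
    x_hat = dequantize(q, s, z)
    print("max abs quantization error:", np.abs(x - x_hat).max())
```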