Rethinking Post-Training Quantization: Introducing a Statistical Pre-Calibration Approach
Alireza Ghaffari, Sharareh Younesian, Boxing Chen, Vahid Partovi Nia, Masoud Asgharian
2025
Abstract
As Large Language Models (LLMs) become increasingly computationally complex, developing efficient deployment strategies, such as quantization, becomes crucial. State-of-the-art Post-training Quantization (PTQ) techniques often rely on calibration processes to maintain the accuracy of these models. However, while these calibration techniques can enhance performance in certain domains, they may not be as effective in others. This paper aims to draw attention to robust statistical approaches that can mitigate such issues. We propose a weight-adaptive PTQ method that can be considered a precursor to calibration-based PTQ methods, guiding the quantization process to preserve the distribution of weights by minimizing the Kullback-Leibler divergence between the quantized weights and the originally trained weights. This minimization ensures that the quantized model retains the Shannon information content of the original model to a great extent, guaranteeing robust and efficient deployment across many tasks. As such, our proposed approach can perform on par with most common calibration-based PTQ methods, establishing a new pre-calibration step for further adjusting the quantized weights with calibration. We show that our pre-calibration results achieve the same accuracy as some existing calibration-based PTQ methods on various LLMs.
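The core idea of the abstract, quantizing weights so that the quantized distribution stays close (in KL divergence) to the original weight distribution, can be illustrated with a small sketch. This is not the paper's algorithm; it is a minimal, assumed formulation in which a symmetric uniform quantizer's scale is chosen over a candidate grid to minimize the KL divergence between histograms of the original and dequantized weights. All function names and parameters here are hypothetical.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-10):
    # Shannon KL divergence between two discrete distributions
    # (eps-smoothed and renormalized to avoid log(0)).
    p = (p + eps) / (p + eps).sum()
    q = (q + eps) / (q + eps).sum()
    return float(np.sum(p * np.log(p / q)))

def fake_quantize(w, scale, n_bits=4):
    # Symmetric uniform quantization to n_bits, then dequantize.
    qmax = 2 ** (n_bits - 1) - 1
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return q * scale

def kl_calibrated_scale(w, n_bits=4, n_grid=50, n_hist=64):
    # Search a grid of clipping fractions and keep the scale whose
    # dequantized weights best preserve the original distribution.
    max_abs = np.abs(w).max()
    qmax = 2 ** (n_bits - 1) - 1
    bins = np.linspace(-max_abs, max_abs, n_hist + 1)
    p_hist, _ = np.histogram(w, bins=bins)
    best_scale, best_kl = None, np.inf
    for frac in np.linspace(0.3, 1.0, n_grid):
        scale = frac * max_abs / qmax
        q_hist, _ = np.histogram(fake_quantize(w, scale, n_bits), bins=bins)
        kl = kl_divergence(p_hist.astype(float), q_hist.astype(float))
        if kl < best_kl:
            best_kl, best_scale = kl, scale
    return best_scale, best_kl

# Toy "trained" weights standing in for one LLM weight tensor.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.02, size=4096)
scale, kl = kl_calibrated_scale(w)
```

Unlike calibration-based PTQ, no activation data is needed here: the objective depends only on the weights themselves, which is what makes this kind of criterion usable as a pre-calibration step.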
Paper Citation
in Harvard Style
Ghaffari A., Younesian S., Chen B., Partovi Nia V. and Asgharian M. (2025). Rethinking Post-Training Quantization: Introducing a Statistical Pre-Calibration Approach. In Proceedings of the 14th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM; ISBN 978-989-758-730-6, SciTePress, pages 159-169. DOI: 10.5220/0013348800003905
in Bibtex Style
@conference{icpram25,
author={Alireza Ghaffari and Sharareh Younesian and Boxing Chen and Vahid Partovi Nia and Masoud Asgharian},
title={Rethinking Post-Training Quantization: Introducing a Statistical Pre-Calibration Approach},
booktitle={Proceedings of the 14th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2025},
pages={159-169},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0013348800003905},
isbn={978-989-758-730-6},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 14th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - Rethinking Post-Training Quantization: Introducing a Statistical Pre-Calibration Approach
SN - 978-989-758-730-6
AU - Ghaffari A.
AU - Younesian S.
AU - Chen B.
AU - Partovi Nia V.
AU - Asgharian M.
PY - 2025
SP - 159
EP - 169
DO - 10.5220/0013348800003905
PB - SciTePress