
On the Convergence of Stochastic Gradient Descent in Low-Precision Number Formats

Authors: Matteo Cacciola 1,2; Antonio Frangioni 3; Masoud Asgharian 4; Alireza Ghaffari 1 and Vahid Partovi Nia 1

Affiliations: 1 Huawei Noah’s Ark Lab, Montreal Research Centre, 7101 Park Avenue, Montreal, Quebec H3N 1X9, Canada ; 2 Polytechnique Montreal, 2900 Edouard Montpetit Blvd, Montreal, Quebec H3T 1J4, Canada ; 3 Dipartimento di Informatica, Università di Pisa, Largo B. Pontecorvo 3, Pisa, 56127, Italy ; 4 Department of Mathematics and Statistics, McGill University, 805 Sherbrooke Street West, Montreal, H3A 0B9, Quebec, Canada

Keyword(s): Convergence Analysis, Floating-Point Arithmetic, Low-Precision Number Format, Optimization, Quasi-Convex Function, Stochastic Gradient Descent.

Abstract: Deep learning models are dominating almost all artificial intelligence tasks such as vision, text, and speech processing. Stochastic Gradient Descent (SGD) is the main tool for training such models, where the computations are usually performed in single-precision floating-point number format. The convergence of single-precision SGD normally aligns with the theoretical results for real numbers, since single-precision computations exhibit negligible rounding error. However, the numerical error grows when the computations are performed in low-precision number formats. This provides compelling reasons to study SGD convergence adapted for low-precision computations. We present both deterministic and stochastic analyses of the SGD algorithm, obtaining bounds that show the effect of the number format. Such bounds can provide guidelines as to how SGD convergence is affected when constraints make high-precision computations infeasible.
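The phenomenon the abstract describes can be observed with a toy experiment. The sketch below (an illustration of the general idea, not the paper's analysis or experimental setup) runs gradient descent on f(x) = 0.5·x² while rounding every intermediate value to a low-precision format (IEEE 754 half precision via NumPy's `float16`), and compares the result against the same loop in `float64`:

```python
import numpy as np

def gd_in_precision(x0, lr, steps, dtype):
    """Gradient descent on f(x) = 0.5 * x**2, with every intermediate
    value rounded to `dtype` to mimic low-precision hardware."""
    x = dtype(x0)
    lr = dtype(lr)
    for _ in range(steps):
        grad = dtype(x)           # gradient of 0.5 * x**2 is x
        x = dtype(x - lr * grad)  # update, rounded to the chosen format
    return float(x)

# Hypothetical settings chosen only for illustration.
x16 = gd_in_precision(1.0, 0.1, 200, np.float16)
x64 = gd_in_precision(1.0, 0.1, 200, np.float64)

# float64 drives the iterate essentially to the minimizer x* = 0, while
# float16 typically stops improving once the update lr * x drops below
# the format's resolution near the iterate, leaving a larger residual.
print(x16, x64)
```

The double-precision run behaves like the idealized real-number analysis, whereas the half-precision run stalls at a nonzero distance from the optimum, which is the kind of format-dependent gap the paper's bounds quantify.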

CC BY-NC-ND 4.0


Paper citation in several formats:
Cacciola, M.; Frangioni, A.; Asgharian, M.; Ghaffari, A. and Partovi Nia, V. (2023). On the Convergence of Stochastic Gradient Descent in Low-Precision Number Formats. In Proceedings of the 12th International Conference on Pattern Recognition Applications and Methods - ICPRAM; ISBN 978-989-758-626-2; ISSN 2184-4313, SciTePress, pages 542-549. DOI: 10.5220/0011795500003411

@conference{icpram23,
author={Matteo Cacciola and Antonio Frangioni and Masoud Asgharian and Alireza Ghaffari and Vahid {Partovi Nia}},
title={On the Convergence of Stochastic Gradient Descent in Low-Precision Number Formats},
booktitle={Proceedings of the 12th International Conference on Pattern Recognition Applications and Methods - ICPRAM},
year={2023},
pages={542--549},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0011795500003411},
isbn={978-989-758-626-2},
issn={2184-4313},
}

TY - CONF

JO - Proceedings of the 12th International Conference on Pattern Recognition Applications and Methods - ICPRAM
TI - On the Convergence of Stochastic Gradient Descent in Low-Precision Number Formats
SN - 978-989-758-626-2
IS - 2184-4313
AU - Cacciola, M.
AU - Frangioni, A.
AU - Asgharian, M.
AU - Ghaffari, A.
AU - Partovi Nia, V.
PY - 2023
SP - 542
EP - 549
DO - 10.5220/0011795500003411
PB - SciTePress