dc.contributor.author: Osorio Ríos, John Haiber
dc.contributor.author: Armejach Sanosa, Adrià
dc.contributor.author: Petit, Eric
dc.contributor.author: Henry, Greg
dc.contributor.author: Casas Guix, Marc
dc.contributor.other: Universitat Politècnica de Catalunya. Doctorat en Arquitectura de Computadors
dc.contributor.other: Universitat Politècnica de Catalunya. Departament d'Arquitectura de Computadors
dc.contributor.other: Barcelona Supercomputing Center
dc.date.accessioned: 2022-09-29T08:12:58Z
dc.date.available: 2022-09-29T08:12:58Z
dc.date.issued: 2022-07-01
dc.identifier.citation: Osorio, J. [et al.]. A BF16 FMA is all you need for DNN training. "IEEE transactions on emerging topics in computing", 1 July 2022, vol. 10, no. 3, pp. 1302-1314.
dc.identifier.issn: 2168-6750
dc.identifier.uri: http://hdl.handle.net/2117/373614
dc.description.abstract: Fused Multiply-Add (FMA) functional units constitute a fundamental hardware component to train Deep Neural Networks (DNNs). Their silicon area grows quadratically with the mantissa bit count of the computer number format, which has motivated the adoption of the BrainFloat16 format (BF16). BF16 features 1 sign, 8 exponent and 7 explicit mantissa bits. Some approaches to train DNNs achieve significant performance benefits by using the BF16 format. However, these approaches must combine BF16 with the standard IEEE 754 Floating-Point 32-bit (FP32) format to achieve state-of-the-art training accuracy, which limits the impact of adopting BF16. This article proposes the first approach able to train complex DNNs entirely using the BF16 format. We propose a new class of FMA operators, FMAbf16_n_m, that entirely rely on BF16 FMA hardware instructions and deliver the same accuracy as FP32. FMAbf16_n_m operators achieve performance improvements within the 1.28x-1.35x range on ResNet101 with respect to FP32. FMAbf16_n_m enables training complex DNNs on simple low-end hardware devices without requiring expensive FP32 FMA functional units.
dc.description.sponsorship: Marc Casas was partially supported under Grant RYC-2017-23269 funded by MCIN/AEI/10.13039/501100011033 and by ESF Investing in your future. Adrià Armejach is a Serra Hunter Fellow and has been partially supported under Grant IJCI-2017-33945 funded by MCIN/AEI/10.13039/501100011033. John Osorio has been partially supported under Grant PRE2019-090406 funded by MCIN/AEI/10.13039/501100011033 and by ESF Investing in your future. This work has been partially supported by Intel under the BSC-Intel collaboration and by the European Union Horizon 2020 research and innovation programme under Grant 955606 (DEEP-SEA EU project).
dc.format.extent: 13 p.
dc.language.iso: eng
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.subject: Àrees temàtiques de la UPC::Informàtica::Arquitectura de computadors
dc.subject.lcsh: Neural networks (Computer science)
dc.subject.lcsh: Machine learning
dc.subject.other: Neural nets
dc.subject.other: Reduced precision
dc.subject.other: FMA operators
dc.subject.other: BF16
dc.subject.other: FP32
dc.subject.other: Swamping
dc.subject.other: Computer arithmetic
dc.subject.other: Emulation
dc.subject.other: Hardware
dc.title: A BF16 FMA is all you need for DNN training
dc.type: Article
dc.subject.lemac: Xarxes neuronals (Informàtica)
dc.subject.lemac: Aprenentatge automàtic
dc.contributor.group: Universitat Politècnica de Catalunya. CAP - Grup de Computació d'Altes Prestacions
dc.identifier.doi: 10.1109/TETC.2022.3187770
dc.description.peerreviewed: Peer Reviewed
dc.relation.publisherversion: https://ieeexplore.ieee.org/document/9823406
dc.rights.access: Open Access
local.identifier.drac: 34251215
dc.description.version: Postprint (author's final draft)
dc.relation.projectid: info:eu-repo/grantAgreement/EC/H2020/955606/EU/DEEP – SOFTWARE FOR EXASCALE ARCHITECTURES/DEEP-SEA
local.citation.author: Osorio, J.; Armejach, A.; Petit, E.; Henry, G.; Casas, M.
local.citation.publicationName: IEEE transactions on emerging topics in computing
local.citation.volume: 10
local.citation.number: 3
local.citation.startingPage: 1302
local.citation.endingPage: 1314
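
Note on the abstract above: BF16 keeps the FP32 sign and 8-bit exponent but truncates the mantissa to 7 explicit bits, and the paper's operators accumulate BF16 products at higher precision. The paper's own operators and hardware details are not reproduced here; the following is only a minimal, illustrative Python sketch (function names fp32_to_bf16 and fma_bf16 are hypothetical, not from the paper) of how BF16 rounding and a BF16-input FMA can be emulated in software.

    import struct

    def fp32_to_bf16(x):
        # Reinterpret the FP32 value as a 32-bit integer.
        bits = struct.unpack('<I', struct.pack('<f', x))[0]
        # Round to nearest even: add 0x7FFF plus the lowest kept mantissa bit,
        # then keep the top 16 bits (1 sign + 8 exponent + 7 mantissa bits).
        # NaN handling is omitted for brevity.
        bits = ((bits + 0x7FFF + ((bits >> 16) & 1)) >> 16) & 0xFFFF
        # Re-expand to FP32 by zero-padding the dropped mantissa bits.
        return struct.unpack('<f', struct.pack('<I', bits << 16))[0]

    def fma_bf16(a, b, c):
        # Toy emulation of a BF16-input FMA: round the multiplicands to BF16,
        # then multiply-add with the accumulation kept at higher precision
        # (Python floats are FP64).
        return fp32_to_bf16(a) * fp32_to_bf16(b) + c

    print(fma_bf16(1.0 / 3.0, 3.0, 0.5))  # close to 1.5, minus BF16 input rounding error

This sketch only illustrates the number format and the structure of a mixed-precision multiply-add; it does not reflect the FMAbf16_n_m operator definitions, emulation library, or performance results described in the article.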