
    • A BF16 FMA is all you need for DNN training 

      Osorio Ríos, John Haiber; Armejach Sanosa, Adrià; Petit, Eric; Henry, Greg; Casas Guix, Marc (Institute of Electrical and Electronics Engineers (IEEE), 2022-07-01)
      Article
      Open Access
      Fused Multiply-Add (FMA) functional units constitute a fundamental hardware component to train Deep Neural Networks (DNNs). Their silicon area grows quadratically with the mantissa bit count of the computer number format, ...
    • Dynamically adapting floating-point precision to accelerate deep neural network training 

      Osorio Ríos, John Haiber; Armejach Sanosa, Adrià; Petit, Eric; Henry, Greg; Casas Guix, Marc (Institute of Electrical and Electronics Engineers (IEEE), 2021)
      Conference report
      Open Access
      Mixed-precision (MP) arithmetic combining both single- and half-precision operands has been successfully applied to train deep neural networks. Despite its advantages in terms of reducing the need for key resources like ...
    • FASE: A fast, accurate and seamless emulator for custom numerical formats 

      Osorio Ríos, John Haiber; Armejach Sanosa, Adrià; Petit, Eric; Henry, Greg; Casas Guix, Marc (Institute of Electrical and Electronics Engineers (IEEE), 2022)
      Conference report
      Open Access
      Deep Neural Networks (DNNs) have become ubiquitous in a wide range of application domains. Despite their success, training DNNs is an expensive task that has motivated the use of reduced numerical precision formats to ...
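All three items concern reduced-precision DNN training, and the first abstract centers on the BF16 FMA: multiply in bfloat16, accumulate in FP32. As a hedged illustration of that numeric pattern only (not the papers' actual emulator; the function names `to_bf16` and `bf16_fma` are ours), a minimal NumPy sketch:

```python
import numpy as np

def to_bf16(x):
    """Round float32 values to bfloat16 precision.

    bfloat16 keeps the top 16 bits of a float32 (sign, 8 exponent
    bits, 7 mantissa bits); we emulate it with round-to-nearest-even
    on the discarded low 16 bits, returning float32 storage.
    """
    x = np.asarray(x, dtype=np.float32)
    bits = x.view(np.uint32)
    # Add half an ULP of the 16-bit result, breaking ties to even.
    rounded = bits + np.uint32(0x7FFF) + ((bits >> np.uint32(16)) & np.uint32(1))
    return (rounded & np.uint32(0xFFFF0000)).view(np.float32)

def bf16_fma(a, b, acc):
    """Emulated BF16 FMA: bfloat16 inputs, float32 accumulation."""
    return np.float32(acc) + to_bf16(a) * to_bf16(b)
```

Values exactly representable in bfloat16 pass through unchanged (`to_bf16(1.5) == 1.5`), while finer mantissa bits are rounded away (`to_bf16(1.0 + 2**-10) == 1.0`), which is the precision loss the accumulation in FP32 is meant to contain.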