Exploring the Vision Processing Unit as Co-Processor for Inference

Cite as:
hdl:2117/121386
Document type: Conference lecture
Defense date: 2018-08-06
Publisher: IEEE
Rights access: Open Access
All rights reserved. This work is protected by the corresponding intellectual and industrial
property rights. Without prejudice to any existing legal exemptions, reproduction, distribution, public
communication or transformation of this work are prohibited without permission of the copyright holder
Abstract
The success of exascale supercomputing is widely argued to depend on novel technological breakthroughs that effectively reduce power consumption and thermal dissipation requirements. In this work, we consider the integration of co-processors in high-performance computing (HPC) to enable low-power, seamless computation offloading of certain operations. In particular, we explore the so-called Vision Processing Unit (VPU), a highly parallel vector processor with a power envelope of less than 1 W. We evaluate this chip during inference using a pre-trained GoogLeNet convolutional network model and a large image dataset from the ImageNet ILSVRC challenge. Preliminary results indicate that a multi-VPU configuration provides performance similar to reference CPU and GPU implementations, while reducing the thermal design power (TDP) by up to 8× in comparison.
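As a rough illustration of the TDP claim above, the following sketch computes the power budget of a multi-VPU setup against a single host processor. All figures are hypothetical assumptions, not measurements from the paper, except the sub-1 W VPU envelope stated in the abstract:

```python
# Back-of-the-envelope TDP comparison: multi-VPU configuration vs. a host CPU.
# Hypothetical values: the 1 W per-VPU bound follows the abstract's "<1 W"
# envelope; the VPU count and the 65 W CPU TDP are assumptions for illustration.

VPU_TDP_W = 1.0    # upper bound per VPU (abstract: power envelope < 1 W)
NUM_VPUS = 8       # hypothetical multi-VPU configuration
CPU_TDP_W = 65.0   # hypothetical reference CPU TDP

multi_vpu_tdp = NUM_VPUS * VPU_TDP_W
reduction = CPU_TDP_W / multi_vpu_tdp

print(f"Multi-VPU TDP: {multi_vpu_tdp:.0f} W")
print(f"TDP reduction vs. CPU: {reduction:.1f}x")
```

Under these assumed numbers the aggregate VPU budget stays at 8 W, giving a reduction factor in the same ballpark as the up-to-8× figure reported in the abstract.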
Citation: Rivas-Gomez, S. [et al.]. Exploring the Vision Processing Unit as Co-Processor for Inference. In: "2018 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW)". IEEE, 2018, p. 589-598.
ISBN: 978-1-5386-5555-9
Publisher version: https://ieeexplore.ieee.org/document/8425465/
Files | Description | Size | Format
---|---|---|---
Exploring the V ... rocessor for Inference.pdf | | 3,299Mb | PDF