Towards Scalable Analytics with Inference-Enabled Solid-State Drives

Minsub Kim, Jaeha Kung, Sungjin Lee

Research output: Contribution to journal › Article › peer-review


Abstract

In this paper, we propose a novel storage architecture, called an Inference-Enabled SSD (IESSD), which employs FPGA-based DNN inference accelerators inside an SSD. IESSD is capable of performing DNN operations inside an SSD, avoiding frequent data movements between application servers and data storage. This boosts the analytics performance of DNN applications. Moreover, by placing accelerators near data within an SSD, IESSD delivers scalable analytics performance that improves with the amount of data to analyze. To evaluate its effectiveness, we implement an FPGA-based proof-of-concept prototype of IESSD and carry out a case study with an image tagging (classification) application. Our preliminary results show that IESSD exhibits 1.81× better performance and 5.31× lower power consumption than a conventional system with GPU accelerators.

Original language: English
Article number: 8770061
Pages (from-to): 13-17
Number of pages: 5
Journal: IEEE Computer Architecture Letters
Volume: 19
Issue number: 1
DOIs
State: Published - 1 Jan 2020

Bibliographical note

Publisher Copyright:
© 2002-2011 IEEE.

Keywords

  • Solid-state drives
  • convolutional neural networks
  • deep neural networks
  • in-storage processing
