
QNAP Introduces Mustang Series Computing Accelerator Cards


PRESS RELEASE

Taipei, Taiwan, June 14, 2019 – QNAP® Systems, Inc. (QNAP), a leading computing, networking and storage solution innovator, today introduced two computing accelerator cards designed for AI deep learning inference: the Mustang-V100 (VPU-based) and the Mustang-F100 (FPGA-based). Users can install these PCIe-based accelerator cards in an Intel®-based server/PC or a QNAP NAS to tackle the demanding workloads of modern computer vision and AI applications in manufacturing, healthcare, smart retail, video surveillance, and more.

“Computing speed is a major aspect of the efficiency of AI application deployment,” said Dan Lin, Product Manager of QNAP. “While the QNAP Mustang-V100 and Mustang-F100 accelerator cards are optimized for the OpenVINO™ architecture and can extend workloads across Intel hardware with maximized performance, they can also be used with QNAP’s OpenVINO Workflow Consolidation Tool to accelerate deep learning inference in the shortest time.”

Both the Mustang-V100 and Mustang-F100 provide economical acceleration solutions for AI inference, and they can also work with the OpenVINO toolkit to optimize inference workloads for image classification and computer vision. The OpenVINO toolkit, developed by Intel, helps fast-track the development of high-performance computer vision and deep learning applications. It includes the Model Optimizer and the Inference Engine: the Model Optimizer converts pre-trained deep learning models (from frameworks such as Caffe and TensorFlow) into an intermediate representation (IR), which the Inference Engine then executes across heterogeneous Intel hardware (such as CPU, GPU, FPGA and VPU).
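As a rough illustration of that two-stage workflow, the commands below convert a model to IR and then run it on an accelerator card. This is a hedged sketch, not QNAP's documented procedure: the model file names and image path are placeholders, and the `mo.py` location assumes a default 2019-era Linux install of the OpenVINO toolkit.

```shell
# Stage 1: Model Optimizer converts a pre-trained TensorFlow model
# into OpenVINO IR (an .xml topology plus a .bin weights file).
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo.py \
    --input_model frozen_model.pb \
    --output_dir ./ir

# Stage 2: the Inference Engine runs the IR on a chosen device.
# "MYRIAD" targets a Myriad X VPU (as in the Mustang-V100);
# "HETERO:FPGA,CPU" splits layers between an FPGA (Mustang-F100) and the CPU.
python3 classification_sample.py -m ./ir/frozen_model.xml -i image.jpg -d MYRIAD
python3 classification_sample.py -m ./ir/frozen_model.xml -i image.jpg -d HETERO:FPGA,CPU
```

Because the IR is device-independent, the same converted model can be redirected to a different accelerator simply by changing the `-d` device string.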

As QNAP NAS evolves to support a wider range of applications (including surveillance, virtualization, and AI), the combination of large storage and PCIe expandability is advantageous for its usage potential in AI. QNAP has developed the OpenVINO Workflow Consolidation Tool (OWCT), which leverages Intel OpenVINO toolkit technology. When used with the OWCT, an Intel-based QNAP NAS presents an ideal Inference Server solution to assist organizations in quickly building inference systems. AI developers can deploy trained models on a QNAP NAS for inference, and install the Mustang-V100 or Mustang-F100 accelerator card to achieve optimal inference performance.

QNAP NAS now supports the Mustang-V100 and Mustang-F100 with QTS 4.4.0, the latest version of the QTS operating system. To view the QNAP NAS models that support QTS 4.4.0, please visit www.qnap.com. To download and install the OWCT app for QNAP NAS, please visit the App Center.

Key specifications

  • Mustang-V100-MX8-R10
    Half-height, eight Intel® Movidius™ Myriad™ X MA2485 VPUs, PCI Express Gen2 x4 interface, power consumption lower than 30W
  • Mustang-F100-A10-R10
    Half-height, Intel® Arria® 10 GX1150 FPGA, PCI Express Gen3 x8 interface, power consumption lower than 60W
Mustang series computing accelerator cards


About QNAP Systems, Inc.

QNAP Systems, Inc., headquartered in Taipei, Taiwan, provides a comprehensive range of cutting-edge Network-attached Storage (NAS) and video surveillance solutions based on the principles of usability, high security, and flexible scalability. QNAP offers quality NAS products for home and business users, providing solutions for storage, backup/snapshot, virtualization, teamwork, multimedia, and more. QNAP envisions NAS as being more than "simple storage", and has created many NAS-based innovations to encourage users to host and develop Internet of Things, artificial intelligence, and machine learning solutions on their QNAP NAS.