•Operating Systems
Ubuntu 16.04.3 LTS (64-bit), CentOS 7.4 (64-bit), Windows 10 (64-bit).
•OpenVINO™ Toolkit
◦Intel® Deep Learning Deployment Toolkit
- Model Optimizer
- Inference Engine
◦Optimized computer vision libraries
◦Intel® Media SDK
◦*OpenCL™ graphics drivers and runtimes.
◦Currently supported topologies: AlexNet, GoogLeNet V1/V2, MobileNet SSD, MobileNet V1/V2, MTCNN, SqueezeNet 1.0/1.1, Tiny YOLO V1/V2, YOLO V2, ResNet-18/50/101
- For more information on supported topologies, please refer to the official Intel® OpenVINO™ Toolkit website.
•High flexibility: the Mustang-V100-MX8 is built on the OpenVINO™ toolkit structure, which allows trained models from frameworks such as Caffe, TensorFlow, and MXNet to execute on the card after conversion to the optimized Intermediate Representation (IR).
*OpenCL™ is a trademark of Apple Inc. used by permission by Khronos.
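The model-conversion step described above is performed by the OpenVINO™ Model Optimizer. A minimal sketch for a Caffe model is shown below; the install path, model filenames, and output directory are illustrative placeholders, and the exact invocation depends on the installed OpenVINO™ toolkit version:

```shell
# Illustrative sketch: convert a Caffe model to OpenVINO IR with the
# Model Optimizer. Paths and filenames are hypothetical -- adjust them
# to match your OpenVINO installation and model.
# FP16 is requested because the Myriad X VPU runs half-precision IR.
python3 /opt/intel/openvino/deployment_tools/model_optimizer/mo.py \
    --input_model squeezenet1.1.caffemodel \
    --input_proto deploy.prototxt \
    --data_type FP16 \
    --output_dir ./ir
```

The resulting IR pair (.xml topology and .bin weights) is then loaded by the Inference Engine for execution on the card.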
Model Name | Mustang-V100-MX8 |
---|---|
Main Chip | Eight Intel® Movidius™ Myriad™ X MA2485 VPUs |
Operating Systems | Ubuntu 16.04.3 LTS (64-bit), CentOS 7.4 (64-bit), Windows 10 (64-bit) |
Dataplane Interface | PCI Express x4, compliant with PCI Express Specification V2.0 |
Power Consumption | Approximately 25 W |
Operating Temperature | -20°C ~ 60°C |
Cooling | Active fan |
Dimensions | Half-Height, Half-Length, Single-width PCIe |
Operating Humidity | 5% ~ 90% |
Power Connector | *Preserved PCIe 6-pin 12V external power |
DIP Switch/LED Indicator | Identifies card number |
Part No. | Description |
---|---|
Mustang-V100-MX8-R11 | Computing Accelerator Card with 8 x Movidius Myriad X MA2485 VPU, PCIe Gen2 x4 interface, RoHS |