
C5010X Data Center FPGA IPU NIC Intel® based

Dual port SFP28 25G Ethernet PCIe FPGA IPU NIC Intel® based

The Silicom C5010X Data Center FPGA IPU NIC Intel® based is an advanced data processing network interface controller, featuring cutting-edge I/O and data processing capabilities and optimized for virtualized cloud, cloud-native, or bare-metal environments.

The Silicom C5010X can be offered with a complete implementation of the I/O workload using industry-standard interfaces, allowing the card to be operated with stock virtio and NVMe drivers against a true hardware virtio and NVMe PCIe interface.
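
As a minimal illustration only (not vendor software), the sketch below shows how a Linux host could confirm that the card's functions enumerate as ordinary virtio and NVMe PCIe devices usable by stock drivers. It relies only on standard sysfs paths, the well-known virtio PCI vendor ID, and the standard NVMe class code; the exact functions the C5010X exposes depend on how it is configured.

    # Sketch: scan PCIe functions and flag standard virtio / NVMe devices.
    # Assumes a Linux host; which functions the card presents depends on its
    # configuration, so this only shows how stock interfaces would appear.
    from pathlib import Path

    VIRTIO_VENDOR = "0x1af4"    # standard virtio PCI vendor ID
    NVME_CLASS = "0x010802"     # standard NVMe PCI class code

    for dev in sorted(Path("/sys/bus/pci/devices").iterdir()):
        vendor = (dev / "vendor").read_text().strip()
        pci_class = (dev / "class").read_text().strip()
        if vendor == VIRTIO_VENDOR:
            print(f"{dev.name}: virtio function (stock virtio drivers apply)")
        elif pci_class == NVME_CLASS:
            print(f"{dev.name}: NVMe controller (stock nvme driver applies)")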

Combining two powerful engines – an Intel® Stratix® 10 DX 1100 FPGA and an Intel® Xeon® D-1612 SoC (x86 CPU, 4 cores, 8 threads) – the Silicom C5010X data processing NIC brings together the power and flexibility of the FPGA and the wealth of the x86 ecosystem.

C5010X Data Center FPGA IPU NIC

A unique three-stage path for application optimization guarantees fast results. The presence of an x86-based CPU lets you leverage existing software assets, while the FPGA enables an extremely performant and flexible data path implementation:

1. In the first stage, the card serves as a standard NIC with minimal configuration changes and minimal orchestration modification (see the sketch after this list);
2. The second stage moves data plane and VM switching functionality from the main host down to the x86 CPU cores on the card;
3. The third stage involves data path implementation on the FPGA.
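
As a rough stage-one sketch (illustrative only, assuming a Linux host and no vendor-specific tooling), the card's ports can be checked to appear as ordinary network interfaces bound to stock kernel drivers, so existing orchestration can treat them like any other NIC:

    # Sketch: list network interfaces and the kernel driver bound to each one.
    # In stage one the C5010X ports would show up here like any standard NIC;
    # interface and driver names are host-specific, nothing here is vendor API.
    from pathlib import Path

    for iface in sorted(Path("/sys/class/net").iterdir()):
        driver_link = iface / "device" / "driver"
        driver = driver_link.resolve().name if driver_link.exists() else "virtual"
        print(f"{iface.name}: driver={driver}")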

Using the offered IP, the Silicom C5010X can be developed and deployed as a virtio network accelerator, a virtio storage accelerator, an NVMe storage (network) controller, or an RDMA OFED controller, either separately or all at the same time. Further implementations, such as flow filtering, encryption, compression, deduplication, and machine learning, are facilitated by the C5010X as a complete infrastructure implementation on a NIC.

Data Center FPGA IPU NIC solution

As the data path is implemented on the card, fewer risks and vulnerabilities are left exposed in the cloud data center infrastructure.

Storage data paths in a cloud environment are often associated with a storage area network (SAN) serving the compute nodes over the network. An x86 CPU core NIC with an FPGA allows relatively easy implementation of storage volume virtualization, accessible on the host through stock storage software stacks such as NVMe.
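
For illustration only: a volume virtualized by the card and presented through the standard NVMe interface would appear to the host as a regular namespace block device. The sketch below (assuming a Linux host and the stock NVMe stack, with no vendor software) simply lists such devices and their capacity.

    # Sketch: list NVMe namespace block devices as seen by the stock Linux
    # NVMe stack. A volume virtualized by the card would appear here like any
    # other namespace; device names depend on the host.
    from pathlib import Path

    for blk in sorted(Path("/sys/block").glob("nvme*n*")):
        sectors = int((blk / "size").read_text())       # 512-byte sectors
        print(f"/dev/{blk.name}: {sectors * 512 / 1e9:.1f} GB")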

Use Cases

  • Virtual Cloud
  • Bare Metal Cloud
  • NFVi
  • Secure Infrastructure
  • Cloud Storage
Silicom C5010X Data Center NIC


  • Intel® Xeon® D-1612 @ 1.5 GHz, 4 cores, 8 threads
  • Intel® Stratix® 10 DX 1100
  • virtio-net and virtio-blk over PCIe
  • Dual 25GbE
  • PCIe Gen4 x8 (x16 physical)


Network Interface

IEEE standard: IEEE 802.3 10GE, 25GE
Interfaces
  • Physical interface: 2 x SFP28 slots
  • Supports SFP+/SFP28 modules with multimode SR (850 nm), single-mode LR (1310 nm), multimode LRM (1310 nm), or Direct Attach Copper (Twinax), among others
  • Data rate: 2×10 Gbps or 2×25 Gbps
  • Support for SyncE

Interfaces

Network: Dual 25GbE, using SFP28
Host
  • PCIe Gen4 x8 (x16 physical)
  • NC-SI (RBT)
  • Support for SMBus
SoC
  • PCIe Gen3 x8
  • USB NIC
  • UART

General Technical Specifications:

SoC details

Intel® Xeon® D-1612

  • 4 x86 64 bit cores @1.5GHz
  • 8 threads
  • 6MB cache
  • VT-d, VT-x
  • Intel® AVX2
  • AES-NI
  • 16GB DDR ECC
FPGA Details

Intel® Stratix® 10 DX 1100

  • Intel® Hyperflex™ core architecture
  • Intel® Embedded Multi-die Interconnect Bridge (EMIB)
  • PCIe Gen4 x16 hard IP, SR-IOV
  • Fixed point and IEEE 754 compliant floating-point variable precision digital signal processing (DSP) block
  • Internal memory
    • M20K, 107Mb
    • eSRAM, 47.25Mb
    • MLAB
  • 1,325,000 ALM
  • Quad-core 64-bit Arm® Cortex®-A53 embedded processor @1.5 GHz
  • 4GB DDR
Configuration
  • Configuration flash can be made to support multiple boot images, with automatic fallback to a fail-safe image
  • Upload of FPGA configuration to flash via PCIe
  • Direct FPGA configuration via the onboard JTAG dongle
On-board Memory
  • 16GB DDR ECC for SoC
  • 4GB DDR for FPGA
  • User configurable space in flash RAM for permanent storage
  • Configuration flash RAM for boot images
On-board Clock
  • PCIe clock: 100 MHz
  • Core clock: 125 MHz
  • 2 x differential 312.5 MHz SerDes clock for Ethernet
  • 2 x differential 266.67 MHz/300 MHz/333.33 MHz clock for memory
  • Calibration clocks: 125 MHz, 100 MHz, 25 MHz
  • 50 MHz clock
Additional Board Support
  • On-board power and temperature sensors (via SMBus/I2C)
  • FPGA-controlled Link and Activity LEDs, 2 for each SFP28 port
  • Board status LEDs
  • FPGA Reset via host I2C
Environment
  • Full height, ½ length: 111.15 x 167.65 mm with bracket
  • Storage temperature: -30 to 70°C (-22 to 158°F)
  • Operating temperature (card inlet): 0 to 55°C (32 to 131°F)
  • Operating humidity: 20 to 80%
  • Hardware compliance: RoHS, FCC, CE
Power
  • Max 75W
  • Passive cooling
  • Power and temperature monitoring via SMBus/I2C
Management
  • SoC boot options: PXE, SATA
  • SoC control interfaces: USB, UART, network
Networking
  • A configurable packet processor IP core
  • Extensive configuration API
  • Packet forwarding and bridging across network, main host and SoC
  • Parsing, match, and action operations (illustrated in the sketch following these specifications)
  • Bandwidth rate limit
3rd party solution support
  • Napatech Link™ Virtualization Software and SmartNIC solution
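
The parse/match/action model listed under Networking can be pictured with a purely hypothetical rule description. The sketch below is conceptual only: the MatchActionRule type, its field names, and the action strings are invented for the example and do not represent the card's actual configuration API.

    # Purely illustrative: the shape of a match/action rule that a configurable
    # packet processor could apply. All names here are invented for the sketch
    # and are not the C5010X configuration API.
    from dataclasses import dataclass

    @dataclass
    class MatchActionRule:
        match_vlan: int           # match on VLAN tag
        match_dst_ip: str         # match on destination IPv4 address
        action: str               # e.g. "forward_to_host", "redirect_to_soc", "drop"
        rate_limit_mbps: int = 0  # 0 means no bandwidth limit

    rule = MatchActionRule(match_vlan=100, match_dst_ip="10.0.0.5",
                           action="redirect_to_soc", rate_limit_mbps=5000)
    print(rule)   # a real deployment would push the rule through the card's API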


P/N: FB2XXVG@S10D11-HDNP

Product Brief_C5010X-NIC – Hardware

Product Brief_C5010X-NIC – virtio

