🌟 NVIDIA Mellanox® MCP7Y10-N001 | 800Gb/s to 2×400Gb/s Passive Breakout Cable (DAC) | OSFP to 2×QSFP112 | 1m | InfiniBand NDR
Product Overview
The NVIDIA Mellanox® MCP7Y10-N001 (P/N: 980-9I928-00N001) is a high-performance passive copper breakout cable designed for InfiniBand NDR infrastructures.
It connects a single 800Gb/s OSFP NDR port to two independent 400Gb/s QSFP112 ports, providing a reliable, low-latency solution for short-range interconnects in modern AI, HPC, and data center environments.
With its 1-meter 28AWG LSZH construction, this cable offers excellent flexibility, easy installation, and optimal signal integrity, making it an ideal choice for in-rack deployments where performance and safety standards are critical.
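The 800Gb/s-to-2×400Gb/s split follows directly from NDR lane arithmetic: the OSFP end carries eight electrical lanes, each QSFP112 end carries four. A minimal sketch of that math, assuming the nominal 100Gb/s effective rate per NDR lane:

```python
# Lane arithmetic behind the OSFP -> 2x QSFP112 breakout.
# Nominal InfiniBand NDR figure assumed: 100 Gb/s effective per lane (PAM4).
LANE_RATE_GBPS = 100          # effective data rate per NDR lane
OSFP_LANES = 8                # OSFP module carries 8 electrical lanes
QSFP112_LANES = 4             # each QSFP112 end carries 4 lanes

osfp_total = LANE_RATE_GBPS * OSFP_LANES        # 800 Gb/s upstream
per_branch = LANE_RATE_GBPS * QSFP112_LANES     # 400 Gb/s per breakout leg
branches = OSFP_LANES // QSFP112_LANES          # 2 legs

print(osfp_total, per_branch, branches)  # -> 800 400 2
```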
Key Features
• True NDR Breakout: Converts 1×800Gb/s OSFP into 2×400Gb/s QSFP112 channels, maintaining full InfiniBand NDR compatibility.
• Passive Zero-Power Design: No active components, no power consumption, and minimal added latency.
• Short-Reach Optimization: 1-meter length ensures high signal quality, reduced loss, and efficient rack-level cabling.
• Data Center Compliance: LSZH jacket minimizes smoke and halogen emissions, meeting hyperscale safety requirements.
• Flexible Routing: 28AWG conductors allow easier routing than longer or thicker NDR passive cables.
Technical Specifications
• Model: MCP7Y10-N001
• NVIDIA Part Number: 980-9I928-00N001
• Type: Passive Copper Breakout Cable (DAC)
• Data Rate: 800Gb/s → 2×400Gb/s
• Protocol: InfiniBand NDR (not compatible with Ethernet 400G QSFP-DD)
• Connectivity: OSFP (upstream) to 2× QSFP112 (downstream)
• Cable Length: 1 meter
• Conductor Gauge: 28AWG
• Jacket Material: LSZH
• Pull Tab: Black
• Condition: New
Application Scenarios
This breakout cable is commonly used in:
• AI & Machine Learning clusters requiring multi-node connectivity
• HPC environments using NDR-enabled Quantum-2 switches
• Hyperscale server deployments with QSFP112 NICs such as ConnectX-7
• In-rack switch-to-NIC wiring, where low latency and power efficiency are essential
Performance & Compatibility
The MCP7Y10-N001 is engineered specifically for InfiniBand NDR signaling and is compatible with:
• NVIDIA Mellanox Quantum-2 switches (OSFP NDR)
• NVIDIA ConnectX-7 QSFP112 adapters (400Gb/s)
• NDR-capable HPC and AI compute nodes
Not compatible with:
• QSFP-DD Ethernet switches (400GbE)
• QSFP56 HDR adapters
• Any non-NDR InfiniBand architecture
Quality Assurance
Each unit undergoes full electrical testing, signal-integrity validation, and compatibility verification with NVIDIA Mellanox hardware.
Cables are inspected, professionally packaged, and guaranteed to perform to enterprise and hyperscale standards.
Payment & Shipping
• Payment options: PayPal, credit cards, bank transfers
• International shipping (8–13 days) with tracking and secure packaging
• Returns accepted within policy; buyer covers return shipping
Why Purchase from T.E.S IT-Solutions?
T.E.S IT-Solutions specializes in certified NVIDIA Mellanox networking equipment, providing reliable, data-center-grade components for AI, HPC, and cloud environments.
Our expertise ensures seamless integration, dependable performance, and trusted support for mission-critical deployments.
SKU: MCP7Y10-N001_New
Price: €700.00

