Switch® Unveils Living Data Center EVO to Transform AI Infrastructure Efficiency
- Switch's Living Data Center EVO automates AI factory infrastructure, pairing near real-time automation with physics-accurate 3D visualization to improve operational efficiency.
- The LDC EVO maintains a digital twin for operational support, accommodating high-density AI workloads and hybrid cooling systems.
- LDC EVO enhances deployment experience with NVIDIA integration, streamlining hardware configurations and increasing overall data center efficiency.
Revolutionizing AI Infrastructure: Switch® Launches Living Data Center EVO
In a significant advancement for the AI industry, Switch® has introduced its Living Data Center (LDC) EVO, a system designed to automate and optimize the infrastructure behind AI factories. Announced on March 16, 2026, at a Las Vegas event, the LDC EVO integrates the NVIDIA Omniverse DSX Blueprint within Switch's EVO AI Factory™ architecture, setting a new standard for operational efficiency. Built on NVIDIA's libraries and OpenUSD, it delivers near real-time automation and a physics-accurate 3D visualization of every element in the facility. This marks a pivotal shift from traditional data centers, which rely on legacy data center infrastructure management (DCIM) systems that struggle to keep pace with the complexity of AI factory operations.
The LDC EVO maintains a continuously updated digital twin of the AI factory, the foundation of its operational support capabilities. Switch's Chief Technology Officer, Zia Syed, highlights the role of this digital twin in orchestrating a modular, configurable architecture tailored to the demands of AI workloads. The design accommodates hybrid cooling systems and extreme AI densities, making it well suited to high-performance computing environments, and it ensures that customers deploying NVIDIA accelerated computing on Dell PowerEdge servers can do so efficiently, precisely, and to stringent engineering standards.
The LDC EVO's integration with NVIDIA's offerings lets customers validate hardware configurations, including those optimized for NVIDIA Grace Blackwell, before deployment, reducing the friction typically associated with hardware setup. With intelligent automation workflows backed by operational intelligence, the LDC EVO improves the deployment experience across Switch's platforms, a transformative step for data center efficiency and the operational capabilities of AI technologies.
More broadly, the drive toward advanced infrastructure in the AI landscape reflects a growing industry emphasis on automation and efficiency. As companies invest more heavily in AI capabilities, systems like the LDC EVO signal a pivotal shift toward meeting those rising demands.
As industries pursue greater AI performance, innovations like the LDC EVO offer a glimpse into the future of data centers. Their impact extends beyond operational efficiency, promising to redefine how companies approach AI deployment and infrastructure management.