ASUS Debuts NVIDIA MGX-Powered Servers To Boost AI Applications
ASUS has officially announced its participation in the NVIDIA GTC global AI conference, highlighting its latest advancements in GPU server solutions. The tech giant is set to exhibit its ESC NM1-E1 and ESC NM2-E1 servers, which are among the first to leverage NVIDIA's MGX modular reference architecture. Its broader lineup also incorporates NVIDIA's latest technologies, including the B200 Tensor Core GPU, the GB200 Grace Blackwell Superchip, and the H200 NVL, with the aim of providing optimized AI server solutions across various industries.
The company's portfolio spans a broad range of servers, from entry-level to high-end GPU solutions, specifically designed to support generative AI environments. ASUS is also introducing liquid-cooled rack solutions tailored to diverse workloads. Drawing on its MLPerf benchmarking expertise, ASUS optimizes both hardware and software for large-language-model (LLM) training and inference, delivering seamlessly integrated, end-to-end AI solutions.

The ESC NM1-E1 and ESC NM2-E1 servers are powered by the NVIDIA GH200 Grace Hopper Superchip, delivering exceptional performance and efficiency. Integration with NVIDIA BlueField-3 DPUs and ConnectX-7 network adapters enables these servers to achieve a data throughput of 400Gb/s. Designed for enterprise AI development and deployment, these servers support NVIDIA AI Enterprise, a comprehensive cloud-native software platform for constructing and deploying AI applications.
ASUS is also pioneering advanced server-cooling technology. The company offers direct-to-chip (D2C) cooling solutions that can be deployed quickly to reduce power usage in data centers. Its servers are compatible with manifolds and cold plates and can be fitted with a rear-door heat exchanger for efficient liquid cooling. Through collaboration with cooling-solution providers, ASUS aims to lower data center power consumption and carbon emissions.
On the software side, ASUS is showcasing the ESC4000A-E12 server at GTC, which features a no-code AI platform with an integrated software stack. The platform is designed to accelerate AI development workflows, including LLM pre-training, fine-tuning, and inference. ASUS's comprehensive solutions also support a variety of LLMs and optimize GPU resource allocation for efficient AI training.
Furthermore, ASUS is working closely with industrial partners, software experts, and system integrators to bolster enterprise IoT applications. The company's goal is to offer turnkey server support for complete solutions, including installation and testing services.
ASUS servers are available for purchase globally. For more information on ASUS's innovative GPU server solutions or to get in touch with a local ASUS representative, interested parties are encouraged to visit the ASUS website.