The NVIDIA HGX B200 is a Blackwell-architecture data center platform designed for large-scale AI training and inference. Each HGX B200 server provides 8 B200 GPUs per node, fully interconnected via fifth-generation NVLink and NVSwitch.
Deploy your first model on NVIDIA HGX B200 GPUs and verify the setup end-to-end.
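A quick way to verify the setup is to confirm that the driver sees all 8 GPUs before deploying anything. The commands below are an illustrative sketch; exact driver and CUDA versions will vary by image, and the `torch` check assumes PyTorch is installed in your environment.

```shell
# List each GPU's name and memory; an HGX B200 node should show 8 entries.
nvidia-smi --query-gpu=name,memory.total --format=csv

# If PyTorch is installed, confirm it can see all 8 devices.
python -c "import torch; print(torch.cuda.device_count())"
```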
Set up an NVIDIA HGX B200 instance for LLM inference with vLLM.
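A minimal vLLM setup on such a node might look like the following sketch. The model name is illustrative (substitute any model you have access to), and `--tensor-parallel-size 8` assumes you want to shard the model across all 8 GPUs.

```shell
# Install vLLM into the current environment.
pip install vllm

# Serve a model, sharded across the node's 8 GPUs via tensor parallelism.
vllm serve meta-llama/Llama-3.1-70B-Instruct --tensor-parallel-size 8

# In another shell, confirm the OpenAI-compatible endpoint is up.
curl http://localhost:8000/v1/models
```

Smaller models can drop the tensor-parallel flag and run on a single GPU; the endpoint accepts standard OpenAI-style `/v1/chat/completions` requests either way.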