June 20, 2025
Introducing the AMD Instinct MI350 Series GPUs
The AMD Instinct™ MI350 Series GPUs set a new standard for Generative AI and high-performance computing (HPC) in data centers
Leadership AI & HPC Acceleration
Built on the cutting-edge 4th Gen AMD CDNA™ architecture, the AMD Instinct MI350X and MI355X GPUs deliver exceptional efficiency and performance for training massive AI models, high-speed inference, and complex HPC workloads such as scientific simulations, data processing, and computational modeling.
Under the Hood
The Ultimate AI and HPC Performance - AMD Instinct™ MI350 Series GPUs feature powerful and energy-efficient cores, maximizing performance per watt to drive the next era of AI and HPC innovation.
Benefits
Breakthrough AI Acceleration With Huge Memory - The AMD Instinct™ MI350 Series GPUs redefine AI acceleration with next-gen FP6 and FP4 datatype support (including matrix sparsity), optimizing efficiency, bandwidth, and energy use for lightning-fast AI inference and training.
Designed to fuel the most demanding AI models, Instinct MI350 Series GPUs pair a massive 288 GB of HBM3E memory with 8 TB/s of memory bandwidth, delivering a huge leap in performance over previous generations.
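To put those numbers in context, here is a rough back-of-envelope sketch (not an official AMD sizing guide) of what 288 GB of HBM3E and 8 TB/s of bandwidth imply for a 405B-parameter model at different precisions. It counts weights only, ignores KV cache, activations, and overheads, and assumes ideal bandwidth utilization.

```python
# Back-of-envelope sketch: weights-only footprint and a memory-bound decode
# floor for a 405B-parameter model on a single MI350 Series GPU.
# Assumptions: weights only (no KV cache/activations), ideal bandwidth use.

HBM_CAPACITY_GB = 288      # MI350 Series HBM3E capacity
HBM_BANDWIDTH_GB_S = 8000  # MI350 Series peak memory bandwidth (8 TB/s)
PARAMS = 405e9             # e.g. a Llama 405B-class model

BYTES_PER_PARAM = {"FP16": 2.0, "FP8": 1.0, "FP6": 0.75, "FP4": 0.5}

for dtype, nbytes in BYTES_PER_PARAM.items():
    weights_gb = PARAMS * nbytes / 1e9
    fits = "fits" if weights_gb <= HBM_CAPACITY_GB else "does not fit"
    # Memory-bound decode floor: every weight is read once per generated token.
    ms_per_token = weights_gb / HBM_BANDWIDTH_GB_S * 1000
    print(f"{dtype}: ~{weights_gb:.0f} GB of weights ({fits} in {HBM_CAPACITY_GB} GB), "
          f">= {ms_per_token:.0f} ms/token memory-bound floor")
```

Under these simplifying assumptions, only the 4-bit case (roughly 202 GB of weights) fits within a single GPU's 288 GB, which is exactly why the new FP4/FP6 datatypes and the large HBM3E capacity are presented together.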
Advanced Security for AI & HPC - AMD Instinct™ MI350 Series GPUs help ensure trusted firmware, verify hardware integrity, enable secure multi-tenant GPU sharing, and encrypt GPU-to-GPU communication, strengthening reliability, scalability, and data security for cloud AI and mission-critical workloads.
Seamless Deployment & AI Optimization - AMD Instinct™ MI350 Series GPUs enable frictionless adoption with drop-in compatibility, while the AMD GPU Operator simplifies deployment and workload configuration in Kubernetes. With the open AMD ROCm™ software stack, developers get Day 0 support for leading AI frameworks and models from OpenAI, Meta, PyTorch, Hugging Face, and more, helping ensure efficient, high-performance execution without vendor lock-in.
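As a minimal illustration of what "drop-in compatibility" looks like in practice, the sketch below assumes a ROCm-enabled PyTorch build is installed; on such builds the familiar torch.cuda device APIs are backed by HIP, so existing CUDA-style PyTorch code runs on an Instinct GPU without changes. It is a generic example, not MI350-specific code.

```python
# Minimal sketch of drop-in PyTorch usage on a ROCm-enabled build.
# Assumes a PyTorch wheel built for ROCm; torch.cuda.* is backed by HIP there.
import torch

if torch.cuda.is_available():                  # True on ROCm builds with an Instinct GPU
    device = torch.device("cuda")              # maps to the AMD GPU via HIP
    print("Running on:", torch.cuda.get_device_name(0))
    x = torch.randn(4096, 4096, device=device, dtype=torch.bfloat16)
    y = torch.randn(4096, 4096, device=device, dtype=torch.bfloat16)
    z = x @ y                                  # matmul executes on the GPU
    print("Result shape:", z.shape)
else:
    print("No GPU visible; falling back to CPU.")
```

The same script runs unmodified across vendors, which is the practical meaning of "without vendor lock-in" in the paragraph above.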
Trusted by AI Leaders - Industry leaders and innovators trust AMD Instinct™ GPUs for large-scale AI, powering models like Llama 405B and GPT. Broad adoption of AMD Instinct GPUs by cloud service providers (CSPs) and OEMs is helping to drive next-gen AI at scale.