Akamai Inference Cloud Transforms AI from Core to Edge with NVIDIA
Posted: Wednesday, Oct 29
Akamai Technologies, Inc. (NASDAQ:AKAM) today launched Akamai Inference Cloud, a platform that redefines where and how AI is used by expanding inference from core data centres to the edge of the internet.
Akamai Inference Cloud enables intelligent, agentic AI inference at the edge, close to users and devices. Unlike traditional systems, the platform is purpose-built to provide low-latency, real-time edge AI processing on a global scale. It combines Akamai’s expertise in globally distributed architectures with NVIDIA Blackwell AI infrastructure to radically rethink and extend the accelerated computing needed to unlock AI’s true potential.
The next generation of AI applications – from personalised digital experiences and smart agents to real-time decision systems – demands that AI inference be pushed closer to the user, providing instant engagement where users interact and making smart decisions about where to route requests. Agentic workloads increasingly require low-latency inference, local context, and the ability to scale globally in an instant. Built to power this transformation, Akamai Inference Cloud is a distributed, generative edge platform that places the NVIDIA AI stack closer to where data is created and decisions need to be made.
Dr Tom Leighton, CEO and Co-Founder of Akamai

“The next wave of AI requires the same proximity to users that allowed the internet to scale to become the pervasive global platform that it is today,” said Dr. Tom Leighton, Akamai CEO and co-founder. “Akamai solved this challenge before – and we’re doing it again. Powered by NVIDIA AI infrastructure, Akamai Inference Cloud will meet the intensifying demand to scale AI inference capacity and performance by putting AI’s decision-making in thousands of locations around the world, enabling faster, smarter, and more secure responses.”
Jensen Huang, Co-founder and CEO of NVIDIA

“Inference has become the most compute-intensive phase of AI — demanding real-time reasoning at planetary scale,” said Jensen Huang, founder and CEO, NVIDIA. “Together, NVIDIA and Akamai are moving inference closer to users everywhere, delivering faster, more scalable generative AI and unlocking the next generation of intelligent applications.”
Akamai Inference Cloud redefines where and how AI is used by bringing intelligent, agentic AI inference close to users and devices. The platform combines NVIDIA RTX PRO Servers – featuring NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs, NVIDIA BlueField-3 DPUs, and NVIDIA AI Enterprise software – with Akamai’s distributed cloud computing infrastructure and global edge network, which has over 4,200 locations worldwide. Akamai Inference Cloud will drive Akamai’s vision for highly scalable, distributed AI performance worldwide by leveraging NVIDIA’s latest technologies, including the recently announced NVIDIA BlueField-4 DPU, to further accelerate and secure data access and AI inference workloads from core to edge.
Akamai has teamed up with NVIDIA to boldly bring inference where inference has never gone before, charting new AI frontiers with Akamai Inference Cloud:
  • Extending enterprise AI Factories to the edge to enable smart commerce agents and personalised digital experiences – AI Factories are powerhouses that orchestrate the AI lifecycle from data ingestion to creating intelligence at scale. Akamai Inference Cloud extends AI Factories to the edge, decentralising data and processing and routing requests to the best model using Akamai’s massively distributed edge locations. This will enable smart agents to adapt instantly to user location, behaviour and intent, and act autonomously to negotiate, purchase, and optimise transactions in real time.
  • Enabling Streaming Inference and Agents to provide instant financial insights and perform real-time decisioning – Agentic AI workflows require multiple sequential inference calls to complete complex tasks; if each call adds network delay, the experience feels sluggish or fails to meet machine-to-machine latency requirements. Akamai Inference Cloud’s edge-native architecture delivers virtually instant responses, enabling AI agents to operate with human-like responsiveness across multi-step workflows. This can help detect fraud, accelerate secure payments, and enable high-speed decisions in industrial edge environments.
  • Enabling Real-Time Physical AI to operate beyond human-level responsiveness – Physical AI systems like autonomous vehicles, industrial robots, and smart city infrastructure require millisecond-precision decision-making to safely interact with the physical world. Akamai Inference Cloud is designed to enable physical AI to process sensor data, make safety decisions, and coordinate actions at the speed of the physical world – helping transform everything from factory floors and delivery drones to surgical robots and autonomous transportation networks into responsive, intelligent systems that can operate safely alongside humans.
  • Accelerating Time to Value – Orchestrating complex, distributed AI workloads across multiple cloud regions requires specialised skills and teams. Akamai Inference Cloud’s intelligent orchestration layer automatically routes AI tasks to optimal locations: routine inference executes instantly at the edge through NVIDIA NIM microservices, while sophisticated reasoning leverages centralised AI factories. All of this is managed through a unified platform that abstracts away infrastructure complexity.
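The edge-versus-core routing idea described above can be sketched in a few lines of Python. This is a purely illustrative sketch: the `route` function, its thresholds, and the task fields are assumptions made for this example, not Akamai’s actual API or orchestration policy.

```python
from dataclasses import dataclass

@dataclass
class InferenceTask:
    """Hypothetical description of an inference request (illustrative only)."""
    name: str
    latency_budget_ms: int   # how quickly the caller needs an answer
    reasoning_steps: int     # rough proxy for task complexity

def route(task: InferenceTask) -> str:
    """Send latency-sensitive, simple tasks to the nearest edge location;
    send multi-step reasoning to a centralised AI factory.
    Thresholds are arbitrary placeholders, not published figures."""
    if task.latency_budget_ms <= 50 and task.reasoning_steps <= 2:
        return "edge"
    return "core"

# A quick fraud check needs a near-instant answer: route it to the edge.
print(route(InferenceTask("fraud-check", latency_budget_ms=20, reasoning_steps=1)))
# A long multi-step reasoning job tolerates latency: route it to the core.
print(route(InferenceTask("report-drafting", latency_budget_ms=5000, reasoning_steps=8)))
```

The design point the sketch illustrates is that the routing decision itself is cheap; the value of a distributed platform lies in having edge capacity close enough to users that the "edge" branch is actually fast.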
Akamai Inference Cloud is available now, targeting 20 initial locations around the globe, with an expanded rollout underway.