F5 to Supercharge AI Application Delivery for Service Providers and Enterprises with NVIDIA BlueField-3 DPUs

F5

F5 announced the availability of BIG-IP Next for Kubernetes, an innovative AI application delivery and security solution that equips service providers and large enterprises with a centralized control point to accelerate, secure, and streamline data traffic that flows into and out of large-scale AI infrastructures.

The solution harnesses the power of high-performance NVIDIA BlueField-3 DPUs to enhance the efficiency of data center traffic that is critical to large-scale AI deployments. With an integrated view of networking, traffic management, and security, customers will be able to maximize data center resource utilization while achieving optimal AI application performance. This not only improves infrastructure efficiency but also enables faster, more responsive AI inference, ultimately delivering an enhanced AI-driven customer experience.

F5 BIG-IP Next for Kubernetes is purpose-built for Kubernetes environments and has been proven in large-scale telco cloud and 5G infrastructures. That technology is now tailored for leading AI use cases such as inference, retrieval-augmented generation (RAG), and seamless data management and storage. The integration with NVIDIA BlueField-3 DPUs minimizes hardware footprint, enables granular multi-tenancy, and optimizes energy consumption while delivering high-performance networking, security, and traffic management.


The combination of F5 and NVIDIA technologies allows both mobile and fixed-line telco service providers to ease the transition to cloud-native (Kubernetes) infrastructure, addressing the growing demand for vendors to adapt their network functions to a cloud-native network function (CNF) model. F5 BIG-IP Next for Kubernetes offloads data-heavy tasks to the BlueField-3 DPUs, freeing up CPU resources for revenue-generating applications. The solution is particularly beneficial at the network edge, for virtualized RAN (vRAN) or distributed access architecture (DAA) deployments by multiple system operators (MSOs), and in the 5G core network, with a path toward future 6G deployments.

Designed specifically for high-demand service providers and large-scale infrastructures, F5 BIG-IP Next for Kubernetes:

  • Streamlines delivery of AI services at cloud scale: BIG-IP Next for Kubernetes seamlessly integrates with customers’ front-end networks, significantly reducing latency while delivering high-performance load balancing to handle the immense data demands of multi-billion-parameter AI models and trillions of operations.
  • Enhances control of AI deployments: The solution offers a centralized integration point into modern AI networks with rich observability and fine-grained information. BIG-IP Next for Kubernetes supports multiple L7 protocols beyond HTTP, ensuring enhanced ingress and egress control at very high performance.
  • Protects the new AI landscape: Customers can fully automate the discovery and security of AI training and inference endpoints. BIG-IP Next for Kubernetes also isolates AI applications from targeted threats, bolstering data integrity and sovereignty while addressing the encryption capabilities critical for modern AI environments.

SOURCE: F5