Adding guardrails to kagent with kgateway AI gateway
AI agents are quickly becoming integral to enterprise operations, automating tasks ranging from customer support to infrastructure management and SRE. As organizations deploy these agents at scale, ensuring secure, observable, and compliant interactions with Large Language Models (LLMs) becomes paramount. This is where the combination of two open-source projects, kagent and kgateway, offers a robust solution.
kagent serves as a Kubernetes-native framework that enables DevOps and platform engineers to build, deploy, and manage AI agents as first-class Kubernetes resources. These agents can perform complex tasks by leveraging a suite of tools (via MCP, among other integrations) and other agents (over A2A) within the cloud-native ecosystem. By using Kubernetes Custom Resource Definitions (CRDs), kagent ensures that AI agents are scalable, observable, and seamlessly integrated into existing Kubernetes workflows.
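As a rough illustration, an agent declared through kagent's CRDs might look like the following. This is a sketch based on kagent's v1alpha1 API; the field names, tool server, and tool names are illustrative assumptions rather than an exact schema, so consult the kagent CRD reference before using it.

```yaml
# Illustrative sketch of a kagent Agent resource.
# Field names are approximate; check the kagent CRD reference for the exact schema.
apiVersion: kagent.dev/v1alpha1
kind: Agent
metadata:
  name: k8s-sre-agent
  namespace: kagent
spec:
  description: "Answers operational questions about the cluster"
  systemMessage: |
    You are an SRE assistant. Use the available tools to inspect
    cluster state before answering.
  modelConfig: default-model-config   # references a ModelConfig resource (see below)
  tools:
    - type: McpServer                 # tools exposed to the agent over MCP
      mcpServer:
        toolServer: kagent-tool-server  # assumed MCP tool server name
        toolNames:
          - k8s_get_resources           # assumed tool name for illustration
```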

However, as these agents interact with various LLMs, whether hosted on-premises or by external providers, the need for controlled and secure outbound communication becomes critical. kgateway addresses this by acting as a smart AI egress gateway.

Positioned between the AI agents and the LLMs they access, kgateway provides:
- Security Enforcement: Implements organization-wide policies to ensure that outbound requests comply with security and compliance standards.
- Observability: Offers detailed metrics and tracing for all AI-related traffic, facilitating monitoring and debugging.
- Traffic Management: Enables A/B testing, traffic splitting, and canary deployments across different LLMs (see the route sketch after this list).
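Traffic splitting, for instance, can be expressed with standard Gateway API weights on the route that fronts the LLM providers. The weighting mechanism below is plain HTTPRoute semantics; the Gateway name and the kgateway Backend names/kinds are assumptions made for this sketch.

```yaml
# Hypothetical canary-style split of LLM traffic across two providers.
# Backend names and kinds are illustrative; the weights are standard
# Gateway API HTTPRoute behavior.
apiVersion: gateway.networking.k8s.io/v1
kind: HTTPRoute
metadata:
  name: llm-route
  namespace: ai-gateway
spec:
  parentRefs:
    - name: ai-gateway              # the kgateway Gateway acting as the AI egress point
  rules:
    - backendRefs:
        - name: openai-backend      # assumed kgateway Backend for a hosted provider
          group: gateway.kgateway.dev
          kind: Backend
          weight: 90
        - name: local-llm-backend   # assumed Backend for a self-hosted model
          group: gateway.kgateway.dev
          kind: Backend
          weight: 10
```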
By integrating kgateway as an egress gateway, organizations gain fine-grained control over the interactions between AI agents and LLMs, ensuring that all communications are secure, compliant, and observable.
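The integration point is simple in principle: instead of pointing each agent's model configuration directly at a provider endpoint, you point it at the gateway, which applies policy, collects telemetry, and then forwards the request. A minimal sketch, assuming a kagent ModelConfig-style resource and an in-cluster gateway address; the field names and the service URL are illustrative, not the exact API.

```yaml
# Illustrative ModelConfig routing all LLM calls through the kgateway
# AI gateway instead of calling the provider directly.
# Field names and the gateway address are assumptions for this sketch.
apiVersion: kagent.dev/v1alpha1
kind: ModelConfig
metadata:
  name: default-model-config
  namespace: kagent
spec:
  provider: OpenAI
  model: gpt-4o
  apiKeySecretRef: openai-secret        # assumed secret holding the provider key
  openAI:
    # Assumed in-cluster address of the kgateway AI gateway; requests egress
    # through the gateway, where security and observability policies apply.
    baseUrl: http://ai-gateway.ai-gateway.svc.cluster.local:8080/v1
```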
In essence, while kagent orchestrates the lifecycle and operation of AI agents within Kubernetes, kgateway governs their interactions with external LLMs. Together, they provide a comprehensive, Kubernetes-native infrastructure for deploying, managing, and securing AI-driven applications at scale.