BYO OpenAI-compatible model#
You can bring your own model from an OpenAI API-compatible LLM provider. The example integrates with Cohere AI.
1. Save the API key from your OpenAI-compatible provider as an environment variable. For example, for Cohere AI, get your key from the Cohere AI dashboard.

   ```shell
   export PROVIDER_API_KEY=Dgs...
   ```
2. Create a Kubernetes secret that stores your API key. Make sure to create the secret in the same namespace that you plan to create your agent in, such as `kagent`.

   ```shell
   kubectl create secret generic kagent-my-provider -n kagent --from-literal PROVIDER_API_KEY=$PROVIDER_API_KEY
   ```
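Kubernetes stores Secret values base64-encoded; `kubectl create secret --from-literal` handles that encoding for you. As a quick local sketch of the round trip (using a placeholder value, not a real credential):

```shell
# Base64-encode a placeholder API key the way a Secret stores it (sketch only).
encoded=$(printf '%s' "my-placeholder-key" | base64)
echo "$encoded"

# Decoding recovers the original value, as the cluster does when mounting the secret.
decoded=$(printf '%s' "$encoded" | base64 -d)
echo "$decoded"
```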
3. Create a ModelConfig resource.

   ```shell
   kubectl apply -f - <<EOF
   apiVersion: kagent.dev/v1alpha2
   kind: ModelConfig
   metadata:
     name: my-provider-config
     namespace: kagent
   spec:
     apiKeySecret: kagent-my-provider
     apiKeySecretKey: PROVIDER_API_KEY
     model: command-a-03-2025
     provider: OpenAI
     openAI:
       baseUrl: "https://api.cohere.ai/compatibility/v1"
   EOF
   ```

   Review the following table to understand this configuration. For more information, see the API docs.
| Setting | Description |
|---|---|
| `apiKeySecret` | The name of the Kubernetes secret that stores your API key. |
| `apiKeySecretKey` | The key in the secret that stores your API key. |
| `model` | The OpenAI API-compatible model to use. For more information about the model, consult your LLM provider's documentation. For example, you might use `command-a-03-2025` for Cohere AI. |
| `provider` | To use an OpenAI API-compatible model, set the provider to `OpenAI`. |
| `openAI` | Additional provider details. For available settings, consult your LLM provider's documentation. At minimum, you must configure the `baseUrl` setting to point to the endpoint of your LLM provider. |
| `baseUrl` | The base URL of your LLM provider. Note that the LLM provider might have a special base URL for OpenAI compatibility, such as `"https://api.cohere.ai/compatibility/v1"` for Cohere AI. |
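The same ModelConfig pattern works for any endpoint that speaks the OpenAI API. As a sketch, a config for a hypothetical self-hosted vLLM server might look like the following (the name, model, and in-cluster URL are illustrative assumptions, not values from this guide):

```yaml
apiVersion: kagent.dev/v1alpha2
kind: ModelConfig
metadata:
  name: my-vllm-config            # hypothetical name
  namespace: kagent
spec:
  apiKeySecret: kagent-my-provider
  apiKeySecretKey: PROVIDER_API_KEY
  model: meta-llama/Llama-3.1-8B-Instruct            # whatever model your server serves
  provider: OpenAI
  openAI:
    baseUrl: "http://vllm.my-namespace.svc.cluster.local:8000/v1"  # illustrative in-cluster URL
```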
Good job! You added a model to kagent. Next, you can create or update an agent to use this model.
TLS Configuration#
To secure communication to LLMs with your own custom certificates, configure the TLS CA details in the ModelConfig. Your agents then use those custom certificates when communicating with the LLM. This feature is useful for internal or company-managed LLM servers.
Note: TLS configuration only supports OpenAI-compatible providers.
Use Case: Custom CA Certificates#
Configure the CA certificate of your LLM server in the ModelConfig resource.
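If you do not have your server's CA certificate at hand and only want to try the flow, you can generate a throwaway self-signed CA locally. This is a demo-only sketch; it assumes `openssl` is installed, and `demo-ca` is an arbitrary name:

```shell
# Generate a throwaway CA key and self-signed certificate (demo only).
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout ca.key -out ca.crt \
  -days 1 -subj "/CN=demo-ca"

# Inspect the subject to confirm ca.crt is a valid certificate.
openssl x509 -in ca.crt -noout -subject
```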
1. Create a Secret with the relevant CA certificate in the same namespace as the ModelConfig.

   ```shell
   kubectl -n kagent create secret generic llm-certs \
     --from-file=ca.crt=ca.crt
   ```

2. Create a ModelConfig resource with TLS configuration that references the CA certificate Secret.

   ```yaml
   apiVersion: kagent.dev/v1alpha2
   kind: ModelConfig
   metadata:
     name: internal-llm-model-config
     namespace: kagent
   spec:
     apiKeySecret: kagent-my-provider
     apiKeySecretKey: PROVIDER_API_KEY
     provider: OpenAI
     model: ${MODEL_NAME}
     openAI:
       baseUrl: ${COMPATIBLE_PROVIDER_URL}
     tls:
       caCertSecretRef: llm-certs
       caCertSecretKey: ca.crt
   ```
Use Case: Insecure Communication#
Warning: Insecure communication is for demo purposes only. Do not use insecure communication in production environments.
For development or testing scenarios where you need to disable TLS verification, configure the ModelConfig to skip certificate verification.
```yaml
apiVersion: kagent.dev/v1alpha2
kind: ModelConfig
metadata:
  name: internal-llm-model-config
  namespace: kagent
spec:
  apiKeySecret: kagent-my-provider
  apiKeySecretKey: PROVIDER_API_KEY
  provider: OpenAI
  model: ${MODEL_NAME}
  openAI:
    baseUrl: ${COMPATIBLE_PROVIDER_URL}
  tls:
    disableVerify: true
```
TLS Configuration Settings#
Review the following table to understand the TLS configuration options. For more information, see the API docs.
| Setting | Description |
|---|---|
| `tls.caCertSecretRef` | The name of the Kubernetes secret that contains the CA certificate. The secret must be in the same namespace as the ModelConfig. |
| `tls.caCertSecretKey` | The key in the secret that stores the CA certificate file. |
| `tls.disableVerify` | When set to `true`, disables TLS certificate verification. Warning: Only use this for demo purposes, not in production. |