Step 3: Cloud Provider Authentication Secrets

LiteLLM Proxy requires credentials to authenticate with cloud provider services. The recommended authentication method depends on where your Kubernetes cluster is hosted and which AI services you intend to use.

Cloud Authentication Methods

| Cluster Environment | AWS Bedrock | GCP Vertex AI | Azure OpenAI |
| --- | --- | --- | --- |
| AWS (EKS Cluster) | IRSA (IAM Roles for Service Accounts) or AWS User Credentials | GCP Service Account Credentials | Azure Entra ID Application Credentials |
| GCP (GKE Cluster) | AWS User Credentials | GCP Service Account Credentials | Azure Entra ID Application Credentials |
| Azure (AKS Cluster) | AWS User Credentials | GCP Service Account Credentials | Azure Entra ID Application Credentials |

AWS Bedrock Authentication

Required only if you plan to use models from AWS Bedrock.

Option 1: IRSA (IAM Roles for Service Accounts)

Recommended when running on EKS. This method securely associates an IAM role with the LiteLLM Proxy's Kubernetes service account, avoiding the need to store static AWS credentials as secrets.

info

The required IAM Role ARN is automatically generated during the Terraform deployment. You can find it as EKS_AWS_ROLE_ARN in the deployment_outputs.env file.

To enable IRSA, replace %%EKS_AWS_ROLE_ARN%% with the EKS_AWS_ROLE_ARN value from deployment_outputs.env in your litellm/values-aws.yaml file:

```yaml
litellm-helm:
  serviceAccount:
    create: true
    annotations:
      eks.amazonaws.com/role-arn: '%%EKS_AWS_ROLE_ARN%%'
```
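After deploying, you can confirm the annotation was applied to the service account. This is a sketch assuming the chart creates a service account named litellm in the litellm namespace; adjust both names to match your release (list them with `kubectl get serviceaccounts -n litellm` if unsure):

```shell
# Print the IAM role annotation on the proxy's service account.
# The service account name "litellm" is an assumption, not taken
# from this guide -- substitute the name your chart actually creates.
kubectl get serviceaccount litellm \
  --namespace litellm \
  -o jsonpath='{.metadata.annotations.eks\.amazonaws\.com/role-arn}'
```

If the output is empty, the annotation was not applied and IRSA will not take effect.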

Option 2: AWS User Credentials

This method is used if you are not running on EKS or prefer to use static credentials.

info

You must create the litellm-aws-auth secret manually before deploying the Helm chart.

Create the secret using the following command, replacing the placeholders with your actual credentials:

```shell
kubectl create secret generic litellm-aws-auth \
  --namespace litellm \
  --from-literal=AWS_ACCESS_KEY_ID="YOUR_AWS_ACCESS_KEY_ID" \
  --from-literal=AWS_SECRET_ACCESS_KEY="YOUR_AWS_SECRET_ACCESS_KEY" \
  --type=Opaque
```
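For reference, the command above produces a Secret whose data values are base64-encoded. A minimal Python sketch of the manifest kubectl generates (the credential values are placeholders, as in the command):

```python
import base64
import json

def make_secret(name: str, namespace: str, data: dict) -> dict:
    """Build a Kubernetes Secret manifest; kubectl base64-encodes
    each value under .data, which is reproduced here."""
    return {
        "apiVersion": "v1",
        "kind": "Secret",
        "type": "Opaque",
        "metadata": {"name": name, "namespace": namespace},
        "data": {
            key: base64.b64encode(value.encode()).decode()
            for key, value in data.items()
        },
    }

secret = make_secret(
    "litellm-aws-auth",
    "litellm",
    {
        "AWS_ACCESS_KEY_ID": "YOUR_AWS_ACCESS_KEY_ID",
        "AWS_SECRET_ACCESS_KEY": "YOUR_AWS_SECRET_ACCESS_KEY",
    },
)
print(json.dumps(secret, indent=2))
```

Applying this manifest with `kubectl apply -f` is equivalent to the `kubectl create secret` command.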

Then, ensure your litellm/values-aws.yaml file is configured to mount this secret:

```yaml
litellm-helm:
  # ... other components
  environmentSecrets:
    - litellm-aws-auth
```

Azure OpenAI Authentication

Required only if you plan to use models from Azure OpenAI.

Option 1: Azure Entra ID Application (Client Credentials)

Authentication is configured via an Azure Entra ID Application. The deployment process requires the following credentials:

  • AZURE_TENANT_ID
  • AZURE_CLIENT_ID
  • AZURE_CLIENT_SECRET
info

The variables AZURE_TENANT_ID, AZURE_CLIENT_ID, and AZURE_CLIENT_SECRET are available in the deployment_outputs.env file. This file is automatically generated during the Terraform deployment.

When running the installation script and selecting Azure as your cloud provider, you will be prompted to enter them.
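If you prefer to script this step, the values can be read straight from deployment_outputs.env. A minimal sketch, assuming the file uses simple KEY=value lines (the embedded sample is illustrative only, not real credentials):

```python
# Parse KEY=value lines as commonly found in a Terraform-generated
# .env file; the sample string stands in for deployment_outputs.env.
sample = """\
AZURE_TENANT_ID=00000000-0000-0000-0000-000000000000
AZURE_CLIENT_ID=11111111-1111-1111-1111-111111111111
AZURE_CLIENT_SECRET=example-secret-value
"""

def parse_env(text: str) -> dict:
    """Return a dict of KEY -> value, skipping blanks and comments."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip().strip('"')
    return values

azure = parse_env(sample)
print(azure["AZURE_TENANT_ID"], azure["AZURE_CLIENT_ID"])
```

The three AZURE_* values can then be pasted (or piped) into the installation script's prompts.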

Option 2: Direct API key authentication

Documentation for configuring direct API key authentication will be added soon.

Google Vertex AI Authentication

Required only if you plan to use models from Google Vertex AI.

If you select GCP as your cloud provider during the automated installation, you must provide credentials for Vertex AI.

info

Prerequisite: Before running the script, ensure a valid gcp-service-account.json file is present in the root of the repository. This file is necessary for authentication.

During the script's execution, you will be prompted to enter the following value:

  • VERTEX_PROJECT: Your Google Cloud project ID where Vertex AI is enabled.
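As a pre-flight check, you can verify that gcp-service-account.json is structurally valid before running the script. A minimal sketch; the embedded sample only mimics the file's shape and contains no real key material:

```python
import json

# Fields a GCP service account key file is expected to contain.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def check_service_account(raw: str) -> list:
    """Return a list of problems (empty means the file has the
    expected fields and the correct type)."""
    data = json.loads(raw)
    problems = sorted(REQUIRED_FIELDS - data.keys())
    if data.get("type") != "service_account":
        problems.append("type is not 'service_account'")
    return problems

# Stands in for open("gcp-service-account.json").read();
# the project and email below are hypothetical examples.
sample = json.dumps({
    "type": "service_account",
    "project_id": "my-vertex-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\\n...",
    "client_email": "litellm@my-vertex-project.iam.gserviceaccount.com",
})

print(check_service_account(sample))  # [] when everything is present
```

An empty list means the file is at least well-formed; it does not prove the key itself is valid or has Vertex AI permissions.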

Next Steps

Continue to LiteLLM Model Configuration.