This article shows you how to configure API key-based authentication for any local or cloud-based LLM endpoint that requires it. If you configured Edge RAG to use your own language model instead of a Microsoft-provided model, complete the steps in this article.
Important
Edge RAG Preview, enabled by Azure Arc, is currently in PREVIEW. See the Supplemental Terms of Use for Microsoft Azure Previews for legal terms that apply to Azure features that are in beta, preview, or otherwise not yet released into general availability.
Set up an API key for authentication
After you install the Edge RAG extension and configure it to use your own language model, get an API key for the model.
On your Azure Local node, get the "bring your own model" (BYOM) secret that was created during the extension installation:

```powershell
[System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String( (.\kubectl.exe get secret byom-api-key -n arc-rag -o jsonpath="{.data.BYOM_API_KEY}") ))
```

Output:

```output
byom-secret
```
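Kubernetes stores secret data base64-encoded, which is why the command above wraps the `jsonpath` result in a Base64 decode. The same round trip can be sketched without a cluster; this is a bash illustration with a stand-in value, not the actual secret:

```shell
# Kubernetes stores secret values base64-encoded; "byom-secret" here is a
# stand-in for the real data.BYOM_API_KEY value.
encoded=$(printf 'byom-secret' | base64)
echo "$encoded"

# Decoding recovers the original value, mirroring the PowerShell snippet above.
printf '%s' "$encoded" | base64 -d
```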
Update the secret value to the LLM endpoint API key by deleting and recreating the secret:

```powershell
kubectl delete secret byom-api-key -n arc-rag
$apiKey = "<LLM endpoint API key>"
kubectl create secret generic byom-api-key --from-literal=BYOM_API_KEY=$apiKey -n arc-rag
```
Verify that the secret is set to the desired API key:

```powershell
[System.Text.Encoding]::UTF8.GetString([System.Convert]::FromBase64String( ( kubectl get secret byom-api-key -n arc-rag -o jsonpath="{.data.BYOM_API_KEY}" ) ))
```

Output:

```output
<Endpoint API key>
```
Delete the inferencing flow pod to apply the secret change:

```powershell
kubectl.exe delete pods -n arc-rag -l app=inferencingflow
```
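After the pod is deleted, its controller recreates it, and the replacement pod picks up the updated secret. One way to confirm the rollout, assuming the same `arc-rag` namespace and `app=inferencingflow` label used above (this sketch requires a connected cluster):

```shell
# List the inferencing flow pods; the replacement pod should show STATUS=Running
# and a fresh AGE once the recreation completes.
kubectl get pods -n arc-rag -l app=inferencingflow

# Optionally block until the new pod reports Ready (times out after 2 minutes).
kubectl wait pod -n arc-rag -l app=inferencingflow --for=condition=Ready --timeout=120s
```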