I'm trying to call my AI Foundry Agent service via the API using Postman. I keep getting 401 errors.

DMcCrea 40 Reputation points
2025-06-28T04:06:19.4266667+00:00

Using the Azure AI Foundry endpoint along with my API key, and after looking at the documentation examples, pasting those in, and trying various permutations (with help from ChatGPT), I've had no luck. My main example: POST {azure project endpoint}/openai/deployments/gpt-4o-mini/chat/completions?api-version=2024-05-01

With api-key: [my key]

Content-Type: application/json

I checked the keys over and over, regenerated them, and re-checked. I tried various other endpoints on the platform, and tried using a Bearer token instead. I just keep getting

{
    "error": {
        "code": "401",
        "message": "Access denied due to invalid subscription key or wrong API endpoint. Make sure to provide a valid key for an active subscription and use a correct regional API endpoint for your resource."
    }
}
Azure AI Language
An Azure service that provides natural language capabilities including sentiment analysis, entity extraction, and automated question answering.

Accepted answer
  1. Jerald Felix 4,450 Reputation points
    2025-06-28T06:14:32.15+00:00

    Hi @DMcCrea,

    That 401 is almost always a routing or authentication mismatch. In your call you’re mixing the Azure OpenAI-style route (/openai/deployments/...) with an Azure AI Foundry Models resource and key. Foundry resources use a different base URL:

    https://<your-resource>.services.ai.azure.com/models
    

    and the chat-completions route is simply:

    POST https://<your-resource>.services.ai.azure.com/models/chat/completions?api-version=2024-05-01
    

    What to change in Postman

    URL: https://<resource>.services.ai.azure.com/models/chat/completions?api-version=2024-05-01
        Why: this is Foundry’s Model Inference endpoint (see "Endpoints for Azure AI Foundry Models" at https://learn.microsoft.com/en-us/azure/ai-foundry/model-inference/concepts/endpoints).

    Headers: api-key: <KEY1 or KEY2>
             Content-Type: application/json
        Why: Foundry accepts the same api-key header as Azure OpenAI.

    Body (JSON): { "model": "gpt-4o-mini", "messages": [{ "role": "user", "content": "Hello!" }] }
        Why: model must match the deployment name you see in the portal.

    A minimal curl that should return 200:

    curl -X POST \
      "https://myfoundry.services.ai.azure.com/models/chat/completions?api-version=2024-05-01" \
      -H "api-key: <YOUR_KEY>" \
      -H "Content-Type: application/json" \
      -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"Hello"}]}'
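
    If you want to test the same key-based call outside Postman or curl, here is a minimal Python sketch using only the standard library. The resource name, key, and deployment name are placeholders (assumptions) — substitute your own values:

```python
import json
import urllib.request

# Placeholder values -- replace with your own resource, key, and deployment name.
RESOURCE = "myfoundry"
API_KEY = "<YOUR_KEY>"
API_VERSION = "2024-05-01"

def build_chat_request(resource, api_key, deployment, messages,
                       api_version=API_VERSION):
    """Build a POST request for the Foundry Models chat-completions route."""
    url = (f"https://{resource}.services.ai.azure.com"
           f"/models/chat/completions?api-version={api_version}")
    body = json.dumps({"model": deployment, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request(RESOURCE, API_KEY, "gpt-4o-mini",
                         [{"role": "user", "content": "Hello"}])
print(req.full_url)
# resp = urllib.request.urlopen(req)  # uncomment once you fill in a real key
```

    The actual send is left commented out so you can first confirm the URL and headers match the table above before spending a request.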
    

    Calling an Agent instead of a raw model?

    If you actually want to hit an Agent Service project (the threads / assistants flow), you switch to AAD tokens, not keys, and the endpoint shape changes again:

    https://<resource>.services.ai.azure.com/api/projects/<project-name>/...
    Authorization: Bearer <token from `az account get-access-token --resource https://ai.azure.com`>
    

    Full token/endpoint example is in the agent quick-start docs.

    Troubleshooting checklist

    Double-check you copied the full key with no trailing spaces and that the key hasn’t been regenerated.

    Make sure the resource name in the host matches the one that generated the key (region must match too).

    Verify the deployment name (model property in the body) exactly matches what you see in Foundry ▶ Deployments.

    Use the GA API version 2024-05-01 unless you specifically need a preview feature.

    With the correct endpoint + header your request should authenticate and return 200. Let me know if you are still facing a problem.

    Best Regards,

    Jerald Felix

    1 person found this answer helpful.

0 additional answers
