AIProjectClient Class

Client for interacting with an Azure AI Foundry Project and the resources associated with it (agents, connections, evaluations, datasets, indexes, deployments, and more).

Constructor

AIProjectClient(endpoint: str, credential: TokenCredential, **kwargs: Any)

Parameters

Name Description
endpoint
Required
str

Project endpoint. Use the form "https://your-ai-services-account-name.services.ai.azure.com/api/projects/_project" to target the default Project in your Foundry Hub (or when the Hub has only one Project), or the form "https://your-ai-services-account-name.services.ai.azure.com/api/projects/your-project-name" to explicitly specify a Foundry Project by name. Required.

credential
Required
TokenCredential

Credential used to authenticate requests to the service. Required.

Keyword-Only Parameters

Name Description
api_version
str

The API version to use for this operation. Default value is "2025-05-15-preview". Note that overriding this default value may result in unsupported behavior.

Variables

Name Description
agents

The AgentsClient associated with this AIProjectClient.

connections

ConnectionsOperations operations

telemetry

TelemetryOperations operations

evaluations

EvaluationsOperations operations

datasets

DatasetsOperations operations

indexes

IndexesOperations operations

deployments

DeploymentsOperations operations

red_teams

RedTeamsOperations operations

Methods

Name Description
close
Close the client and release its underlying HTTP session.
get_openai_client
Get an authenticated AzureOpenAI client (from the openai package) to use with AI models deployed to your AI Foundry Project or connected Azure OpenAI services. The package openai must be installed prior to calling this method.
send_request
Runs the network request through the client's chained policies. For more information on this code flow, see https://aka.ms/azsdk/dpcodegen/python/send_request

close

close() -> None

get_openai_client

Get an authenticated AzureOpenAI client (from the openai package) to use with AI models deployed to your AI Foundry Project or connected Azure OpenAI services.

Note

The package openai must be installed prior to calling this method.

get_openai_client(*, api_version: str | None = None, connection_name: str | None = None, **kwargs) -> OpenAI

Keyword-Only Parameters

Name Description
api_version

The Azure OpenAI api-version to use when creating the client. Optional. See "Data plane - Inference" row in the table at https://learn.microsoft.com/azure/ai-foundry/openai/reference#api-specs. If this keyword is not specified, you must set the environment variable OPENAI_API_VERSION instead.

Default value: None
connection_name

Optional. If specified, the connection named here must be of type Azure OpenAI. The returned OpenAI client will use the inference URL specified by the connected Azure OpenAI service, and can be used with AI models deployed to that service. If not specified, the returned OpenAI client will use the inference URL of the parent AI Services resource, and can be used with AI models deployed directly to your AI Foundry project.

Default value: None

Returns

Type Description
openai.AzureOpenAI

An authenticated AzureOpenAI client

Exceptions

Type Description
azure.core.exceptions.ResourceNotFoundError

if an Azure OpenAI connection does not exist.

ModuleNotFoundError

if the openai package is not installed.

ValueError

if the connection name is an empty string.

send_request

Runs the network request through the client's chained policies.


>>> from azure.core.rest import HttpRequest
>>> request = HttpRequest("GET", "https://www.example.org/")
<HttpRequest [GET], url: 'https://www.example.org/'>
>>> response = client.send_request(request)
<HttpResponse: 200 OK>

For more information on this code flow, see https://aka.ms/azsdk/dpcodegen/python/send_request

send_request(request: HttpRequest, *, stream: bool = False, **kwargs: Any) -> HttpResponse

Parameters

Name Description
request
Required
HttpRequest

The network request you want to make. Required.

Keyword-Only Parameters

Name Description
stream

Whether the response payload will be streamed. Defaults to False.

Default value: False

Returns

Type Description
HttpResponse

The response of your network call. Does not do error handling on your response.

Attributes

agents

Get the AgentsClient associated with this AIProjectClient. The package azure.ai.agents must be installed to use this property.

Returns

Type Description

The AgentsClient associated with this AIProjectClient.