Network isolation in prompt flow

You can secure prompt flow using private networks. This article explains the requirements to use prompt flow in an environment secured by private networks.

Involved services

When you develop AI applications using prompt flow, you need a secured environment. You can configure network isolation for the following services:

Core Azure Machine Learning services

  • Workspace: Configure the Azure Machine Learning workspace as private and restrict its inbound and outbound traffic.
  • Compute resource: Apply inbound and outbound rules to limit compute resource access within the workspace.
  • Storage account: Restrict storage account accessibility to a specific virtual network.
  • Container registry: Secure your container registry using virtual network configuration.
  • Endpoint: Control which Azure services or IP addresses can access your deployed endpoints.

Azure AI Services

  • Azure OpenAI: Use network configuration to make Azure OpenAI private, then use private endpoints for Azure Machine Learning communication.
  • Azure Content Safety: Configure private network access and establish private endpoints for secure communication.
  • Azure AI Search: Enable private network settings and use private endpoints for secure integration.

External resources

  • Non-Azure resources: For external APIs like SerpAPI, add FQDN rules to your outbound traffic restrictions to maintain access.

Options in different network setups

Azure Machine Learning provides two options for network isolation: bring your own virtual network or use a workspace-managed virtual network. To learn more, see Secure workspace resources.

Here's a table to illustrate the options in different network setups for prompt flow.

| Ingress | Egress | Compute type in authoring | Compute type in inference | Network options for workspace |
|---------|--------|---------------------------|---------------------------|-------------------------------|
| Public | Public | Serverless (recommended), Compute instance | Managed online endpoint (recommended) | Managed (recommended) |
| Public | Public | Serverless (recommended), Compute instance | K8s online endpoint | Bring your own |
| Private | Public | Serverless (recommended), Compute instance | Managed online endpoint (recommended) | Managed (recommended) |
| Private | Public | Serverless (recommended), Compute instance | K8s online endpoint | Bring your own |
| Public | Private | Serverless (recommended), Compute instance | Managed online endpoint | Managed |
| Private | Private | Serverless (recommended), Compute instance | Managed online endpoint | Managed |

  • In private virtual network scenarios, we recommend using a workspace-managed virtual network. It's the easiest way to secure your workspace and related resources.
  • Using a managed virtual network together with a bring-your-own virtual network in a single workspace isn't supported. Additionally, because managed online endpoints are supported only with a managed virtual network, you can't deploy prompt flow to a managed online endpoint in a workspace that uses a bring-your-own virtual network.
  • You can have one workspace for prompt flow authoring with your own virtual network, and another workspace for prompt flow deployment using a managed online endpoint with a workspace-managed virtual network.
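
If you aren't sure which option an existing workspace uses, you can inspect its network configuration from the Azure CLI. This is a sketch only; the managed_network query path is an assumption about the shape of the CLI output, and the placeholders must be replaced with your own values.

    # Show the managed network configuration of a workspace (empty if you bring your own network)
    az ml workspace show -g <resource_group_name> -n <workspace_name> --query managed_network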

Secure prompt flow with workspace-managed virtual network

A workspace-managed virtual network is the recommended way to support network isolation in prompt flow. It provides an easy way to secure your workspace. After you enable a managed virtual network at the workspace level, resources related to the workspace in the same virtual network use the workspace-level network settings. You can also configure the workspace to use private endpoints to access other Azure resources, such as Azure OpenAI, Azure Content Safety, and Azure AI Search, and configure FQDN rules to approve outbound connections to non-Azure resources used by your prompt flow, such as SerpAPI.

  1. Follow workspace-managed network isolation to enable a workspace-managed virtual network.

    Important

    The creation of the managed virtual network is deferred until a compute resource is created or provisioning is manually started. You can use the following command to manually trigger network provisioning.

    az ml workspace provision-network --subscription <sub_id> -g <resource_group_name> -n <workspace_name>
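
    If you prefer to configure the isolation mode declaratively with the Azure CLI (ml extension v2), a minimal sketch might look like the following. The file name workspace.yaml is an assumption, and allow_only_approved_outbound is just one choice; use allow_internet_outbound if you only need to restrict inbound traffic.

    # workspace.yaml -- partial workspace definition (sketch)
    managed_network:
      isolation_mode: allow_only_approved_outbound

    # Apply the setting to an existing workspace
    az ml workspace update --file workspace.yaml -g <resource_group_name> -n <workspace_name>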
    
  2. Assign the workspace managed identity (MSI) the Storage File Data Privileged Contributor role on the storage account linked with the workspace. You can use the Azure portal steps below or the Azure CLI sketch at the end of this step.

    2.1 Go to Azure portal and find the workspace.

    Diagram showing how to go from Azure Machine Learning portal to Azure portal.

    2.2 Find the storage account linked with the workspace.

    Diagram showing how to find workspace linked storage account in Azure portal.

    2.3 Navigate to the role assignment page of the storage account.

    Diagram showing how to jump to role assignment of storage account.

    2.4 Find the storage file data privileged contributor role.

    Diagram showing how to find storage file data privileged contributor role.

    2.5 Assign the storage file data privileged contributor role to the workspace managed identity.

    Diagram showing how to assign storage file data privileged contributor role to workspace managed identity.

    Note

    This operation might take several minutes to take effect.
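
    If you prefer the Azure CLI, a minimal sketch of the same role assignment follows. The identity.principal_id query path is an assumption about the CLI output shape; the other placeholders are your own resource names.

    # Look up the workspace system-assigned managed identity (assumed query path)
    PRINCIPAL_ID=$(az ml workspace show -g <resource_group_name> -n <workspace_name> --query identity.principal_id -o tsv)

    # Resource ID of the storage account linked with the workspace
    STORAGE_ID=$(az storage account show -g <resource_group_name> -n <storage_account_name> --query id -o tsv)

    # Assign the Storage File Data Privileged Contributor role to the workspace identity
    az role assignment create --role "Storage File Data Privileged Contributor" --assignee-object-id $PRINCIPAL_ID --assignee-principal-type ServicePrincipal --scope $STORAGE_ID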

  3. If you want to communicate with private Azure AI services, add user-defined outbound rules for those resources to the workspace's managed network. The Azure Machine Learning workspace creates a private endpoint in the target resource with auto-approval. If the status is stuck in pending, go to the target resource and approve the private endpoint manually. A YAML sketch of such a rule follows the screenshots below.

    Screenshot of user defined outbound rule for Azure AI Services.

    Screenshot of user approve private endpoint.
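
    For example, a user-defined private endpoint outbound rule for a private Azure OpenAI resource can be declared in the workspace YAML. This is a sketch under the assumption that you manage the workspace through a workspace.yaml file; the rule name and resource ID are placeholders.

    # workspace.yaml -- user-defined outbound rule for a private Azure OpenAI resource (sketch)
    managed_network:
      isolation_mode: allow_only_approved_outbound
      outbound_rules:
        - name: allow-azure-openai              # placeholder rule name
          type: private_endpoint
          destination:
            service_resource_id: /subscriptions/<sub_id>/resourceGroups/<resource_group_name>/providers/Microsoft.CognitiveServices/accounts/<aoai_account_name>
            subresource_target: account
            spark_enabled: false

    # Apply the rule
    az ml workspace update --file workspace.yaml -g <resource_group_name> -n <workspace_name>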

  4. If you're restricting outbound traffic to allow only specific destinations, you must add a corresponding user-defined outbound rule to allow the relevant FQDN, as in the sketch that follows the screenshot below.

    Screenshot of user defined outbound rule for non Azure resource.
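
    A minimal sketch of such a rule, again assuming the workspace is managed through a workspace.yaml file, might look like this. The rule name is a placeholder, and serpapi.com is an example destination; FQDN rules apply when the isolation mode is allow_only_approved_outbound.

    # workspace.yaml -- FQDN outbound rule for a non-Azure API (sketch)
    managed_network:
      isolation_mode: allow_only_approved_outbound
      outbound_rules:
        - name: allow-serpapi                   # placeholder rule name
          type: fqdn
          destination: serpapi.com

    # Apply the rule
    az ml workspace update --file workspace.yaml -g <resource_group_name> -n <workspace_name>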

  5. In workspaces with a managed virtual network enabled, you can deploy prompt flow only to managed online endpoints. Follow Secure your managed online endpoints with network isolation to secure your managed online endpoint.
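
    For reference, restricting inbound access to the endpoint is typically a single flag in the endpoint definition; outbound traffic from the deployment then follows the workspace's managed network settings. This is a sketch only, and the endpoint name is a placeholder.

    # endpoint.yaml -- managed online endpoint with public inbound access disabled (sketch)
    $schema: https://azuremlschemas.azureedge.net/latest/managedOnlineEndpoint.schema.json
    name: <endpoint_name>
    auth_mode: key
    public_network_access: disabled   # clients reach the endpoint through a workspace private endpoint

    # Create the endpoint
    az ml online-endpoint create --file endpoint.yaml -g <resource_group_name> -w <workspace_name>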

Secure prompt flow using your own virtual network

Known limitations

  • Managed online endpoints with selected egress require a workspace with a managed virtual network. If you're using your own virtual network, consider this two-workspace approach:
    - Use one workspace with your own virtual network for prompt flow authoring.
    - Use a separate workspace with a managed virtual network for prompt flow deployment via a managed online endpoint.

Next steps