
Bring your own cross-resource capacity in Content Understanding

Use this guide to connect an external Azure OpenAI or Foundry resource to your Content Understanding resource and route model usage through that connection. This setup lets you reuse existing model capacity across resources.

Cross-resource flow overview

Use this high-level diagram to understand how Content Understanding uses a connected resource for model inference.

+---------------------------------------------------------------+
| Azure subscription                                            |
|                                                               |
|  +---------------------------+                                |
|  | Content Understanding     |                                |
|  | resource                  |                                |
|  |                           |                                |
|  | defaults:                 |                                |
|  | gpt-4.1 -> connA/gpt41    |                                |
|  +-------------+-------------+                                |
|                |                                              |
|    analyze API | uses default deployment mapping              |
|                v                                              |
|  +---------------------------+                                |
|  | Connected resource        |                                |
|  | (Azure OpenAI or Foundry) |                                |
|  |                           |                                |
|  | deployments:              |                                |
|  | - gpt-4.1                 |                                |
|  | - text-embedding-3-large  |                                |
|  +---------------------------+                                |
|                                                               |
|  Authentication path: API key or Microsoft Entra ID           |
+---------------------------------------------------------------+

Prerequisites

To get started, make sure you have the following resources and permissions:

  • A Content Understanding resource in your Azure subscription.
  • An Azure OpenAI or Foundry resource with the model deployments you want to use, such as gpt-4.1 or text-embedding-3-large.
  • Permission to create connections in the Foundry management center and, for Microsoft Entra ID authentication, permission to assign roles on the connected resource.

Connect an Azure OpenAI or Foundry resource

Connect your model resource from the management center of your Content Understanding resource.

  1. Open your Content Understanding resource in the Azure portal.

  2. Select Go to Azure AI Foundry portal.

  3. Open Management center.

  4. Select Connected resources.

  5. Select New connection.

  6. Select Azure OpenAI or Microsoft Foundry.

  7. Search for and select your resource.

  8. Select an authentication type, and then select Add connection.

    Authentication details:

    • API key: Content Understanding uses the API key from the connected resource.
      • The connected resource must allow API key authentication.
      • If API key authentication is disabled on the connected resource, requests fail.
    • Microsoft Entra ID: Content Understanding uses the managed identity of the Content Understanding resource.
      • Enable managed identity on the Content Understanding resource.
      • Grant the managed identity access to the connected resource, such as Cognitive Services User.


    After the operation completes, the connection appears in Connected resources.
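The two authentication options in step 8 map to two familiar request-header patterns for Azure AI endpoints. Content Understanding builds these headers for you when it calls the connected resource; the sketch below is illustrative only, and the header names are conventional assumptions rather than part of this guide's API surface:

```python
def auth_headers(auth_type: str, secret: str) -> dict:
    """Build request headers for the two supported authentication paths.

    Header names here are conventional for Azure AI services and are
    assumptions; check your resource's reference documentation.
    """
    if auth_type == "api_key":
        # API key taken from the connected resource.
        return {"Ocp-Apim-Subscription-Key": secret}
    if auth_type == "entra_id":
        # Access token acquired for the managed identity.
        return {"Authorization": f"Bearer {secret}"}
    raise ValueError(f"unsupported auth type: {auth_type!r}")
```

If API key authentication is disabled on the connected resource, only the Entra ID path remains valid, which matches the failure mode described in step 8.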

Set default deployments for cross-resource usage

Set resource defaults so analyzers use the connected deployment, referenced in the {ConnectionName}/{DeploymentName} format.

Before you start:

  • Get the connection name from Connected resources.
  • Get the deployment name from Models + endpoints in the connected resource.
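These two names combine into the {ConnectionName}/{DeploymentName} reference. A minimal helper to build the string, assuming neither name itself contains a slash:

```python
def connected_deployment(connection_name: str, deployment_name: str) -> str:
    """Combine a connection name and a deployment name into the
    {ConnectionName}/{DeploymentName} reference the defaults API expects.

    Assumes neither name is empty or contains a slash.
    """
    for name in (connection_name, deployment_name):
        if not name or "/" in name:
            raise ValueError(f"invalid name: {name!r}")
    return f"{connection_name}/{deployment_name}"
```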

Use the defaults API to set model deployments:

PATCH {endpoint}/contentunderstanding/defaults?api-version=2025-11-01
Content-Type: application/json

{
  "modelDeployments": {
    "gpt-4.1": "{ConnectionName}/{DeploymentName}",
    "text-embedding-3-large": "{ConnectionName}/{EmbeddingDeploymentName}"
  }
}
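The request above can be sketched in Python with the standard library. The endpoint and deployment references are placeholders, and the authentication header is omitted because it depends on how your resource is configured:

```python
import json
import urllib.request

API_VERSION = "2025-11-01"  # from the request above

def build_defaults_request(endpoint: str, model_deployments: dict) -> urllib.request.Request:
    """Build the PATCH request that sets default model deployments."""
    url = f"{endpoint}/contentunderstanding/defaults?api-version={API_VERSION}"
    body = json.dumps({"modelDeployments": model_deployments}).encode("utf-8")
    req = urllib.request.Request(url, data=body, method="PATCH")
    req.add_header("Content-Type", "application/json")
    # Add an authentication header here (API key or Entra ID bearer token)
    # before sending with urllib.request.urlopen(req).
    return req

# Placeholder endpoint and names; replace with your own values.
req = build_defaults_request(
    "https://example-resource.cognitiveservices.azure.com",
    {
        "gpt-4.1": "connA/gpt41",
        "text-embedding-3-large": "connA/embedding3large",
    },
)
```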

Verify the configuration

Choose one of the following options to verify your setup.

Option 1: Verify with Content Understanding Studio

  1. Follow Quickstart: Try out Content Understanding Studio with the primary resource.
  2. In Studio, run a prebuilt analyzer on a sample file.
  3. Confirm the analysis completes and returns structured results in the results pane.

Option 2: Verify with the REST quickstart

  1. Follow Quickstart: Use Azure Content Understanding in Foundry Tools REST API.
  2. Run the sample request in Send a file for analysis.
  3. Confirm the operation succeeds by checking Get analyze result and verifying status is Succeeded.
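The polling in steps 2 and 3 can be sketched as follows. Succeeded comes from this guide; the other status values and the response shape are assumptions based on the common long-running-operation pattern:

```python
import json
import time
import urllib.request

# "Succeeded" is confirmed by this guide; the other values are assumptions.
TERMINAL_STATUSES = {"Succeeded", "Failed"}

def is_terminal(status: str) -> bool:
    """Return True when the operation has finished (successfully or not)."""
    return status in TERMINAL_STATUSES

def poll_analyze_result(result_url: str, headers: dict,
                        interval: float = 2.0, timeout: float = 120.0) -> dict:
    """Poll Get analyze result until the operation reaches a terminal status."""
    deadline = time.monotonic() + timeout
    while True:
        req = urllib.request.Request(result_url, headers=headers)
        with urllib.request.urlopen(req) as resp:
            result = json.load(resp)
        if is_terminal(result.get("status", "")):
            return result
        if time.monotonic() > deadline:
            raise TimeoutError("analysis did not reach a terminal status in time")
        time.sleep(interval)
```

A run succeeds when the returned status is Succeeded, which is the check described in step 3.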

If either verification path succeeds, your Content Understanding resource is using the connected cross-resource capacity.