Microsoft Foundry provides a fully managed platform for hosting, scaling, and securing AI agents built with any supported framework or model.
Add the required api-version query parameter to the agent endpoint URL when sending the POST request.
For the Responses protocol, the correct pattern is:
https://{account}.services.ai.azure.com/api/projects/{project}/agents/{agent}/protocols/openai/v1/responses?api-version=v1
Example curl call:
curl -X POST "https://resource.services.ai.azure.com/api/projects/auto/agents/hr-agent/protocols/openai/v1/responses?api-version=v1" \
-H "Authorization: Bearer {access_token}" \
-H "Content-Type: application/json" \
-d '{
"input": "Hello! What can you do?",
"store": true
}'
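As a sketch, composing the URL from variables makes it harder to forget the required query parameter; the ACCOUNT, PROJECT, and AGENT values below are placeholders, not real resources:

```shell
# Build the Responses endpoint URL so api-version is always appended.
# ACCOUNT, PROJECT, and AGENT are placeholder values for illustration.
ACCOUNT="resource"
PROJECT="auto"
AGENT="hr-agent"
API_VERSION="v1"
URL="https://${ACCOUNT}.services.ai.azure.com/api/projects/${PROJECT}/agents/${AGENT}/protocols/openai/v1/responses?api-version=${API_VERSION}"
echo "$URL"
```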
Ensure:
- api-version is present in the query string (for example, api-version=v1).
- The path includes /agents/{agent}/protocols/openai/v1/responses (not /endpoint/protocols/... unless explicitly required by the preview flavor being used).
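The checks above can be sketched as a quick pre-flight test before sending the request; the URL value here is a placeholder:

```shell
# Fail fast if the endpoint URL is missing the api-version query parameter.
URL="https://resource.services.ai.azure.com/api/projects/auto/agents/hr-agent/protocols/openai/v1/responses?api-version=v1"
case "$URL" in
  *"api-version="*) echo "ok" ;;
  *) echo "missing api-version" >&2; exit 1 ;;
esac
```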
If using hosted agents in the refreshed public preview, also include the preview header and the endpoint segment as shown in the hosted-agent example:
curl -X POST "$BASE_URL/agents/my-agent/endpoint/protocols/openai/responses?api-version=v1" \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-H "Foundry-Features: HostedAgents=V1Preview" \
-d '{"input": "Hello!", "model": "gpt-4.1", "stream": false}'
Using the correct URL pattern plus ?api-version=v1 resolves the "Missing required query parameter: api-version" error.