Get started with the local MCP server for Real-Time Intelligence (preview)

The local RTI MCP server lets AI agents and AI applications interact with Real-Time Intelligence (RTI) or Azure Data Explorer (ADX) by exposing tools through the MCP interface, making it easy to query and analyze your data.

MCP support for RTI and ADX is a fully open-source MCP server implementation for Microsoft Fabric Real-Time Intelligence. You install, host, and manage the deployment yourself.

Scenarios

The most common scenario for the local RTI MCP server is connecting to it from an existing AI client, such as Cline, Claude, or GitHub Copilot. The client can then use all the available tools to access and interact with RTI or ADX resources in natural language. For example, you could use GitHub Copilot agent mode with the RTI MCP server to list KQL databases or ADX clusters, or to run natural language queries on RTI Eventhouses.

Architecture

The local RTI MCP server is the core of the system and acts as a bridge between AI agents and your data in Fabric. Agents send requests to the MCP server, which translates them into Eventhouse queries. The server runs locally and provides read-only access to Eventhouse databases.

Diagram that shows the local MCP server architecture.

The architecture follows the MCP client-server model:

  • MCP Host: The application where AI interactions happen, such as Visual Studio Code with GitHub Copilot, Claude Desktop, or Cline. The host contains the AI model connection, a tool orchestrator, and one or more MCP clients.
  • MCP Server: A lightweight service that exposes specific capabilities as structured tools. The RTI MCP server exposes tools like "execute query," "list databases," and "list tables" that translate into Eventhouse operations.

Any application that supports MCP can connect to the local RTI MCP server using the same protocol. This can be an interactive product like GitHub Copilot or a programmatic AI agent framework.
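
For example, a programmatic MCP client can launch the server, open a session, and discover its tools. The following is a minimal sketch using the official MCP Python SDK; the server launch command shown is a placeholder, so substitute the command documented in the RTI MCP server repository.

```python
# Minimal MCP client sketch using the official MCP Python SDK (pip install mcp).
# The server launch command below is a placeholder, not the repository's
# documented command.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the local RTI MCP server as a child process and talk to it
    # over stdio, the usual transport for local MCP servers.
    server = StdioServerParameters(
        command="fabric-rti-mcp",  # placeholder: use the repo's launch command
        args=[],
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Enumerate the structured tools the server exposes, such as
            # "execute query", "list databases", and "list tables".
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

asyncio.run(main())
```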

Key features

Real-Time Data Access: Retrieve data from KQL databases in seconds.

Natural Language Interfaces: Ask questions in plain English or other languages, and the system turns them into optimized queries (NL2KQL).

Schema Discovery: Dynamically discover schema and metadata, so you can learn data structures at run time.

Plug-and-Play Integration: Connect MCP clients like GitHub Copilot, Claude, and Cline to RTI with minimal setup because of standardized APIs and discovery mechanisms.

Local Language Inference: Work with your data in your preferred language; queries don't have to be in English.
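
As a sketch of how schema discovery looks through the MCP tool interface, an agent can call a metadata tool and feed the result into query generation. This continues the client session from the earlier example; the tool and argument names here are assumptions, so check the server's list_tools() output for the names it actually registers.

```python
async def discover_schema(session: ClientSession, database: str) -> None:
    # Ask the server for the tables in a KQL database. "list_tables" and the
    # "database" argument are hypothetical names for illustration only.
    result = await session.call_tool("list_tables", arguments={"database": database})
    # Each returned content item describes a table and its columns; an agent
    # can include this in its prompt so NL2KQL generates valid queries.
    for item in result.content:
        print(item)
```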

Supported RTI components

Eventhouse - Run KQL queries against the KQL databases in your Eventhouse backend. This unified interface lets AI agents query, reason, and act on real-time data.

Eventstreams - Query and manage Eventstreams to analyze streaming data and get real-time insights. You can list the eventstreams in your workspace, get details and definitions, create new eventstreams, and more.

Activator - Interact with Fabric Activator to list Activator artifacts in your workspace, create trigger actions, and set up notifications.

Map - Query and manage Map resources to visualize data and create geospatial insights. You can list maps in your workspace, visualize data on maps, get details and definitions, create new maps, and more.
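
To illustrate the Eventhouse path, the sketch below runs a KQL query through the server's query tool, continuing the session from the earlier examples. The tool name, its arguments, and the table and column names are hypothetical placeholders.

```python
async def count_recent_rows(session: ClientSession, database: str) -> None:
    # Run a plain KQL query against an Eventhouse KQL database.
    # "execute_query" and its arguments are hypothetical names; "MyTable"
    # and "Timestamp" are placeholders for your own table and column.
    result = await session.call_tool(
        "execute_query",
        arguments={
            "database": database,
            "query": "MyTable | where Timestamp > ago(1h) | count",
        },
    )
    for item in result.content:
        print(item)
```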

Note

You can also use the Fabric RTI MCP Server to run KQL queries against the clusters in your Azure Data Explorer backend.

Install

To install the local RTI MCP server, follow the instructions in the open-source RTI MCP server repository. The repository includes documentation for installing, configuring, and using the MCP server with RTI.