The Model Context Protocol (MCP) is an emerging open-source standard designed to act as a universal interface, enabling AI assistants to securely connect with external data and tools—without requiring custom integrations. Think of it as a “USB for AI applications”, providing a standard way for large language models (LLMs) to discover and use capabilities in other tools.

Availability

General Availability is planned for October 2025.

MCP Architecture

MCP uses a client/server architecture:
  • MCP Host Application (e.g., Claude Desktop, Cursor, or a custom AI app):
    The AI application a user interacts with. It uses an MCP client to communicate with servers.
  • MCP Server (e.g., Glean MCP server):
    Exposes tools that the host application’s LLM can invoke.
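
To make the architecture concrete, the sketch below shows an MCP client discovering a server's tools over HTTP using the MCP TypeScript SDK (@modelcontextprotocol/sdk). The server URL is a placeholder and OAuth setup is omitted for brevity; in practice the host application embeds the MCP client and manages this connection, and the exact Glean endpoint comes from the MCP Configurator.

```typescript
// Minimal sketch: an MCP client connecting to a remote MCP server and
// discovering its tools. The URL is a placeholder and authentication (OAuth)
// is omitted for brevity.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  const client = new Client({ name: "example-host", version: "1.0.0" });

  // Connect to the remote MCP server over streamable HTTP.
  const transport = new StreamableHTTPClientTransport(
    new URL("https://[instance-name]-be.glean.com/mcp/default") // placeholder URL
  );
  await client.connect(transport);

  // Ask the server which tools it exposes; the host's LLM chooses among these.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name)); // e.g., ["search", "chat", "read_document"]
}

main().catch(console.error);
```

In day-to-day use you rarely write this code yourself; the host application wires the discovered tools into its LLM's tool-use loop.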

Key Benefits of Glean MCP Integration

Unlike native MCP servers from individual apps (e.g., Jira, Slack) that offer siloed results, the Glean MCP server queries Glean’s unified Knowledge Graph. This delivers more relevant, permission-aware results across all connected sources.

Accessing Enterprise Context

Bring Glean’s permission-aware enterprise context into your preferred MCP hosts. Tools like search, chat, and read_document surface organization-specific content (documents, tickets, people, code) while enforcing user permissions, so users can act with the right context without leaving their workflow.
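
As a rough illustration, invoking one of these tools from an MCP client is a standard tool call. The sketch below assumes the search tool accepts a query argument; the authoritative input schema is whatever the server advertises via listTools(), and most users will simply prompt their host application rather than call tools directly.

```typescript
// Sketch: invoking the Glean MCP server's search tool from an MCP client.
// The "query" argument name is an assumption for illustration; inspect the
// tool's inputSchema (from listTools()) for the actual contract.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

const client = new Client({ name: "example-host", version: "1.0.0" });
await client.connect(
  new StreamableHTTPClientTransport(
    new URL("https://[instance-name]-be.glean.com/mcp/default") // placeholder URL
  )
);

const result = await client.callTool({
  name: "search",
  arguments: { query: "onboarding checklist for new engineers" },
});

// Results arrive as MCP content blocks. Permissions are enforced server-side,
// so the response only includes documents the authenticated user can access.
console.log(result.content);
```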

Enhanced Security

While MCP’s open standard has an evolving security model, Glean’s integration is built on its permission-aware Knowledge Graph, enforcing strict user-level access control.

Ecosystem Compatibility

As an industry standard, MCP reduces vendor lock-in. Glean’s approach to MCP makes it a “plug-and-play” component for any compliant AI agent or app.

Bring Glean to Any App

Developers and power users can access Glean’s search, chat, and agents directly in their preferred tools (e.g., Cursor, VS Code, Claude Desktop, ChatGPT), reducing context switching.

Common Use Cases

  • Enterprise context in AI tools: Enable permission-aware search and document retrieval within editors and chat apps.
  • Developer workflows: Use Glean context in IDEs (e.g., Cursor, VS Code) for debugging, PR reviews, and code navigation.
  • Contextual Q&A: Run Glean Search and Chat inside hosts like Claude Desktop and ChatGPT to answer work-specific questions.

Supported Hosts

The following hosts are supported for connecting to the Glean MCP server. Install type and key constraints are noted for clarity.
  • ChatGPT — Admin-managed install; fixed endpoint https://[instance-name]-be.glean.com/mcp/chatgpt. Connection: HTTP (managed). Tools: search, fetch. Auth: OAuth (recommended).
  • Claude Code — End-user install. Connection: Native HTTP (also supports stdio). Platforms: macOS, Linux, Windows. Tools: search, chat, read_document. Auth: OAuth (recommended).
  • Claude for Desktop — End-user install. Connection: stdio only; requires mcp-remote to bridge to HTTP servers (see the example configuration after this list). Platforms: macOS, Windows, Linux. Tools: search, chat, read_document.
  • Claude for Teams/Enterprise — Admin-managed install. Connection: HTTP (managed). Servers configured at organization level; no local config. Tools available per admin policy.
  • Cursor — End-user install. Connection: Native HTTP. Platforms: macOS, Linux, Windows. Tools: search, chat, read_document. Auth: OAuth (recommended).
  • Goose — End-user install. Connection: Native HTTP (YAML config). Platforms: macOS, Linux, Windows. Tools: search, chat, read_document.
  • Visual Studio Code — End-user install. Connection: Native HTTP (one-click install via the vscode: protocol). Platforms: macOS, Linux, Windows. Tools: search, chat, read_document.
  • Windsurf — End-user install. Connection: stdio only; requires mcp-remote to bridge to HTTP servers. Platforms: macOS, Linux, Windows. Tools: search, chat, read_document.
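
For the stdio-only hosts above (Claude for Desktop and Windsurf), the mcp-remote package bridges the local stdio connection to Glean’s remote HTTP server. A bridging entry typically looks roughly like the following, shown here in Claude for Desktop’s claude_desktop_config.json format; the server URL is a placeholder and will differ for your instance and host.

```json
{
  "mcpServers": {
    "glean": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://[instance-name]-be.glean.com/mcp/default"]
    }
  }
}
```
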
For host-specific connection details, URLs, and setup instructions, use the MCP Configurator: Install MCP Hosts.

Using Tools Effectively

LLMs choose tools based on your prompt and the server’s tool schema. To improve tool selection across hosts:
  • State the data or action you want explicitly (e.g., “search Glean for …”, “fetch the document …”).
  • Reference tool-friendly inputs like document links or IDs when you have them.
  • Ask the assistant to explain which tool it plans to use when debugging behavior.

Comparing Integration Approaches

Glean MCP Server vs. Glean APIs

MCP Server
  • Built for LLM “tool use” via the MCP standard.
  • Ideal for integrating Glean into AI workflows without writing custom code.
Glean APIs (Client & Indexing)
  • Offer low-level access for custom apps.
  • Provide greater flexibility and control over how enterprise context from Glean is surfaced to your users, but require your developers to build and maintain the integration logic.
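
For contrast, a direct integration against the Client API looks roughly like the sketch below. The endpoint path, request body, and token handling are assumptions for illustration; consult Glean’s API reference for the authoritative contract.

```typescript
// Sketch: calling Glean's Client API directly instead of going through MCP.
// The endpoint path and request body are assumptions for illustration; check
// Glean's API reference for the authoritative contract.
const response = await fetch(
  "https://[instance-name]-be.glean.com/rest/api/v1/search", // assumed Search endpoint
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.GLEAN_API_TOKEN}`, // token you provision
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ query: "Q3 launch plan", pageSize: 10 }),
  }
);
const results = await response.json();

// Unlike the MCP path, your application now owns ranking, rendering, citation,
// and error handling for these results.
console.log(results);
```

The trade-off is the one described above: full control over the experience, at the cost of owning and maintaining the integration code.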

Resources