New LLM-Focused Components
Slack announced the release of the Slack MCP server and Real-time Search API, two major additions designed to enhance how Large Language Models (LLMs) and AI agents interact with Slack workspace data.
Slack MCP Server
The Model Context Protocol (MCP) server enables AI agents to interact with Slack content through tools purpose-built for LLM consumption. Unlike traditional APIs, the MCP server is architected specifically for language models: tools carry rich descriptions and examples, and responses come back in natural language. This allows AI agents to perform LLM-driven discovery, configuration, and execution within Slack.
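To make the interaction concrete, here is a minimal sketch of the JSON-RPC 2.0 request an MCP client sends when an agent invokes a server tool. MCP's `tools/call` method is part of the protocol itself; the tool name `search_messages` and its arguments are illustrative assumptions, not the Slack MCP server's documented tool surface.

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize an MCP tools/call request (JSON-RPC 2.0)."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        # "name" and "arguments" are the standard tools/call parameters;
        # the specific tool and argument shape here are hypothetical.
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(request)

payload = build_tool_call("search_messages", {"query": "Q3 launch plan"})
```

The server replies with a result whose content is written for an LLM to read directly, which is what distinguishes it from a conventional JSON-over-HTTP API.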
Real-time Search API
The Data Access API has evolved into the Real-time Search API, a secure search interface for Slack data. It allows third-party applications to retrieve relevant Slack data without storing customer information on external servers, reducing privacy and compliance concerns for enterprises.
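As a hedged sketch of how an application might call the API over Slack's standard Web API transport: the endpoint name `assistant.search.context` comes from the section below, but the `query` parameter and JSON body shape are assumptions; consult the Real-time Search API guide for the authoritative parameter list.

```python
import json
import urllib.request

def build_search_request(token: str, query: str) -> urllib.request.Request:
    """Build a POST request to assistant.search.context (parameters assumed)."""
    body = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        "https://slack.com/api/assistant.search.context",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json; charset=utf-8",
        },
        method="POST",
    )

req = build_search_request("xoxb-example-token", "release notes for Q3")
# Send with urllib.request.urlopen(req); results are fetched on demand,
# so no Slack content needs to be mirrored into an external index.
```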
Granular Permission Scopes
The assistant.search.context API method now uses a set of granular scopes instead of a single search:read scope:
- search:read.public (required) — for public channel access
- search:read.private — for private channels (with user consent)
- search:read.im — for direct messages (with user consent)
- search:read.mpim — for multi-party direct messages (with user consent)
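The scope model above can be expressed as a small lookup. This helper is purely illustrative (it is not part of Slack's SDK); the conversation-type keys follow Slack's common naming, and the mapping mirrors the list above.

```python
# Map each conversation type to the granular scope it requires.
# The mapping follows the scope list above; the helper itself is
# an illustrative assumption, not a Slack SDK function.
SCOPE_BY_CONVERSATION_TYPE = {
    "public_channel": "search:read.public",    # required baseline
    "private_channel": "search:read.private",  # needs user consent
    "im": "search:read.im",                    # needs user consent
    "mpim": "search:read.mpim",                # needs user consent
}

def required_scopes(conversation_types: list[str]) -> set[str]:
    """Return the scope set an app must request for the given types."""
    # search:read.public is always required; others are added per type.
    scopes = {"search:read.public"}
    for ctype in conversation_types:
        scopes.add(SCOPE_BY_CONVERSATION_TYPE[ctype])
    return scopes

print(required_scopes(["im", "private_channel"]))
```

Requesting only the scopes an app actually needs keeps the consent prompt honest and limits blast radius if a token leaks.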
This enables more granular control over what data AI-enabled apps can access, improving security and user privacy while maintaining flexibility for developers.
Getting Started
Developers can access the Slack MCP server documentation and the Real-time Search API guide to begin integrating these capabilities into their applications.