
MCP Server

Connect your AI tools directly to SurrealDB with the Model Context Protocol. Queries, schema exploration, graph traversal, vector search, and ACID transactions - all as structured MCP tool calls.

Connect AI models to your data

Direct AI tool connections

Connect AI models and tools to SurrealDB through the Model Context Protocol. No wrapper APIs or custom integration code.

Access all your data models

Query documents, graphs, vectors, and relational data through a unified interface. Your AI can access any data type with a single query language.
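As an illustrative sketch (the `customer` table and its fields are hypothetical), a single SurrealQL statement can reach into nested document fields and filter embedded arrays, with no separate document store or ORM layer:

```surql
-- One statement over nested document data (hypothetical schema)
SELECT name, address.city, orders[WHERE total > 100] AS large_orders
FROM customer;
```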

Real-time AI interactions

AI models query live data, traverse graphs, and access vector embeddings - all through a single MCP connection.

Secure AI data access

Built-in authentication and row-level permissions ensure AI tools only access authorised data.
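A minimal sketch of what row-level permissions look like in SurrealQL (table and field names are hypothetical): the table definition itself scopes reads and writes to the authenticated user, so every query an AI tool issues is filtered automatically.

```surql
-- Hypothetical schema: rows are only visible to the customer who owns them
DEFINE TABLE order SCHEMAFULL
  PERMISSIONS
    FOR select, update WHERE customer = $auth.id
    FOR create, delete NONE;
```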

Developer-friendly integrations

Install the VSCode or Cursor extension, point it at your SurrealDB instance, and start querying.

Native vector search

Run semantic search, similarity matching, and RAG pipelines using SurrealDB's built-in vector indexes.
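As a sketch of how this looks in SurrealQL (table, field, and dimension are hypothetical), you define a vector index once and then use the K-nearest-neighbour operator directly in a query:

```surql
-- Hypothetical schema: a 768-dimension MTREE index over document embeddings
DEFINE INDEX idx_embedding ON document FIELDS embedding MTREE DIMENSION 768;

-- Return the 10 nearest neighbours to a query vector with the <|K|> operator
SELECT id, title FROM document WHERE embedding <|10|> $query_embedding;
```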

Graph-powered AI

Traverse relationships and analyse multi-hop connections through SurrealDB's native graph engine.
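A minimal sketch, assuming a hypothetical `person` table with `knows` edges: `RELATE` creates a graph edge between records, and arrow syntax walks it across multiple hops in a single statement.

```surql
-- Hypothetical schema: connect two records with a graph edge
RELATE person:alice->knows->person:bob SET since = time::now();

-- Multi-hop traversal: names of friends-of-friends, two hops out
SELECT ->knows->person->knows->person.name AS friends_of_friends
FROM person:alice;
```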

One query language for AI

SurrealQL queries documents, graphs, vectors, and time-series in one statement. No separate data stores to connect.
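To make the claim concrete, here is an illustrative sketch (the `article` table, `cites` edge, and `embedding` field are all hypothetical) combining document fields, a graph traversal, and a vector similarity score in one statement:

```surql
-- Hypothetical schema: documents, graph edges, and vectors in one query
SELECT title,
       ->cites->article.title AS cited_titles,
       vector::similarity::cosine(embedding, $query_embedding) AS score
FROM article
ORDER BY score DESC
LIMIT 5;
```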

Deploy in any environment

Run the MCP server locally via stdio, as an HTTP server, or over Unix sockets.

Local by default

The default stdio transport is built for local development and IDE integration: simple, efficient communication with no network overhead.

Environment adaptability

Choose the right transport for your deployment: stdio for local tools, HTTP with SSE for remote agents.

Cross-platform compatibility

Runs on Windows, macOS, and Linux with identical behaviour across all three.

Run locally from your IDE

The MCP server runs locally via stdio, so any IDE, framework, or AI tool can connect to your SurrealDB instances - or start an in-memory or local database on demand.

{
  "mcpServers": {
    "SurrealDB": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-i",
        "--pull",
        "always",
        "surrealdb/surrealmcp:latest",
        "start"
      ]
    }
  }
}

Run as an HTTP server

The MCP server can alternatively run as an HTTP server for environments where stdio is not available, with SurrealDB embedded for local instances.

{
  "mcpServers": {
    "SurrealDB": {
      "command": "docker",
      "args": [
        "run",
        "--rm",
        "-p",
        "8080:8080",
        "--pull",
        "always",
        "surrealdb/surrealmcp:latest",
        "start",
        "--auth-disabled",
        "--bind-address",
        "127.0.0.1:8080",
        "--server-url",
        "http://localhost:8080"
      ]
    }
  }
}

GET STARTED

Start building with MCP

Connect your AI tools directly to SurrealDB with the Model Context Protocol.