AI Integration & Local Knowledge Core
Guide to the high-performance local MCP Server and RAG pipeline.
Last updated: 2/24/2026
AI Integration & Local Knowledge Core (MCP)
SveltyCMS 2026 leverages a high-performance Model Context Protocol (MCP) server to provide AI agents with deep, real-time project knowledge.
> [!TIP]
> Status: Active. Your AI integration is powered by the SveltyCMS Knowledge Core.
- Hosted (Primary): `https://mcp.sveltycms.com/mcp` (cloud-native agentic memory)
- Local (RAG): Local environments using LanceDB and hardware acceleration.
1. Online MCP Server (https://mcp.sveltycms.com)
The hosted Knowledge Core provides authoritative, up-to-date project context for remote AI agents:
- SSE Connection: Uses standard Server-Sent Events for real-time tool access.
- Global Search: Semantic search across the entire SveltyCMS ecosystem.
- Read-Only Safety: Agents can study the codebase and docs but cannot modify data.
2. The Local Knowledge Core
This service acts as the “Long-term Memory” for SveltyCMS locally:
- Hardware Accelerated Indexing: Uses local AI acceleration (e.g. NPU/GPU) to handle embeddings.
- Deep Implementation Knowledge: Indexes `src/`, `docs/`, and `package.json` for expert patterns.
- Expert Discovery: Automatically syncs with `llms-full.txt` from Svelte 5, Skeleton UI, and Valibot.
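Conceptually, the local indexing pipeline reduces to embedding text chunks and ranking them by vector similarity. The sketch below illustrates that flow; the names (`LocalIndex`, `embed`) are illustrative placeholders, not the actual LanceDB-backed SveltyCMS API:

```typescript
// Minimal sketch of a local RAG index, assuming a pluggable `embed` provider
// (in SveltyCMS this would be the hardware-accelerated embedding step).
type Doc = { id: string; text: string; vector: number[] };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb) || 1);
}

class LocalIndex {
  private docs: Doc[] = [];
  constructor(private embed: (text: string) => number[]) {}

  add(id: string, text: string): void {
    this.docs.push({ id, text, vector: this.embed(text) });
  }

  // Return the `limit` most similar documents to the query.
  search(query: string, limit = 3): { id: string; score: number }[] {
    const q = this.embed(query);
    return this.docs
      .map((d) => ({ id: d.id, score: cosine(q, d.vector) }))
      .sort((x, y) => y.score - x.score)
      .slice(0, limit);
  }
}
```

A production store like LanceDB replaces the linear scan with an approximate-nearest-neighbor search, but the retrieval contract is the same: text in, ranked document IDs out.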
3. AI-Native Generative Layouts
SveltyCMS pushes beyond chat-based AI into interactive generative layouts powered by `json-render-svelte`.
When a user prompts the MCP Server or the local AI service to build a complex collection form or a custom dashboard view, the LLM outputs a structured JSON spec. This spec is then fed directly into our `<Renderer>` component, instantly generating a fully functional, state-bound Svelte 5 interface that uses your exact widgets and plugins without touching the codebase.
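For illustration, an LLM-emitted spec might look like the object below. The field names here are assumptions for the sketch; the authoritative schema is defined by `json-render-svelte`:

```typescript
// Hypothetical layout spec an LLM might emit. The spec is pure data --
// no Svelte code is generated; the <Renderer> component interprets it.
type SpecNode = { type: string; props?: Record<string, unknown>; children?: SpecNode[] };

const spec: SpecNode = {
  type: "VerticalLayout",
  children: [
    { type: "Text", props: { value: "Tenant Overview" } },
    {
      type: "HorizontalLayout",
      children: [
        { type: "Text", props: { value: "Occupancy" } },
        { type: "Text", props: { value: "Revenue" } },
      ],
    },
  ],
};

// Count renderable nodes -- a cheap sanity check before handing the
// spec to the renderer.
function countNodes(node: SpecNode): number {
  return 1 + (node.children ?? []).reduce((n, c) => n + countNodes(c), 0);
}
```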
Where Generative UI is Integrated:
- Agentic Dashboard: The main dashboard features an “AI Dashboard Mode” that bypasses manual grid configuration. It now supports interactive re-prompting, allowing users to refine the layout in real time. A live status indicator confirms connection to the hosted Knowledge Core at `mcp.sveltycms.com`.
- Collection Builder: Translates schema configurations into live preview specs for immediate visual feedback on your custom fields, using the same rendering engine.
- AI Service: The robust local `AIService` integrates directly with Ollama to expose the `generateLayoutSpec()` method, which draws on a specialized catalog of components including `VerticalLayout`, `HorizontalLayout`, and `Text`.
Core Generative Components
SveltyCMS provides a suite of AI-native components registered in the sveltyRegistry:
- Layouts: `VerticalLayout` and `HorizontalLayout` for structural composition.
- Typography: `Text` widget for content rendering and AI snapshots.
- Controls: Smart control wrappers that bind AI-generated specs to local component state.
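Because the spec comes from an LLM, it should be validated against the registered component types before rendering. A minimal sketch, assuming the three components named above (the real `sveltyRegistry` types may differ):

```typescript
// Discriminated union mirroring the generative component catalog.
type LayoutNode =
  | { type: "VerticalLayout"; children: LayoutNode[] }
  | { type: "HorizontalLayout"; children: LayoutNode[] }
  | { type: "Text"; value: string };

// Recursively check that an untrusted, AI-generated spec only uses
// registered component types with well-formed fields.
function isValidNode(node: unknown): node is LayoutNode {
  const n = node as { type?: string; value?: unknown; children?: unknown };
  if (n?.type === "Text") return typeof n.value === "string";
  if (n?.type === "VerticalLayout" || n?.type === "HorizontalLayout") {
    return Array.isArray(n.children) && n.children.every(isValidNode);
  }
  return false; // unknown type: reject rather than render
}
```

Rejecting unknown node types up front means a hallucinated component name fails fast instead of producing a broken interface.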
Why AI Makes SveltyCMS Superior
Most headless CMS platforms boast “AI Integration,” but essentially just bolt a ChatGPT wrapper onto a rich text field. SveltyCMS is built from the ground up to be Agentic.
- Zero-Boilerplate Customization: Need a highly specific dashboard for a niche real estate tenant? You no longer write manual Svelte code. The MCP understands your available widgets and generates the interface instantly via `json-render-svelte`.
- Plugin Superpowers: Any Marketplace Plugin that registers itself with `jsonRender: true` instantly becomes part of the AI’s vocabulary, meaning third-party tools can be effortlessly composed into Generative Layouts.
- Local Privacy First: By splitting knowledge (hosted MCP server) from inference (local Ollama via `AIService`), enterprise users get the intelligence of the official docs without sending their private schema or dashboard data to third-party endpoints.
- Self-Healing & Understanding: The MCP server constantly indexes your `src/` and `docs/`, meaning SveltyCMS understands its own architecture.
Usage for Developers
Background Sync
To keep the knowledge base always up-to-date, run the sync service:
```sh
bun run sync-service.ts
```
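A sync service like this typically filters which files enter the index. The helper below is a hypothetical sketch of that filter, not the actual service API; the extension list is an assumption, while the blocked directories mirror the server’s documented blocklist:

```typescript
// Hypothetical filter a sync service might apply when deciding which
// files to (re)index.
const INDEXED_EXTENSIONS = [".ts", ".svelte", ".md", ".json"]; // assumed set
const BLOCKED_PATHS = [".git", "node_modules", ".env"]; // matches the server blocklist

function shouldIndex(filePath: string): boolean {
  const parts = filePath.split("/");
  // Skip anything under a blocked path segment.
  if (parts.some((p) => BLOCKED_PATHS.includes(p))) return false;
  // Index only source, docs, and config file types.
  return INDEXED_EXTENSIONS.some((ext) => filePath.endsWith(ext));
}
```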
Online Context Server (Recommended)
Add this to your MCP configuration (e.g., claude_desktop_config.json):
```json
{
  "mcpServers": {
    "sveltycms": {
      "url": "https://mcp.sveltycms.com/mcp"
    }
  }
}
```
Local Executor Integration
If running the local core directly:
"command": "bun",
"args": ["run", "./mpc-server/index.ts"]
Available Tools
Once connected, your AI agent can use these read-only tools to explore the SveltyCMS codebase and documentation:
| Tool | Description | Parameters |
|---|---|---|
| `search_knowledge_base` | Semantic search across SveltyCMS docs and code via neural vector similarity. | `query` (string), `category` (`code` \| `docs` \| `meta` \| `external`), `limit` (number) |
| `read_source_code` | Read the full content of any source file from the SveltyCMS project. | `filePath` (string, e.g. `src/widgets/index.ts`) |
| `list_directory` | List files and folders in a given directory. | `dirPath` (string, relative directory path) |
| `get_file_tree` | Get a recursive tree of the project structure up to a specified depth. | `depth` (number, default: 3) |
All tools are completely read-only — your agent can explore and learn, but cannot modify, delete, or create anything. Sensitive paths like `.git`, `node_modules`, and `.env` are automatically blocked.
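On the wire, MCP tool calls are JSON-RPC 2.0 requests; your MCP client library normally builds these for you. A sketch of the request shape for `search_knowledge_base`, following the Model Context Protocol specification:

```typescript
// Build a JSON-RPC 2.0 `tools/call` request for an MCP server.
function buildToolCall(id: number, name: string, args: Record<string, unknown>) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

// Example: semantic search restricted to code, top 5 hits.
const req = buildToolCall(1, "search_knowledge_base", {
  query: "How are widgets registered?",
  category: "code",
  limit: 5,
});
```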
Deployment & Updates
The SveltyCMS MCP server follows a “Local Build, Remote Host” model:
- Local Build: The MCP server is built and bundled on your local development machine using `bun run build:mcp` (or an equivalent build process).
- Upload: The resulting bundle/knowledge core is then uploaded to `https://mcp.sveltycms.com` (or your private hosted instance).
- Consumption: AI agents (like Antigravity or Claude) connect to the hosted endpoint to obtain the most recent project context, regardless of where they are running.
This ensures that the sensitive build process remains local, while the structured agentic knowledge is globally accessible exactly when needed.