8 changes: 8 additions & 0 deletions .changeset/violet-gorillas-attack.md
@@ -0,0 +1,8 @@
---
'@srcbook/components': patch
'@srcbook/api': patch
'@srcbook/web': patch
'srcbook': patch
---

This change adds an MCP client to Srcbook, giving it the ability to connect to a wide array of MCP servers that provide the LLM with tools and resources. For now, this only runs in local dev; local production/Docker will be addressed in a follow-up PR.
4 changes: 4 additions & 0 deletions .dockerignore
@@ -0,0 +1,4 @@
node_modules
.turbo
.pnpm-store
.git
11 changes: 10 additions & 1 deletion .gitignore
@@ -42,4 +42,13 @@ srcbook/lib/**/*
# Aide
*.code-workspace

vite.config.ts.timestamp-*.mjs
vite.config.ts.timestamp-*.mjs

# MCP Testing
packages/api/test/mcp_tests

# MCP Config
packages/api/srcbook_mcp_config.json

## PR scratch
packages/api/PR_MARKUP.md
129 changes: 129 additions & 0 deletions MCP_README.md
@@ -0,0 +1,129 @@
# Model Context Protocol (MCP) Integration

Srcbook now features a [Model Context Protocol](https://modelcontextprotocol.io) (MCP) client, enabling secure and standardized interactions between your applications and external tools. Currently, MCP is integrated primarily with the AI-assisted app-building functionality.

## Overview

MCP allows Srcbook to:
- Enhance AI code generation with sequential, o1-style thinking
- Access local files securely (when configured)
- Connect to various utility servers

> **Note**: MCP integration is currently focused on the app builder functionality, with notebook integration planned for future releases.

## Getting Started

### Prerequisites

- Srcbook running locally
- Node.js 20.x (the version required by the `engines` field in `package.json`)
- Access to your local filesystem

### Configuration

1. Navigate to `packages/api/` in your Srcbook installation directory
2. Locate `srcbook_mcp_config.example.json`
3. Create a new file named `srcbook_mcp_config.json` based on the example
4. Configure the filesystem paths:

```json
{
  "mcpServers": {
    "memory": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-memory"]
    },
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/PATH/TO/YOUR/DESKTOP",
        "/PATH/TO/YOUR/DOWNLOADS"
      ]
    },
    "puppeteer": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-puppeteer"]
    },
    "sequential-thinking": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-sequential-thinking"]
    },
    "everything": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-everything"]
    },
    "mcp-installer": {
      "command": "npx",
      "args": ["@anaisbetts/mcp-installer"]
    }
  }
}
```

> **Important**: Replace `/PATH/TO/YOUR/DESKTOP` and `/PATH/TO/YOUR/DOWNLOADS` with the actual paths to your Desktop and Downloads folders.
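
Because the config is plain JSON, you can sanity-check it before starting Srcbook. The snippet below is a minimal sketch (run it with something like `npx tsx` from the repository root); it assumes the file lives at `packages/api/srcbook_mcp_config.json`, the path this PR uses:

```ts
// Minimal sketch, not part of Srcbook itself: parse the MCP config and
// print each configured server's launch command.
import { readFileSync } from 'node:fs';

type ServerEntry = { command: string; args: string[] };

const raw = readFileSync('packages/api/srcbook_mcp_config.json', 'utf8');
const config = JSON.parse(raw) as { mcpServers: Record<string, ServerEntry> };

for (const [name, server] of Object.entries(config.mcpServers)) {
  console.log(`${name}: ${server.command} ${server.args.join(' ')}`);
}
```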

## Available Servers

Srcbook comes with several pre-configured MCP servers that don't require API keys (a sketch of how the backend invokes them follows this list):

- **memory**: Basic memory operations ([source](https://github.com/modelcontextprotocol/servers/tree/main/src/memory))
- **filesystem**: Secure file system access ([source](https://github.com/modelcontextprotocol/servers/tree/main/src/filesystem)) - **Note**: This server is not operational in Srcbook yet.
- **puppeteer**: Browser automation capabilities ([source](https://github.com/modelcontextprotocol/servers/tree/main/src/puppeteer))
- **sequential-thinking**: Enhanced, o1-style reasoning ([source](https://github.com/modelcontextprotocol/servers/tree/main/src/sequentialthinking))
- **everything**: Test server for builders of MCP clients ([source](https://github.com/modelcontextprotocol/servers/tree/main/src/everything))
- **mcp-installer**: MCP server installation utility
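
For contributors: the backend reaches these servers through the `MCPHub` and `ToolExecutionService` classes added in this PR. The sketch below shows roughly how a tool call flows through `executeToolStream`. Only `callTool(serverName, toolName, params)` and the streamed `{ type, data }` chunks come from the code in this PR; the hub setup, the `echo` tool name, and its params are assumptions for illustration.

```ts
// Rough sketch, assuming an already-initialized MCPHub (its setup is not shown here).
import { MCPHub } from './mcp/mcphub.mjs';
import { ToolExecutionService } from './ai/tool-execution.mjs';

async function demo(hub: MCPHub) {
  const tools = new ToolExecutionService(hub);

  // The 'everything' test server ships an 'echo' tool; name and params are assumptions.
  const stream = await tools.executeToolStream({
    serverName: 'everything',
    toolName: 'echo',
    params: { message: 'hello from Srcbook' },
  });

  // The stream yields newline-delimited JSON chunks: { type: 'result' | 'error', data }.
  const reader = stream.getReader();
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    console.log(String(value).trim());
  }
}
```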

## Using MCP in the App Builder

### Sequential Thinking

The primary MCP integration currently available is the sequential thinking feature in the app builder:

1. Open the app builder
2. Toggle sequential thinking on/off using the interface
3. When enabled, the AI code editing process will use the sequential-thinking server (a sketch of this routing follows the list)
4. You can verify server usage by checking your terminal output
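
Under the hood, the toggle selects between the existing `streamEditApp` path and the new `streamEditAppSequential` function in `packages/api/ai/generate.mts`. The sketch below illustrates that routing in simplified form; the `useSequentialThinking` flag, the handler shape, and the `FileContent` import location are assumptions, not the actual route code.

```ts
// Hypothetical routing sketch; the real handler in packages/api may differ.
// Assumes streamEditApp takes the same arguments as the sequential variant.
import { streamEditApp, streamEditAppSequential } from './ai/generate.mjs';
import type { FileContent } from '@srcbook/shared'; // assumed export location

interface EditRequest {
  projectId: string;
  files: FileContent[];
  query: string;
  appId: string;
  planId: string;
  useSequentialThinking?: boolean; // hypothetical flag set by the UI toggle
}

export async function handleEditApp(req: EditRequest) {
  const { projectId, files, query, appId, planId } = req;
  return req.useSequentialThinking
    ? streamEditAppSequential(projectId, files, query, appId, planId)
    : streamEditApp(projectId, files, query, appId, planId);
}
```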

## Troubleshooting

### Common Issues

1. **Server Configuration**
- Verify your `srcbook_mcp_config.json` exists and is properly formatted
- Check that filesystem paths are correct and accessible
- Ensure your Node.js version matches the `engines` requirement in `package.json` (20.x)

2. **Sequential Thinking Issues**
- Check terminal output for server connection status
- Verify the server is properly installed via npx (see the launch sketch after this list)
- Restart Srcbook if server connection fails
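
If a server never seems to come up, try launching it outside Srcbook first. The sketch below spawns the sequential-thinking server exactly as the config does (assuming `npx` can reach the npm registry) and reports whether the process stays up:

```ts
// Sketch: spawn an MCP server the same way the config does and watch its output.
// MCP servers communicate over stdio, so a process that starts and stays up
// (rather than exiting immediately) is a good sign. It is killed after 5 seconds.
import { spawn } from 'node:child_process';

const child = spawn('npx', ['-y', '@modelcontextprotocol/server-sequential-thinking'], {
  stdio: ['ignore', 'pipe', 'pipe'],
});

child.stdout.on('data', (data) => console.log(`[server stdout] ${data}`));
child.stderr.on('data', (data) => console.error(`[server stderr] ${data}`));
child.on('exit', (code) => console.log(`server exited with code ${code}`));

setTimeout(() => child.kill(), 5_000);
```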

### Known Limitations

1. **Notebook Integration**
- MCP is not currently integrated with notebook functionality
- Future releases will expand MCP support to notebooks

2. **File Access**
- Limited to the directories configured for the filesystem server (Desktop and Downloads in the example above)
- Paths must be explicitly listed in `srcbook_mcp_config.json`

## Getting Help

- Join our [Discord Community](https://discord.gg/shDEGBSe2d)
- File issues on [GitHub](https://github.com/srcbookdev/srcbook)

## Future Development

Planned expansions of MCP functionality include:

1. Notebook integration for code cells
2. Additional server integrations
3. Enhanced file system capabilities
4. Expanded AI assistance features

## Contributing

We welcome contributions to improve MCP integration in Srcbook. Please check our [Contributing Guidelines](CONTRIBUTING.md) before submitting changes.
1 change: 1 addition & 0 deletions README.md
@@ -33,6 +33,7 @@ Srcbook is open-source (apache2) and runs locally on your machine. You'll need t
- Create, edit and run web apps
- Use AI to generate the boilerplate, modify the code, and fix things
- Edit the app with a hot-reloading web preview
- [Model Context Protocol (MCP)](MCP_README.md) integration for enhanced AI capabilities and secure file access

<picture>
<source media="(prefers-color-scheme: dark)" srcset="https://i.imgur.com/lLJPZOs.png">
5 changes: 4 additions & 1 deletion package.json
@@ -25,10 +25,13 @@
"typescript": "5.6.2"
},
"dependencies": {
"@modelcontextprotocol/sdk": "^1.1.0",
"drizzle-kit": "^0.24.2",
"electron": "^33.3.1",
"turbo": "^2.1.1"
},
"packageManager": "pnpm@9.12.1",
"engines": {
"node": ">=18"
"node": "20.x"
}
}
47 changes: 47 additions & 0 deletions packages/api/ai/generate.mts
@@ -296,3 +296,50 @@ export async function streamEditApp(

return result.textStream;
}

export async function streamEditAppSequential(
  projectId: string,
  files: FileContent[],
  query: string,
  appId: string,
  planId: string,
) {
  // Example: maybe we log that we're using sequential logic
  console.log('[MCP] Using sequential logic for editing app', appId, 'plan:', planId);

  // Potentially a different model or prompt
  const model = await getModel();

  // Optionally define a specialized "sequential" system prompt:
  const systemPrompt = `You are a helpful AI that uses a sequential chain-of-thought approach. ${makeAppEditorSystemPrompt()}`;

  // Reuse or adapt the user prompt
  const userPrompt = makeAppEditorUserPrompt(projectId, files, query);

  let response = '';

  // If you want to call the same streaming approach but with custom prompts:
  const result = await streamText({
    model,
    system: systemPrompt,
    prompt: userPrompt,
    onChunk: (chunk) => {
      if (chunk.chunk.type === 'text-delta') {
        response += chunk.chunk.textDelta;
      }
    },
    onFinish: () => {
      if (process.env.SRCBOOK_DISABLE_ANALYTICS !== 'true') {
        logAppGeneration({
          appId,
          planId,
          llm_request: { model, system: systemPrompt, prompt: userPrompt },
          llm_response: response,
        });
      }
    },
  });

  // Return the streaming body
  return result.textStream;
}
39 changes: 39 additions & 0 deletions packages/api/ai/tool-execution.mts
@@ -0,0 +1,39 @@
import { MCPHub } from '../mcp/mcphub.mjs';
import type { CallToolResultSchema } from '@modelcontextprotocol/sdk/types.js';
import { CallToolRequestSchema } from '../mcp/types.mjs';
import { z } from 'zod';

export class ToolExecutionService {
  constructor(private mcpHub: MCPHub) {}

  // Executes a tool on the named MCP server and streams the outcome as
  // newline-delimited JSON chunks of shape { type: 'result' | 'error', data }.
  async executeToolStream(request: z.infer<typeof CallToolRequestSchema>): Promise<ReadableStream> {
    return new ReadableStream({
      start: async (controller) => {
        try {
          const result: z.infer<typeof CallToolResultSchema> = await this.mcpHub.callTool(
            request.serverName,
            request.toolName,
            request.params
          );

          controller.enqueue(
            JSON.stringify({
              type: 'result',
              data: result,
            }) + '\n'
          );

          controller.close();
        } catch (error: any) {
          controller.enqueue(
            JSON.stringify({
              type: 'error',
              data: { message: error.message },
            }) + '\n'
          );
          controller.error(error);
        }
      },
    });
  }
}
1 change: 1 addition & 0 deletions packages/api/config.mts
@@ -23,6 +23,7 @@ async function init() {
aiConfig: { provider: 'openai', model: 'gpt-4o' } as const,
aiProvider: 'openai',
aiModel: 'gpt-4o',
mcpServers: {},
};
console.log();
console.log('Initializing application with the following configuration:\n');
Expand Down
1 change: 1 addition & 0 deletions packages/api/constants.mts
@@ -15,3 +15,4 @@ export const APPS_DIR = path.join(SRCBOOK_DIR, 'apps');
export const DIST_DIR = _dirname;
export const PROMPTS_DIR = path.join(DIST_DIR, 'prompts');
export const IS_PRODUCTION = process.env.NODE_ENV === 'production';
export const PROJECT_DIR = '@/srcbook_mcp_config.json';
2 changes: 1 addition & 1 deletion packages/api/db/index.mts
@@ -16,4 +16,4 @@ const DB_PATH = `${HOME_DIR}/.srcbook/srcbook.db`;
fs.mkdirSync(SRCBOOKS_DIR, { recursive: true });

export const db = drizzle(new Database(DB_PATH), { schema });
migrate(db, { migrationsFolder: drizzleFolder });
migrate(db, { migrationsFolder: drizzleFolder });
15 changes: 15 additions & 0 deletions packages/api/db/schema.mts
@@ -1,6 +1,15 @@
import { sql } from 'drizzle-orm';
import { sqliteTable, text, integer, unique } from 'drizzle-orm/sqlite-core';
import { randomid } from '@srcbook/shared';
import { z } from 'zod';

// Add MCP server config types
export const McpServerConfig = z.object({
host: z.string().url(),
tools: z.array(z.string()).optional(),
});

export type McpServerConfig = z.infer<typeof McpServerConfig>;

export const configs = sqliteTable('config', {
  // Directory where .src.md files will be stored and searched by default.
@@ -19,6 +28,12 @@ export const configs = sqliteTable('config', {
  aiBaseUrl: text('ai_base_url'),
  // Null: unset. Email: subscribed. "dismissed": dismissed the dialog.
  subscriptionEmail: text('subscription_email'),
  // Add MCP configuration
  mcpServers: text('mcp_servers', { mode: 'json' })
    .$type<Record<string, { command: string; args: string[]; env: string }>>()
    .default({}),
});

export type Config = typeof configs.$inferSelect;
1 change: 1 addition & 0 deletions packages/api/drizzle/0015_add_mcp_servers.sql
@@ -0,0 +1 @@
ALTER TABLE `config` ADD `mcp_servers` text DEFAULT '{}' NOT NULL;