Inside the gengine MCP bridge — module layout, skill tree router, tool cache, output compression, context loader, manager API, environment variables, and transport configs.
MCP Bridge
The MCP bridge is a Node.js process that sits between AI clients and the gengine REST API. It translates MCP wire protocol (stdio or HTTP+SSE) into HTTP calls to the C++ plugin, caches tool definitions, compresses large payloads, and injects UE context documentation into the AI's context window.
Module Layout
Resources/mcp-bridge/
├── src/
│   ├── index.ts           Entry point — transport selection and startup
│   ├── router.ts          Skill tree router — maps MCP tool calls to REST endpoints
│   ├── toolCache.ts       Tool definition cache — serves tools/list without hitting the REST API
│   ├── contextLoader.ts   UE 5.7 context document injector
│   ├── manager.ts         Bridge lifecycle — start, stop, health, restart
│   └── compression.ts     Payload compression for large responses
├── dist/                  Compiled JavaScript (run `npm run build` to populate)
├── package.json
├── tsconfig.json
└── esbuild.config.mjs     Bundle configuration
Skill Tree Router
The router (router.ts) maps incoming MCP tools/call messages to REST API calls on localhost:8080. It is the core of the bridge's translation layer.
// src/router.ts (simplified)
const DOMAIN_ROUTES: Record<string, string> = {
  unreal_world: '/mcp/tool',
  unreal_assets: '/mcp/tool',
  unreal_blueprints: '/mcp/tool',
  unreal_animation: '/mcp/tool',
  unreal_character: '/mcp/tool',
  unreal_input_materials: '/mcp/tool',
  unreal_status: '/mcp/tool',
  unreal_get_ue_context: '/mcp/tool',
};

export async function routeToolCall(
  toolName: string,
  args: Record<string, unknown>
): Promise<MCPToolResult> {
  const endpoint = DOMAIN_ROUTES[toolName];
  if (!endpoint) {
    return errorResult(`Unknown tool: ${toolName}`);
  }
  const response = await fetch(`http://localhost:${GENGINE_PORT}${endpoint}`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      name: toolName,
      operation: args.operation,
      params: args.params ?? {},
    }),
  });
  const data = await response.json();
  return toMCPResult(data);
}
Legacy tool name compatibility
The router also handles legacy tool names that predate the 6-domain system:
const LEGACY_ALIASES: Record<string, string> = {
  unreal_spawn_actor: 'unreal_world',
  unreal_move_actor: 'unreal_world',
  unreal_delete_actors: 'unreal_world',
  unreal_search_assets: 'unreal_assets',
  unreal_create_blueprint: 'unreal_assets',
};
Legacy names resolve to their domain tool equivalent before routing. They are not advertised in tools/list.
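The resolution step can sit as a single lookup in front of routeToolCall. A minimal sketch, assuming a helper of this shape exists (resolveToolName is a hypothetical name and the alias table is abbreviated):

```typescript
// Hypothetical helper — illustrates alias resolution before routing.
const LEGACY_ALIASES: Record<string, string> = {
  unreal_spawn_actor: 'unreal_world',
  unreal_search_assets: 'unreal_assets',
};

// Legacy names map to their domain tool; current names pass through unchanged.
function resolveToolName(toolName: string): string {
  return LEGACY_ALIASES[toolName] ?? toolName;
}
```

Because resolution happens before the DOMAIN_ROUTES lookup, legacy callers keep working without the aliases ever appearing in tools/list.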
Tool Cache
The tool cache (toolCache.ts) fetches tool definitions from the REST API once on startup and serves subsequent tools/list requests from memory. This avoids a REST round-trip on every AI client connection.
// src/toolCache.ts
class ToolCache {
  private tools: MCPToolDefinition[] = [];
  private lastFetchedAt: number = 0;
  private readonly TTL_MS = 60_000; // Refresh every 60 seconds

  async getTools(): Promise<MCPToolDefinition[]> {
    const now = Date.now();
    if (this.tools.length === 0 || now - this.lastFetchedAt > this.TTL_MS) {
      await this.refresh();
    }
    return this.tools;
  }

  private async refresh(): Promise<void> {
    const response = await fetch(`http://localhost:${GENGINE_PORT}/mcp/tools`);
    const data = await response.json();
    this.tools = data.tools;
    this.lastFetchedAt = Date.now();
  }

  invalidate(): void {
    this.lastFetchedAt = 0;
  }
}

export const toolCache = new ToolCache();
The cache TTL is 60 seconds. When gengine registers a new tool (e.g., after loading a plugin update), the change appears on the first tools/list call after the TTL expires, at most 60 seconds later. Calling invalidate() forces a refresh on the very next request.
Output Compression
Large tool responses — asset lists, actor dumps, graph dumps — can consume significant tokens. The bridge compresses payloads above 4 KB before forwarding them to the AI.
Compression steps applied in order:
1. Remove null fields — JSON fields with `null` values are stripped.
2. Truncate long strings — string values over 512 characters are truncated with a `…[truncated]` suffix.
3. Summarize large arrays — arrays over 50 elements are replaced with the first 50 elements plus a `"_truncated": N` count field.
4. Compact whitespace — the JSON is re-serialized without pretty-printing.
5. gzip — if the payload is still over 4 KB after steps 1–4, it is gzip-compressed and base64-encoded with a `"_compressed": true` flag.
The AI client (gengine's chat panel) decompresses the payload before displaying it.
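Steps 1–4 amount to a single recursive pass over the JSON value. The following is an illustrative sketch, not the bridge's actual compression.ts (compact and compressPayload are hypothetical names; the gzip step is omitted):

```typescript
// Sketch of compression steps 1–4: drop nulls, truncate long strings,
// summarize large arrays, then re-serialize without pretty-printing.
const MAX_STRING = 512; // documented string-truncation threshold
const MAX_ARRAY = 50;   // documented array-summarization threshold

function compact(value: unknown): unknown {
  if (typeof value === 'string') {
    // Step 2: truncate long strings with a suffix marker.
    return value.length > MAX_STRING
      ? value.slice(0, MAX_STRING) + '…[truncated]'
      : value;
  }
  if (Array.isArray(value)) {
    // Step 3: keep only the first MAX_ARRAY elements, plus a count field.
    const items = value.slice(0, MAX_ARRAY).map(compact);
    return value.length > MAX_ARRAY
      ? { items, _truncated: value.length - MAX_ARRAY }
      : items;
  }
  if (value !== null && typeof value === 'object') {
    const out: Record<string, unknown> = {};
    for (const [k, v] of Object.entries(value as Record<string, unknown>)) {
      if (v !== null) out[k] = compact(v); // Step 1: drop null fields
    }
    return out;
  }
  return value;
}

function compressPayload(payload: object): string {
  // Step 4: JSON.stringify without an indent argument emits compact JSON.
  return JSON.stringify(compact(payload));
}
```

In the real pipeline, a fifth step would gzip and base64-encode the result when it still exceeds the 4 KB threshold, tagging it with `"_compressed": true`.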
Context Loader
The context loader (contextLoader.ts) injects UE 5.7 documentation snippets into the AI's context window when unreal_get_ue_context is called. The docs are bundled as static JSON at build time — no network request is made.
// src/contextLoader.ts (simplified)
import contextDocs from './data/ue-context-docs.json';

export function loadContext(category: string): string {
  const doc = contextDocs[category];
  if (!doc) {
    const available = Object.keys(contextDocs).join(', ');
    return `Unknown category '${category}'. Available: ${available}`;
  }
  return doc;
}
The bundled docs cover the 10 categories listed in the Context System documentation. They are updated with each gengine release to stay current with UE 5.7 point releases.
Manager Control API
The bridge manager (manager.ts) exposes lifecycle control used by the Unreal plugin to start, stop, and monitor the Node.js process:
// HTTP endpoints served by the manager on port 3000
GET  /health    // { status: 'ok', uptime_ms, version }
POST /restart   // Graceful restart — drains in-flight requests first
POST /stop      // Graceful shutdown
GET  /metrics   // Request counts, error rates, cache hit ratio
The plugin polls /health every 5 seconds. If it gets no response for 3 consecutive polls, it triggers an automatic restart (controlled by BridgeRestartOnCrash in Project Settings).
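The poll-and-restart policy can be sketched as follows. The real logic lives in the C++ plugin; this TypeScript version is illustrative, and makeHealthMonitor, checkHealth, and restartBridge are hypothetical names:

```typescript
// Sketch of the plugin's health-poll policy: restart the bridge after
// three consecutive failed polls, then reset the failure counter.
const MAX_MISSED_POLLS = 3;

function makeHealthMonitor(
  checkHealth: () => Promise<boolean>, // e.g. GET /health, true on 200
  restartBridge: () => void            // e.g. respawn the Node.js process
) {
  let missed = 0;
  return async function poll(): Promise<void> {
    const ok = await checkHealth().catch(() => false);
    if (ok) {
      missed = 0; // healthy response resets the counter
    } else if (++missed >= MAX_MISSED_POLLS) {
      restartBridge();
      missed = 0;
    }
  };
}
```

In the plugin the returned poll function would run on a 5-second timer, and the restart path is gated by the BridgeRestartOnCrash setting.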
Environment Variables
The bridge reads configuration from environment variables. These are set by the plugin when it launches the bridge process, or can be set manually when running the bridge standalone.
| Variable | Default | Description |
|---|---|---|
| `GENGINE_PORT` | `8080` | Port of the gengine REST API to connect to |
| `BRIDGE_PORT` | `3000` | Port the bridge listens on for HTTP+SSE |
| `BRIDGE_LOG_LEVEL` | `info` | Log verbosity: `debug`, `info`, `warn`, `error` |
| `BRIDGE_COMPRESS_THRESHOLD` | `4096` | Payload size in bytes above which compression is applied |
| `BRIDGE_TOOL_CACHE_TTL` | `60000` | Tool cache TTL in milliseconds |
| `NODE_ENV` | `production` | Set to `development` for verbose error stacks |
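Reading these variables with their documented defaults might look like this. A minimal sketch; loadConfig and the BridgeConfig shape are illustrative names, not the bridge's actual code:

```typescript
// Sketch of a config loader applying the documented defaults.
interface BridgeConfig {
  genginePort: number;
  bridgePort: number;
  logLevel: string;
  compressThreshold: number;
  toolCacheTtl: number;
}

function loadConfig(
  env: Record<string, string | undefined> = process.env
): BridgeConfig {
  return {
    genginePort: parseInt(env.GENGINE_PORT ?? '8080', 10),
    bridgePort: parseInt(env.BRIDGE_PORT ?? '3000', 10),
    logLevel: env.BRIDGE_LOG_LEVEL ?? 'info',
    compressThreshold: parseInt(env.BRIDGE_COMPRESS_THRESHOLD ?? '4096', 10),
    toolCacheTtl: parseInt(env.BRIDGE_TOOL_CACHE_TTL ?? '60000', 10),
  };
}
```

Taking the environment as a parameter (defaulting to process.env) keeps the loader testable without mutating the real environment.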
Running the bridge manually
cd Plugins/gengine/Resources/mcp-bridge
# Install dependencies (first time only)
npm install
# Build TypeScript to JavaScript
npm run build
# Run with custom port
GENGINE_PORT=8081 node dist/index.js
# Run with debug logging
BRIDGE_LOG_LEVEL=debug node dist/index.js
Transport Configurations
The bridge supports two MCP transports selected at startup based on environment:
stdio (Claude CLI / Claude Desktop)
When launched as a child process by the Claude CLI, the bridge uses stdio transport — MCP messages travel over stdin/stdout pipes.
// src/index.ts — stdio transport
if (process.env.MCP_TRANSPORT === 'stdio' || !process.env.MCP_TRANSPORT) {
  const transport = new StdioServerTransport();
  await server.connect(transport);
}
Claude CLI config (~/.claude/claude_desktop_config.json):
{
  "mcpServers": {
    "gengine": {
      "command": "node",
      "args": ["C:/path/to/Plugins/gengine/Resources/mcp-bridge/dist/index.js"],
      "env": {
        "GENGINE_PORT": "8080"
      }
    }
  }
}
HTTP + SSE (OpenAI-compatible clients, Web Chat Panel)
When MCP_TRANSPORT=http is set, the bridge listens for HTTP connections and streams responses via Server-Sent Events.
// src/index.ts — HTTP+SSE transport
if (process.env.MCP_TRANSPORT === 'http') {
  const app = express();
  app.use('/mcp', createMCPSSEHandler(server));
  app.listen(parseInt(process.env.BRIDGE_PORT ?? '3000', 10));
}
OpenAI-compatible client config:
{
  "mcp_server_url": "http://localhost:3000/mcp",
  "transport": "sse"
}
Both transports connect to the same REST API backend on port 8080. Switching transports requires only a config change — no code changes in the plugin.