
Diagnose and fix the most common gengine problems: plugin not loading, white screen, connection failures, tools not showing, and Node.js issues.

Troubleshooting

This page covers the most common gengine problems and their fixes. Start with the Diagnostics tab — it identifies which component is failing before you dig into logs.

Start Here: Diagnostics Tab

Open the Command Center and click the Diagnostics tab. Each component shows a color-coded status:

| Component | Green means | Red means |
| --- | --- | --- |
| Plugin | C++ modules loaded | DLL load failure or missing dependency |
| REST API | HTTP server listening on port 8080 | Port conflict or startup crash |
| MCP Bridge | Node.js process running | Node.js not found, script error, or crash |
| AI Provider | Provider reachable and authenticated | Network issue, bad API key, or no tool support |
| Tool Registry | All 97 operations registered | Tool registration failure |

Click any red card to see the specific error message, most likely cause, and numbered fix steps.

Plugin Not Loading

Symptom: gengine does not appear in the Window menu, or the editor shows "Plugin failed to load" on startup.

Check 1: UE version

gengine requires Unreal Engine 5.7 or later. Open Help > About Unreal Editor and verify the version. If you are on 5.6 or earlier, update the engine.
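If you want to check this from a script rather than the About dialog, the version a project is pinned to lives in the EngineAssociation field of the .uproject file, which is plain JSON. A sketch (the helper name is ours; jq is assumed to be installed, and for source-built engines this field may be a GUID rather than a version string):

```shell
# Print the engine version a project is pinned to (the EngineAssociation
# field of its .uproject file). Requires jq.
engine_version() {
  jq -r '.EngineAssociation' "$1"
}
# Example: engine_version MyGame.uproject   # prints e.g. 5.7
```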

Check 2: Plugin enabled

Open Edit > Plugins, search for "gengine", and confirm the Enabled checkbox is checked. Restart the editor after enabling.
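Enabling the plugin in the UI records it in your project's .uproject file; if you prefer to edit that file directly, the relevant fragment looks like this (other .uproject fields omitted):

```json
{
  "Plugins": [
    { "Name": "gengine", "Enabled": true }
  ]
}
```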

Check 3: Missing binaries

If you cloned the plugin from source, you need to build it:

cd Plugins/gengine
# Build with Unreal's build tool (e.g. RunUAT's BuildPlugin command), or open
# the .uproject and let UE prompt you to rebuild the plugin

If you installed from the Marketplace, binaries are pre-built — check that the download completed fully.

Check 4: DLL dependency

On Windows, open the Output Log immediately after the failed load:

Window > Output Log

Search for gengine in the log. A line like LogModuleManager: Error: Unable to load module 'Gengine' followed by a Windows error code indicates a missing Visual C++ runtime. Install Visual C++ Redistributable 2022 from Microsoft.
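To pull the relevant lines out without scrolling the UI, you can also filter the saved copy of the log from a shell (the helper name is ours; the path shown is the default Saved/Logs location):

```shell
# Print gengine module-load error lines from an Unreal Output Log file.
module_load_errors() {
  grep -iE "gengine|LogModuleManager" "$1" | grep -i "error"
}
# Example: module_load_errors Saved/Logs/ProjectName.log
```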

White Screen / Blank Chat Panel

Symptom: The Command Center opens but the Chat tab shows a blank white panel.

Check 1: Node.js version

The MCP bridge requires Node.js 18 or later:

node --version
# Should print v18.x.x or higher

If Node.js is not installed or is too old, download it from nodejs.org.
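If you script your setup checks, the version gate can be automated; a small sketch (the helper name is ours):

```shell
# node_major: extract the major version from `node --version` output
# (e.g. "v18.19.0" -> "18"), so scripts can gate on the bridge's minimum.
node_major() {
  v="${1#v}"          # strip the leading "v"
  echo "${v%%.*}"     # keep everything before the first dot
}
# Usage:
# major=$(node_major "$(node --version)")
# [ "$major" -ge 18 ] || echo "Node.js too old for the MCP bridge"
```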

Check 2: npm dependencies installed

cd Plugins/gengine/Resources/mcp-bridge
npm install
npm run build

If node_modules is missing or incomplete, the bridge fails silently and the panel has nothing to connect to.
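Both failure modes — missing node_modules and missing build output — can be checked in one go; a sketch, with a helper name of our choosing:

```shell
# bridge_ready <dir>: succeed only if the bridge directory has both its
# installed dependencies and its compiled entry point.
bridge_ready() {
  [ -d "$1/node_modules" ] && [ -f "$1/dist/index.js" ]
}
# Example:
# bridge_ready Plugins/gengine/Resources/mcp-bridge \
#   || echo "run: npm install && npm run build"
```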

Check 3: Bridge process running

In the Diagnostics tab, check the MCP Bridge row. If it shows red, the Node.js process is not running. Click Retry to attempt a restart. If it keeps failing, check the bridge log:

Bridge log locations:

| Platform | Path |
| --- | --- |
| Windows | %APPDATA%\Unreal Engine\gengine\bridge.log |
| Linux | ~/.config/UnrealEngine/gengine/bridge.log |
| macOS | ~/Library/Application Support/UnrealEngine/gengine/bridge.log |

Check 4: Port conflict

If port 3000 is in use by another process, the bridge cannot start. Change McpBridgePort in Project Settings to an unused port (e.g., 3001) and update your browser URL accordingly.

# Find what's using port 3000 on Windows
netstat -ano | findstr :3000
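On macOS and Linux, lsof -i :3000 answers the same question. For a tool-free check from a script, bash can probe the port directly (helper name is ours; /dev/tcp is a bash feature, not POSIX sh):

```shell
# port_in_use <port>: succeed if something accepts a TCP connection on the
# given local port (uses bash's /dev/tcp pseudo-device).
port_in_use() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}
# Example: port_in_use 3000 && echo "port 3000 is taken; change McpBridgePort"
```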

Connection Failed / Cannot Connect to Provider

Symptom: Diagnostics shows AI Provider as red with "Unreachable" or "Auth error".

Unreachable

  1. Check your network connection.
  2. If using a local model (Ollama, LM Studio), verify the server is running:
    curl http://localhost:11434/api/tags   # Ollama
    curl http://localhost:1234/v1/models   # LM Studio
    
  3. Check that your firewall is not blocking the port.
  4. For Anthropic/OpenAI, test connectivity:
    curl -I https://api.anthropic.com
    curl -I https://api.openai.com
    
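The same checks can be wrapped in one helper so you can test each provider you use in turn (helper name is ours; any HTTP status, even 404, counts as reachable, since it proves the network path works):

```shell
# probe <url>: print "reachable" if the endpoint answers at all,
# "unreachable" only on a connection/timeout failure.
probe() {
  if curl -s -o /dev/null --max-time 5 "$1"; then
    echo "reachable"
  else
    echo "unreachable"
  fi
}
# Run against whichever providers you use:
# probe https://api.anthropic.com
# probe http://localhost:11434/api/tags   # Ollama
```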

Auth error

  1. Verify your API key is correct — copy it directly from the provider's portal, no extra spaces.
  2. Check that the key has not expired or been revoked.
  3. For Anthropic, confirm you have credits in your account at console.anthropic.com.
  4. For OpenAI, check your usage limits at platform.openai.com/usage.

No tool support

The provider responded but rejected the tool definitions. This means the model does not support function calling / tool use. Switch to a model that does:

  • Anthropic: any Claude 3+ model
  • OpenAI: gpt-4o, gpt-4o-mini, o3
  • Ollama: mistral-small, llama3.2, qwen2.5-coder

Tools Not Showing / Missing Operations

Symptom: The AI says it cannot find a tool, or the Tools tab shows fewer than 97 operations.

Check 1: Tool Registry status

In the Diagnostics tab, check the Tool Registry row. If it shows a warning, some tools failed to register. The error detail shows which tools and why.

Check 2: License tier

On the Free tier without a license key, only 10 operations are available. Activate a license key to unlock all 97. See the Licensing page.

Check 3: MCP client discovery

If you're using the Claude CLI, run claude mcp list to confirm gengine is listed. If not, register the bridge with claude mcp add.

If you're using an OpenAI-compatible client, call the MCP tools/list endpoint manually:

curl http://localhost:3000/mcp \
  -H "Content-Type: application/json" \
  -d '{"method": "tools/list", "params": {}}'

The response should list all 8 tool definitions.
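To check the count without eyeballing the JSON, jq can do it (this assumes a standard JSON-RPC MCP response shape with the tool array at .result.tools; adjust the filter if your bridge returns a different envelope):

```shell
# count_tools: read a tools/list JSON response on stdin and print how many
# tool definitions it contains. Requires jq.
count_tools() {
  jq '.result.tools | length'
}
# Example:
# curl -s http://localhost:3000/mcp \
#   -H "Content-Type: application/json" \
#   -d '{"method": "tools/list", "params": {}}' | count_tools
```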

Node.js / Bridge Issues

Bridge crashes immediately

Check the bridge log for the error. Common causes:

  • Error: Cannot find module — run npm install in the mcp-bridge directory
  • EADDRINUSE — port 3000 is in use, change McpBridgePort in settings
  • Error: ENOENT — the dist/index.js file is missing, run npm run build
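The mapping above is mechanical enough to script against a line from the bridge log; a sketch (the function name is ours):

```shell
# suggest_fix <log line>: map a common bridge-crash message to its fix.
suggest_fix() {
  case "$1" in
    *"Cannot find module"*) echo "run npm install in the mcp-bridge directory" ;;
    *EADDRINUSE*)           echo "port in use: change McpBridgePort in settings" ;;
    *ENOENT*)               echo "dist/index.js missing: run npm run build" ;;
    *)                      echo "unknown: check the full bridge.log" ;;
  esac
}
# Example: suggest_fix "$(tail -n 1 bridge.log)"
```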

Bridge restarts repeatedly

If the bridge keeps crashing and restarting (visible in the Diagnostics tab as rapid state changes), there is likely a persistent crash in the Node.js process. Check the bridge log for a recurring error, then:

  1. Stop the bridge by setting bAutoStartBridge=False in Project Settings.
  2. Start it manually in a terminal to see the full error output:
    cd Plugins/gengine/Resources/mcp-bridge
    node dist/index.js
    
  3. Fix the root cause (usually a missing env variable or corrupt node_modules).

Slow responses from the bridge

The bridge compresses large payloads, but if asset lists are very large (thousands of results), responses can be slow. Reduce MaxSearchResults in Project Settings and use more specific search queries.

Log Locations

| Log | Location | Contents |
| --- | --- | --- |
| Unreal Output Log | Saved/Logs/ProjectName.log | Plugin startup, module loading, game thread errors |
| gengine REST API log | Saved/Logs/gengine_api.log | HTTP request/response, tool dispatch, validation errors |
| MCP Bridge log | %APPDATA%\Unreal Engine\gengine\bridge.log | Node.js process output, MCP wire messages |
| Panel log | %APPDATA%\Unreal Engine\gengine\panel.log | React app errors, provider connection events |

Tip: When reporting a bug, attach the gengine REST API log and the MCP Bridge log. These two logs together capture the full lifecycle of a failed tool call.
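A minimal way to pack both logs into a single attachment (helper name is ours; pass the paths from the table above for your platform):

```shell
# bundle_logs <log> [<log>...]: pack the given log files into one archive
# named gengine-bug-report.tgz in the current directory.
bundle_logs() {
  tar -czf gengine-bug-report.tgz "$@"
}
# Example (Windows paths; see the table above for macOS/Linux):
# bundle_logs Saved/Logs/gengine_api.log "$APPDATA/Unreal Engine/gengine/bridge.log"
```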

Getting More Help

  • Diagnostics tab — always the first stop; shows the specific error with fix suggestions.
  • Activity tab — shows what parameters were actually sent to each tool call; useful for "why did the AI do that?"
  • Community forum — post in the gengine channel with your bridge log and a description of the issue.
  • Email support (Pro) — support@gengine.dev with your license key and logs attached.