AI Integrations
Resources for AI coding tools including llms.txt and an MCP server
We provide several resources to help your AI tools work with General Translation.
llms.txt
Provide your AI coding tools with our llms.txt file for a brief summary of our docs in an LLM-friendly format.
llms-full.txt
Provide your AI coding tools with our llms-full.txt file for the full content of our docs in an LLM-friendly format.
MCP server
We offer a simple MCP server that AI tools like Cursor, Windsurf, and Claude Code can use to access our docs.
Configuring the MCP server
Local MCP server
For AI tools that maintain a persistent connection, such as Windsurf, Cursor, and Claude Code, you can run our MCP docs server locally.
{
  "mcpServers": {
    "generaltranslation": {
      "command": "npx",
      "args": ["-y", "@generaltranslation/mcp@latest"]
    }
  }
}
Streamable-HTTP MCP
Otherwise, you can use our MCP server hosted at https://mcp.gtx.dev.
{
  "mcpServers": {
    "generaltranslation": {
      "type": "streamable-http",
      "url": "https://mcp.gtx.dev"
    }
  }
}
SSE MCP
For tools that don't support streamable-http transport, you can use the SSE endpoint instead.
{
  "mcpServers": {
    "generaltranslation": {
      "type": "sse",
      "url": "https://mcp.gtx.dev/sse"
    }
  }
}
Using the MCP server
Cursor
To use the MCP server in Cursor, ask the AI to use the generaltranslation tool.
For example, you can ask the AI: "Use the generaltranslation tool to explain how to use a <T> component."
Windsurf
To use the MCP server in Windsurf, ask the AI to use the generaltranslation mcp server.
For example, you can ask the AI: "Use the generaltranslation mcp server to explain how to use a <T> component."
Claude Code
To use the MCP server in Claude Code, ask the AI to use the generaltranslation mcp server.
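As an alternative to editing the JSON config by hand, Claude Code can register MCP servers from the command line. This is a sketch using Claude Code's `claude mcp add` command; the server name `generaltranslation` matches the config examples above, but check your Claude Code version's CLI docs for the exact flags it supports.

```shell
# Register the local MCP docs server (stdio via npx),
# mirroring the "Local MCP server" JSON config above.
claude mcp add generaltranslation -- npx -y @generaltranslation/mcp@latest

# Verify the server was registered.
claude mcp list
```

Once registered, the server is available in your Claude Code sessions the same way as a JSON-configured one.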