AI Tools
General Translation support for AI Tools including llms.txt and MCP server
Overview
We provide several tools to help your AI tools work with General Translation.
llms.txt
Provide your AI tools with our llms.txt file for a brief summary of our docs in an LLM-friendly format.
llms-full.txt
Provide your AI tools with our llms-full.txt file for the full content of our docs in an LLM-friendly format.
MCP Server
We offer a simple MCP server that AI tools like Cursor, Windsurf, and Claude Code can use to access our docs.
Add the following to your MCP config file to use our MCP server.
Local MCP Server
For AI tools that maintain a persistent connection, such as Windsurf, Cursor, and Claude Code, you can run our MCP docs server locally.
{
  "mcpServers": {
    "generaltranslation": {
      "command": "npx",
      "args": ["-y", "@generaltranslation/mcp@latest"]
    }
  }
}
Streamable-HTTP
Otherwise, you can use our MCP server hosted at https://mcp.gtx.dev.
{
  "mcpServers": {
    "generaltranslation": {
      "type": "streamable-http",
      "url": "https://mcp.gtx.dev"
    }
  }
}
SSE
For tools that don't support streamable-http, you can use the SSE endpoint instead.
{
  "mcpServers": {
    "generaltranslation": {
      "type": "sse",
      "url": "https://mcp.gtx.dev/sse"
    }
  }
}
Using the MCP Server
Make sure you've added the MCP server to your MCP config file.
Cursor
To use the MCP server in Cursor, simply ask the AI to use the generaltranslation tool.
For example, you can ask the AI: "Use the generaltranslation tool to explain how to use a <T> component."
Windsurf
To use the MCP server in Windsurf, ask the AI to use the generaltranslation MCP server.
For example, you can ask the AI: "Use the generaltranslation MCP server to explain how to use a <T> component."
Claude Code
To use the MCP server in Claude Code, ask the AI to use the generaltranslation MCP server.
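As an alternative to editing the config file by hand, Claude Code can also register MCP servers from the command line. A sketch, assuming a recent Claude Code CLI with the claude mcp add command (the server name generaltranslation matches the config examples above):

```shell
# Register the local stdio server (everything after -- is the launch command)
claude mcp add generaltranslation -- npx -y @generaltranslation/mcp@latest

# Or register the hosted SSE endpoint instead
claude mcp add --transport sse generaltranslation https://mcp.gtx.dev/sse
```

You can confirm the registration with claude mcp list.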