
AI Translation

by datanoisetv

AI Translation offers an advanced AI translation and machine translation service, auto-translating JSON files while preserving their exact structure.

Translates JSON internationalization files using multiple translation providers (Google Gemini, OpenAI, Ollama/DeepSeek) with intelligent caching, deduplication across files, and format preservation to minimize API costs while maintaining exact JSON structure and consistent results across target languages.

GitHub stars: 6

  • Multiple AI providers, including local Ollama
  • Smart caching reduces API costs
  • Cross-platform with automatic setup

best for

  • Frontend developers localizing applications
  • Teams managing multi-language i18n files
  • Projects requiring cost-effective bulk translation
  • Maintaining consistent translations across multiple JSON files

capabilities

  • Translate JSON internationalization files to multiple languages
  • Process multiple files with automatic deduplication
  • Cache translations incrementally to avoid re-translating
  • Batch translations for optimal API performance
  • Preserve JSON structure and formatting exactly
  • Detect source language automatically

what it does

Translates JSON i18n files using AI providers (Google Gemini, OpenAI, Ollama/DeepSeek) while preserving exact JSON structure and minimizing API costs through intelligent caching and deduplication.

about

AI Translation is a community-built MCP server published by datanoisetv that provides AI assistants with tools and capabilities via the Model Context Protocol. It offers an advanced AI translation and machine translation service, auto-translating JSON files while preserving their exact structure. It is categorized under AI/ML and developer tools.

how to install

You can install AI Translation in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
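For clients configured by hand, a stdio MCP server entry typically looks like the sketch below. The server key, the `npx` invocation, and the assumption that the `translator-ai` npm package exposes the MCP server are illustrative; use the exact snippet from the install panel for your client.

```json
{
  "mcpServers": {
    "translator-ai": {
      "command": "npx",
      "args": ["-y", "translator-ai"],
      "env": {
        "GEMINI_API_KEY": "your_gemini_api_key_here"
      }
    }
  }
}
```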

license

NOASSERTION

AI Translation is marked NOASSERTION, meaning no license information is declared for the repository.

readme

translator-ai


Fast and efficient JSON i18n translator supporting multiple AI providers (Google Gemini, OpenAI & Ollama/DeepSeek) with intelligent caching, multi-file deduplication, and MCP integration.

Features

  • Multiple AI Providers: Choose Google Gemini or OpenAI (cloud), or Ollama/DeepSeek (local) for translations
  • Multi-File Support: Process multiple files with automatic deduplication to save API calls
  • Incremental Caching: Only translates new or modified strings, dramatically reducing API calls
  • Batch Processing: Intelligently batches translations for optimal performance
  • Path Preservation: Maintains exact JSON structure including nested objects and arrays
  • Cross-Platform: Works on Windows, macOS, and Linux with automatic cache directory detection
  • Developer Friendly: Built-in performance statistics and progress indicators
  • Cost Effective: Minimizes API usage through smart caching and deduplication
  • Language Detection: Automatically detect source language instead of assuming English
  • Multiple Target Languages: Translate to multiple languages in a single command
  • Translation Metadata: Optionally include translation details in output files for tracking
  • Dry Run Mode: Preview what would be translated without making API calls
  • Format Preservation: Maintains URLs, emails, dates, numbers, and template variables unchanged
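The deduplication idea behind the multi-file support can be sketched with standard shell tools. This is an illustration of the concept only, not the tool's actual implementation; the file names are made up:

```shell
# Two i18n files that both contain the string "Save": with deduplication,
# only the unique set of strings needs to be sent to the API.
printf '%s\n' "Save" "Cancel" > strings-a.txt   # values from fileA.json
printf '%s\n' "Save" "Delete" > strings-b.txt   # values from fileB.json
sort -u strings-a.txt strings-b.txt | wc -l     # 3 unique strings, not 4
```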

Installation

Global Installation (Recommended)

npm install -g translator-ai

Local Installation

npm install translator-ai

Configuration

Option 1: Google Gemini API (Cloud)

Create a .env file in your project root or set the environment variable:

GEMINI_API_KEY=your_gemini_api_key_here

Get your API key from Google AI Studio.

Option 2: OpenAI API (Cloud)

Create a .env file in your project root or set the environment variable:

OPENAI_API_KEY=your_openai_api_key_here

Get your API key from OpenAI Platform.

Option 3: Ollama with DeepSeek-R1 (Local)

For completely local translation without API costs:

  1. Install Ollama
  2. Pull the DeepSeek-R1 model:
    ollama pull deepseek-r1:latest
    
  3. Use the --provider ollama flag:
    translator-ai source.json -l es -o spanish.json --provider ollama
    

Usage

Basic Usage

# Translate a single file
translator-ai source.json -l es -o spanish.json

# Translate multiple files with deduplication
translator-ai src/locales/en/*.json -l es -o "{dir}/{name}.{lang}.json"

# Use glob patterns
translator-ai "src/**/*.en.json" -l fr -o "{dir}/{name}.fr.json"

Command Line Options

translator-ai <inputFiles...> [options]

Arguments:
  inputFiles                   Path(s) to source JSON file(s) or glob patterns

Options:
  -l, --lang <langCodes>      Target language code(s), comma-separated for multiple
  -o, --output <pattern>      Output file path or pattern
  --stdout                    Output to stdout instead of file
  --stats                     Show detailed performance statistics
  --no-cache                  Disable incremental translation cache
  --cache-file <path>         Custom cache file path
  --provider <type>           Translation provider: gemini, openai, or ollama (default: gemini)
  --ollama-url <url>          Ollama API URL (default: http://localhost:11434)
  --ollama-model <model>      Ollama model name (default: deepseek-r1:latest)
  --gemini-model <model>      Gemini model name (default: gemini-2.0-flash-lite)
  --openai-model <model>      OpenAI model name (default: gpt-4o-mini)
  --list-providers            List available translation providers
  --verbose                   Enable verbose output for debugging
  --detect-source             Auto-detect source language instead of assuming English
  --dry-run                   Preview what would be translated without making API calls
  --preserve-formats          Preserve URLs, emails, numbers, dates, and other formats
  --metadata                  Add translation metadata to output files (may break some i18n parsers)
  --sort-keys                 Sort output JSON keys alphabetically
  --check-keys                Verify all source keys exist in output (exit with error if keys are missing)
  -h, --help                  Display help
  -V, --version               Display version

Output Pattern Variables (for multiple files):
  {dir}   - Original directory path
  {name}  - Original filename without extension
  {lang}  - Target language code
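As an illustration, the variables expand roughly as in this shell sketch. The substitution mechanics here are an assumption for clarity; the tool performs the expansion internally:

```shell
# Hypothetical expansion of the pattern "{dir}/{name}.{lang}.json"
# for a single input file.
input="src/locales/en/common.json"
lang="es"
dir=$(dirname "$input")            # src/locales/en
name=$(basename "$input" .json)    # common
pattern='{dir}/{name}.{lang}.json'
out=$(printf '%s' "$pattern" | sed "s|{dir}|$dir|; s|{name}|$name|; s|{lang}|$lang|")
echo "$out"   # src/locales/en/common.es.json
```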

Examples

Translate a single file

translator-ai en.json -l es -o es.json

Translate multiple files with pattern

# All JSON files in a directory
translator-ai locales/en/*.json -l es -o "locales/es/{name}.json"

# Recursive glob pattern
translator-ai "src/**/en.json" -l fr -o "{dir}/fr.json"

# Multiple specific files
translator-ai file1.json file2.json file3.json -l de -o "{name}.de.json"

Translate with deduplication savings

# Shows statistics including how many API calls were saved
translator-ai src/i18n/*.json -l ja -o "{dir}/{name}.{lang}.json" --stats

Output to stdout (useful for piping)

translator-ai en.json -l de --stdout > de.json

Parse output with jq

translator-ai en.json -l de --stdout | jq

Disable caching for fresh translation

translator-ai en.json -l ja -o ja.json --no-cache

Use custom cache location

translator-ai en.json -l ko -o ko.json --cache-file /path/to/cache.json

Use Ollama for local translation

# Basic usage with Ollama
translator-ai en.json -l es -o es.json --provider ollama

# Use a different Ollama model
translator-ai en.json -l fr -o fr.json --provider ollama --ollama-model llama2:latest

# Connect to remote Ollama instance
translator-ai en.json -l de -o de.json --provider ollama --ollama-url http://192.168.1.100:11434

# Check available providers
translator-ai --list-providers

Advanced Features

# Detect source language automatically
translator-ai content.json -l es -o spanish.json --detect-source

# Translate to multiple languages at once
translator-ai en.json -l es,fr,de,ja -o translations/{lang}.json

# Dry run - see what would be translated without making API calls
translator-ai en.json -l es -o es.json --dry-run

# Preserve formats (URLs, emails, dates, numbers, template variables)
translator-ai app.json -l fr -o app-fr.json --preserve-formats

# Include translation metadata (disabled by default to ensure compatibility)
translator-ai en.json -l fr -o fr.json --metadata

# Sort keys alphabetically for consistent output
translator-ai en.json -l fr -o fr.json --sort-keys

# Verify all keys are present in the translation
translator-ai en.json -l fr -o fr.json --check-keys

# Use a different Gemini model
translator-ai en.json -l es -o es.json --gemini-model gemini-2.5-flash

# Combine features
translator-ai "src/**/*.json" -l es,fr,de -o "{dir}/{name}.{lang}.json" \
  --detect-source --preserve-formats --stats --check-keys

Available Gemini Models

The --gemini-model option allows you to choose from various Gemini models. Popular options include:

  • gemini-2.0-flash-lite (default) - Fast and efficient for most translations
  • gemini-2.5-flash - Enhanced performance with newer capabilities
  • gemini-pro - More sophisticated understanding for complex translations
  • gemini-1.5-pro - Previous generation pro model
  • gemini-1.5-flash - Previous generation fast model

Example usage:

# Use the latest flash model
translator-ai en.json -l es -o es.json --gemini-model gemini-2.5-flash

# Use the default lightweight model
translator-ai en.json -l fr -o fr.json --gemini-model gemini-2.0-flash-lite

Available OpenAI Models

The --openai-model option allows you to choose from various OpenAI models. Popular options include:

  • gpt-4o-mini (default) - Cost-effective and fast for most translations
  • gpt-4o - Most capable model with advanced understanding
  • gpt-4-turbo - Previous generation flagship model
  • gpt-3.5-turbo - Fast and efficient for simpler translations

Example usage:

# Use OpenAI with the default model
translator-ai en.json -l es -o es.json --provider openai

# Use GPT-4o for complex translations
translator-ai en.json -l ja -o ja.json --provider openai --openai-model gpt-4o

# Use GPT-3.5-turbo for faster, simpler translations
translator-ai en.json -l fr -o fr.json --provider openai --openai-model gpt-3.5-turbo

Translation Metadata

When enabled with the --metadata flag, translator-ai adds metadata to help track translations:

{
  "_translator_metadata": {
    "tool": "translator-ai v1.1.0",
    "repository": "https://github.com/DatanoiseTV/translator-ai",
    "provider": "Google Gemini",
    "source_language": "English",
    "target_language": "fr",
    "timestamp": "2025-06-20T12:34:56.789Z",
    "total_strings": 42,
    "source_file": "en.json"
  },
  "greeting": "Bonjour",
  "farewell": "Au revoir"
}

Metadata is disabled by default to ensure compatibility with i18n parsers. Use --metadata to enable it.
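If a downstream i18n parser rejects the extra top-level key, the metadata block can be stripped with jq (which the README already uses for piping). The sample file contents below are a minimal stand-in for a real translated file:

```shell
# Remove the _translator_metadata key before handing the file to a
# strict i18n parser.
printf '%s' '{"_translator_metadata":{"tool":"translator-ai"},"greeting":"Bonjour"}' > fr.json
jq -c 'del(._translator_metadata)' fr.json > fr.clean.json
cat fr.clean.json   # {"greeting":"Bonjour"}
```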

Key Sorting

Use the --sort-keys flag to sort all JSON keys alphabetically in the output:

translator-ai en.json -l es -o es.json --sort-keys

This ensures consistent ordering across translations and makes diffs cleaner. Keys are sorted:

  • Case-insensitively

FAQ

What is the AI Translation MCP server?
AI Translation is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
How do MCP servers relate to agent skills?
Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
How are reviews shown for AI Translation?
This profile displays 10 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.5 out of 5—verify behavior in your own environment before production use.
MCP server reviews

Ratings

4.5 average · 10 reviews
  • Shikha Mishra· Oct 10, 2024

    AI Translation is among the better-indexed MCP projects we tried; the explainx.ai summary tracks the official description.

  • Piyush G· Sep 9, 2024

    We evaluated AI Translation against two servers with overlapping tools; this profile had the clearer scope statement.

  • Chaitanya Patil· Aug 8, 2024

    Useful MCP listing: AI Translation is the kind of server we cite when onboarding engineers to host + tool permissions.

  • Sakshi Patil· Jul 7, 2024

    AI Translation reduced integration guesswork — categories and install configs on the listing matched the upstream repo.

  • Ganesh Mohane· Jun 6, 2024

    I recommend AI Translation for teams standardizing on MCP; the explainx.ai page compares cleanly with sibling servers.

  • Oshnikdeep· May 5, 2024

    Strong directory entry: AI Translation surfaces stars and publisher context so we could sanity-check maintenance before adopting.

  • Dhruvi Jain· Apr 4, 2024

    AI Translation has been reliable for tool-calling workflows; the MCP profile page is a good permalink for internal docs.

  • Rahul Santra· Mar 3, 2024

    According to our notes, AI Translation benefits from clear Model Context Protocol framing — fewer ambiguous “AI plugin” claims.

  • Pratham Ware· Feb 2, 2024

    We wired AI Translation into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.

  • Yash Thakker· Jan 1, 2024

    AI Translation is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.