
Rasdaman MCP Server

by rasdaman

Enables natural language interaction with rasdaman multidimensional databases by translating tool calls into WCS/WCPS queries. It allows users to list coverages, retrieve metadata, and execute complex queries on datacubes through an LLM.

GitHub stars: 4

  • Natural language interface to rasdaman
  • Automatic query translation to WCS/WCPS

best for

  • Geospatial analysts working with satellite data
  • Researchers querying Earth observation archives
  • Data scientists analyzing multidimensional datasets

capabilities

  • List available datacubes and coverages
  • Retrieve metadata and dimensions of datasets
  • Execute complex spatial-temporal queries
  • Generate NDVI and other computed imagery
  • Query satellite data using natural language
  • Translate requests to WCS/WCPS automatically

what it does

Connects LLMs to rasdaman multidimensional databases, allowing natural language queries on datacubes and satellite imagery through automatic WCS/WCPS translation.

about

Rasdaman MCP Server is an official MCP server published by rasdaman that provides AI assistants with tools and capabilities via the Model Context Protocol. It enables natural language interaction with rasdaman multidimensional array databases: listing coverages, retrieving metadata, and executing queries on datacubes. It is categorized under databases and analytics data.

how to install

You can install Rasdaman MCP Server in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

license

MIT

Rasdaman MCP Server is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

readme

Rasdaman MCP Server

This tool enables users to interact with rasdaman in a natural language context. By exposing rasdaman functionality as tools via the MCP protocol, an LLM can query the database to answer questions like:

  • "What datacubes are available?"
  • "What are the dimensions of the 'Sentinel2_10m' coverage?"
  • "Create an NDVI image for June 12, 2025."

The MCP server translates these tool calls into actual WCS/WCPS queries that rasdaman can understand and then returns the results to the LLM.

Installation

pip install rasdaman-mcp

Usage

First, the connection from the MCP server to rasdaman needs to be configured, either through environment variables:

  • RASDAMAN_URL: URL for the rasdaman server
  • RASDAMAN_USERNAME: Username for authentication
  • RASDAMAN_PASSWORD: Password for authentication

or command-line arguments to the rasdaman-mcp tool:

  • --rasdaman-url: URL for the rasdaman server (default: RASDAMAN_URL env variable or http://localhost:8080/rasdaman/ows).
  • --username: Username for authentication (default: RASDAMAN_USERNAME env variable or rasguest).
  • --password: Password for authentication (default: RASDAMAN_PASSWORD env variable or rasguest).
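As a minimal sketch, the environment-variable route could look like this in a shell session (the values shown are the documented defaults):

```shell
# Point the MCP server at a rasdaman instance (documented defaults shown)
export RASDAMAN_URL="http://localhost:8080/rasdaman/ows"
export RASDAMAN_USERNAME="rasguest"
export RASDAMAN_PASSWORD="rasguest"

# Equivalent invocation using command-line arguments instead:
#   rasdaman-mcp --rasdaman-url "$RASDAMAN_URL" --username "$RASDAMAN_USERNAME" --password "$RASDAMAN_PASSWORD"
echo "$RASDAMAN_URL"
```

Command-line arguments take precedence over the environment variables, per the defaults listed above.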

The MCP server is then ready to be used with an AI agent tool in one of two modes: stdio (default) or http.

stdio Mode

Used for direct integration with clients that manage the server process themselves and communicate with it through standard input/output. In your AI tool, you generally need to specify the command to run rasdaman-mcp:

rasdaman-mcp --username rasguest --password rasguest --rasdaman-url "..."

Example for enabling it in gemini-cli:

gemini mcp add rasdaman-mcp "rasdaman-mcp --username rasguest --password rasguest"
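For clients configured through a JSON file (e.g., Claude Desktop's claude_desktop_config.json), a stdio entry typically follows the common mcpServers shape. The exact file location and schema vary by client, so treat this as a sketch rather than a definitive configuration:

```json
{
  "mcpServers": {
    "rasdaman-mcp": {
      "command": "rasdaman-mcp",
      "args": ["--username", "rasguest", "--password", "rasguest"]
    }
  }
}
```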

Benefits:

  • Simplicity: No need to manage a separate server process or ports.
  • Seamless Integration: Tools are transparently made available to the LLM within the client environment.

http Mode

This mode starts a standalone web server listening on a specified host/port, e.g.:

rasdaman-mcp --transport http --host 127.0.0.1 --port 8000 --rasdaman-url "..."

The MCP server URL to be configured in your AI agent would be http://127.0.0.1:8000/mcp with transport streamable-http. For example, for Mistral Vibe extend the config.toml with a section like this:

[[mcp_servers]]
name = "rasdaman-mcp"
transport = "streamable-http"
url = "http://127.0.0.1:8000/mcp/"

Benefits:

  • Scalability: The MCP server can be containerized (e.g., with Docker) and deployed as a separate microservice.
  • Decoupling: Any client that can speak HTTP (e.g., curl, Python scripts, web apps, other LLM clients) can interact with the tools.
  • Testing: Allows for direct API testing and debugging, independent of an LLM client.
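As the Scalability point suggests, the http mode lends itself to containerization. A hypothetical Dockerfile sketch follows; the package name comes from the Installation section, while the base image and port choice are assumptions:

```dockerfile
FROM python:3.12-slim

RUN pip install --no-cache-dir rasdaman-mcp

# Connection settings are expected via RASDAMAN_URL / RASDAMAN_USERNAME / RASDAMAN_PASSWORD
EXPOSE 8000
CMD ["rasdaman-mcp", "--transport", "http", "--host", "0.0.0.0", "--port", "8000"]
```

Binding to 0.0.0.0 (rather than 127.0.0.1) is needed inside a container so the port can be published to the host.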

AI agents

Once an AI agent is configured with access to rasdaman-mcp, it becomes capable of using several tools:

  • list coverages in the configured rasdaman
  • get the details of a particular coverage
  • execute processing/analytics queries based on a description in natural language

Examples

The following examples demonstrate the interaction with an AI agent using the rasdaman MCP server:

  • Listing coverages
  • Describing a coverage
  • Executing a query
  • Query result visualization
  • Natural language query suggestion

Development

Setup

  1. Clone the git repository:

    git clone https://github.com/rasdaman/rasdaman-mcp.git
    cd rasdaman-mcp/
    
  2. Create a virtual environment (if you don't have one):

    uv venv
    
  3. Activate the virtual environment:

    source .venv/bin/activate
    
  4. Install from source:

    uv pip install -e .
    

Core Components

  • Main Application (main.py): This script initializes the FastMCP application. It handles command-line arguments for transport selection, rasdaman URL, username, and password. It then instantiates the RasdamanActions class and decorates its methods to expose them as tools.
  • RasdamanActions Class (rasdaman_actions.py): Encapsulates all interaction with the rasdaman WCS/WCPS endpoints. It is initialized with the server URL and credentials, and its methods contain the logic for listing coverages, describing them, and executing queries.
  • WCPS crash course (wcps_crash_course.py): A short summary of the syntax of WCPS, allowing LLMs to generate more accurate queries.

Defined Tools

The following methods are exposed as tools:

  • list_coverages(): Lists all available datacubes.
  • describe_coverage(coverage_id): Retrieves metadata for a specific datacube.
  • wcps_query_crash_course(): Returns a crash course on WCPS syntax with examples and best practices.
  • execute_wcps_query(wcps_query): Executes a raw WCPS query and returns the result either directly as a string (scalars or small JSON) or as a file path.
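To illustrate the kind of query execute_wcps_query accepts, here is a WCPS sketch of an NDVI computation. The coverage name Sentinel2_10m comes from the example questions above; the band names red and nir are assumptions that depend on the actual coverage schema (check describe_coverage output for the real band names):

```
for $c in (Sentinel2_10m)
return encode(
  ((float) $c.nir - $c.red) / ((float) $c.nir + $c.red),
  "image/tiff"
)
```

The LLM typically produces such queries itself after consulting wcps_query_crash_course(), so end users rarely need to write WCPS by hand.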

Documentation

To build the documentation:

# install dependencies
uv pip install '.[docs]'

sphinx-build docs docs/_build

You can then open docs/_build/index.html in the browser.

Automated Tests

To run the tests:

# install dependencies
uv pip install '.[tests]'

pytest

Manual Testing

Interacting with the standalone HTTP server manually requires a specific 3-step process using curl. The fastmcp protocol is stateful and requires a session to be explicitly initialized.

  1. First, send an initialize request. This will return a 200 OK response and, most importantly, a session ID in the mcp-session-id response header (needed in the next steps).

    curl -i -X POST \
    -H "Accept: text/event-stream, application/json" \
    -H "Content-Type: application/json" \
    -d '{
          "jsonrpc": "2.0",
          "method": "initialize",
          "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": { "name": "curl-client", "version": "1.0.0" }
          },
          "id": 1
        }' \
    "http://127.0.0.1:8000/mcp"
    
  2. Next, send a notification to the server to confirm the session is ready. Use the session ID from Step 1 in the mcp-session-id header. This request will not produce a body in the response.

    SESSION_ID="<YOUR_SESSION_ID>"
    
    curl -X POST \
    -H "Accept: text/event-stream, application/json" \
    -H "Content-Type: application/json" \
    -H "Mcp-Session-Id: $SESSION_ID" \
    -d '{
          "jsonrpc": "2.0",
          "method": "notifications/initialized"
        }' \
    "http://127.0.0.1:8000/mcp"
    
  3. Finally, you can call a tool using the tools/call method. The params object must contain the name of the tool and an arguments object with the parameters for that tool. The server will respond with the result of the tool call in a JSON-RPC response.

    SESSION_ID="<YOUR_SESSION_ID>"
    
    # Example: Calling the 'list_coverages' tool
    curl -X POST \
    -H "Accept: text/event-stream, application/json" \
    -H "Content-Type: application/json" \
    -H "Mcp-Session-Id: $SESSION_ID" \
    -d '{
          "jsonrpc": "2.0",
          "method": "tools/call",
          "params": {
            "name": "list_coverages",
            "arguments": {}
          },
          "id": 2
        }' \
    "http://127.0.0.1:8000/mcp"
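When scripting these steps, the session ID returned by Step 1 has to be carried into Steps 2 and 3. A small helper (hypothetical, operating on a `curl -i` header dump) can extract it:

```shell
# Extract the Mcp-Session-Id value from a `curl -i` header dump on stdin
parse_session_id() {
  grep -i '^mcp-session-id:' | awk '{print $2}' | tr -d '\r'
}

# Simulated header dump, shaped like the Step 1 initialize response:
HEADERS='HTTP/1.1 200 OK
mcp-session-id: 7f3c2a9b
content-type: text/event-stream'

SESSION_ID=$(printf '%s\n' "$HEADERS" | parse_session_id)
echo "$SESSION_ID"
```

In practice you would pipe the Step 1 curl output into parse_session_id instead of using a literal string.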
    

FAQ

What is the Rasdaman MCP Server MCP server?
Rasdaman MCP Server is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
How do MCP servers relate to agent skills?
Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
How are reviews shown for Rasdaman MCP Server?
This profile displays 10 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.5 out of 5—verify behavior in your own environment before production use.