JinaAI

by spences10
JinaAI provides web scraping tools for extracting and parsing web page content into clean, structured text for analysis.
best for
- Analyzing web documentation and articles
- Processing online content for AI workflows
- Converting web pages for LLM analysis
capabilities
- Extract text content from any URL
- Convert web pages to LLM-friendly format
- Preserve document structure during extraction
- Process various content types, including documentation
what it does
Extracts and converts web content from URLs into clean, structured text that's optimized for LLM processing using Jina.ai's Reader API.
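As a sketch of what the server does under the hood, the snippet below builds a Reader API request from `read_url`-style parameters. This is an illustration, not the server's actual source: the endpoint prefix (`https://r.jina.ai/<url>`) and `X-*` option headers follow Jina.ai's published Reader API conventions, but verify header names against the current Jina documentation before relying on them.

```javascript
// Map read_url-style options onto a Jina.ai Reader API request.
// Header names (X-No-Cache, X-Target-Selector, X-Timeout) are taken from
// Jina's Reader API conventions; confirm against current Jina docs.
function buildReaderRequest(url, options = {}, apiKey = process.env.JINAAI_API_KEY) {
  const headers = { Authorization: `Bearer ${apiKey}` };
  if (options.no_cache) headers['X-No-Cache'] = 'true';
  if (options.target_selector) headers['X-Target-Selector'] = options.target_selector;
  if (options.remove_selector) headers['X-Remove-Selector'] = options.remove_selector;
  if (options.timeout) headers['X-Timeout'] = String(options.timeout);
  // The Reader endpoint takes the target URL as a path suffix.
  return { endpoint: `https://r.jina.ai/${url}`, headers };
}

// A client would then do: fetch(req.endpoint, { headers: req.headers })
// and receive the page back as LLM-friendly text.
const req = buildReaderRequest('https://example.com/docs', { no_cache: true, timeout: 10 });
```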
about
JinaAI is a community-built MCP server published by spences10 that provides AI assistants with tools and capabilities via the Model Context Protocol. It offers web scraping tools for efficient extraction and parsing of web page content and data, and is categorized under search and web.
how to install
You can install JinaAI in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
license
MIT
JinaAI is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
readme
mcp-jinaai-reader
⚠️ Notice
This repository is no longer maintained.
The functionality of this tool is now available in mcp-omnisearch, which combines multiple MCP tools in one unified package.
Please use mcp-omnisearch instead.
A Model Context Protocol (MCP) server for integrating Jina.ai's Reader API with LLMs. This server provides efficient and comprehensive web content extraction capabilities, optimized for documentation and web content analysis.
<a href="https://glama.ai/mcp/servers/a75afsx9cx"> <img width="380" height="200" src="https://glama.ai/mcp/servers/a75afsx9cx/badge" /> </a>
Features
- 📚 Advanced web content extraction through Jina.ai Reader API
- 🚀 Fast and efficient content retrieval
- 📄 Complete text extraction with preserved structure
- 🔄 Clean format optimized for LLMs
- 🌐 Support for various content types including documentation
- 🏗️ Built on the Model Context Protocol
Configuration
This server requires configuration through your MCP client. Here are examples for different environments:
Cline Configuration
Add this to your Cline MCP settings:
{
  "mcpServers": {
    "jinaai-reader": {
      "command": "npx",
      "args": ["-y", "mcp-jinaai-reader"],
      "env": {
        "JINAAI_API_KEY": "your-jinaai-api-key"
      }
    }
  }
}
Claude Desktop with WSL Configuration
For WSL environments, add this to your Claude Desktop configuration:
{
  "mcpServers": {
    "jinaai-reader": {
      "command": "wsl.exe",
      "args": [
        "bash",
        "-c",
        "JINAAI_API_KEY=your-jinaai-api-key npx mcp-jinaai-reader"
      ]
    }
  }
}
Environment Variables
The server requires the following environment variable:
JINAAI_API_KEY: Your Jina.ai API key (required)
API
The server implements a single MCP tool with configurable parameters:
read_url
Convert any URL to LLM-friendly text using Jina.ai Reader.
Parameters:
- url (string, required): URL to process
- no_cache (boolean, optional): Bypass cache for fresh results. Defaults to false
- format (string, optional): Response format ("json" or "stream"). Defaults to "json"
- timeout (number, optional): Maximum time in seconds to wait for the webpage to load
- target_selector (string, optional): CSS selector to focus on specific elements
- wait_for_selector (string, optional): CSS selector to wait for before extraction
- remove_selector (string, optional): CSS selector for elements to exclude
- with_links_summary (boolean, optional): Gather all links at the end of the response
- with_images_summary (boolean, optional): Gather all images at the end of the response
- with_generated_alt (boolean, optional): Add alt text to images lacking captions
- with_iframe (boolean, optional): Include iframe content in the response
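As an illustration, a client might invoke read_url with arguments like the following. The parameter names come from the list above; the URL and selector values are hypothetical, and the exact call envelope depends on your MCP client.

```json
{
  "name": "read_url",
  "arguments": {
    "url": "https://example.com/docs/getting-started",
    "no_cache": true,
    "target_selector": "article",
    "with_links_summary": true,
    "timeout": 15
  }
}
```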
Development
Setup
- Clone the repository
- Install dependencies: `npm install`
- Build the project: `npm run build`
- Run in development mode: `npm run dev`
Publishing
- Update the version in package.json
- Build the project: `npm run build`
- Publish to npm: `npm publish`
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
MIT License - see the LICENSE file for details.
Acknowledgments
- Built on the Model Context Protocol
- Powered by Jina.ai Reader API