Crawleo

by crawleo
Crawleo — real-time web search and website crawling with zero data retention. Fast, private site crawling and live search.
Provides real-time web search and website crawling capabilities with zero data retention.
best for
- AI assistants needing live web data access
- Research and content gathering workflows
- Market research and competitive analysis
- Real-time information retrieval for chatbots
capabilities
- Search the web in real-time from any country/language
- Crawl and extract content from any URL with JavaScript rendering
- Output results in multiple formats (HTML, Markdown, Plain Text)
- View websites from different devices (desktop, mobile, tablet)
- Auto-crawl search results for deeper content extraction
what it does
Enables AI assistants to perform real-time web searches and extract content from websites with JavaScript rendering support. Offers multiple output formats and ensures zero data retention for privacy.
about
Crawleo is an official MCP server published by crawleo that provides AI assistants with tools and capabilities via the Model Context Protocol. It offers real-time web search and website crawling with zero data retention: fast, private site crawling and live search. It is categorized under Search / Web.
how to install
You can install Crawleo in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server supports remote connections over HTTP, so no local installation is required.
license
MIT
Crawleo is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.
readme
Crawleo MCP Server
<a href="https://glama.ai/mcp/servers/@crawleo/crawleo-mcp"> <img width="380" height="200" src="https://glama.ai/mcp/servers/@crawleo/crawleo-mcp/badge" alt="Crawleo MCP server" /> </a>
Real-time web search and crawling capabilities for AI assistants through the Model Context Protocol (MCP).
Overview
Crawleo MCP enables AI assistants to access live web data through two powerful tools:
- web.search - Real-time web search with multiple output formats
- web.crawl - Deep content extraction from any URL
Features
✅ Real-time web search from any country/language
✅ Multiple output formats - Enhanced HTML, Raw HTML, Markdown, Plain Text
✅ Device-specific results - Desktop, mobile, or tablet view
✅ Deep content extraction with JavaScript rendering
✅ Zero data retention - Complete privacy
✅ Auto-crawling option for search results
Installation
Option 1: NPM (Recommended for local usage)
Install globally via npm:
npm install -g crawleo-mcp
Or use npx without installing:
npx crawleo-mcp
Option 2: Clone Repository
git clone https://github.com/Crawleo/Crawleo-MCP.git
cd Crawleo-MCP
npm install
npm run build
Option 3: Docker
Build and run using Docker:
# Build the image
docker build -t crawleo-mcp .
# Run with your API key
docker run -e CRAWLEO_API_KEY=your_api_key crawleo-mcp
Docker configuration for MCP clients:
{
  "mcpServers": {
    "crawleo": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "CRAWLEO_API_KEY=YOUR_API_KEY_HERE", "crawleo-mcp"]
    }
  }
}
Option 4: Remote Server (No installation needed)
Use the hosted version at https://api.crawleo.dev/mcp - see configuration examples below.
Getting Your API Key
- Visit crawleo.dev
- Sign up for a free account
- Navigate to your dashboard
- Copy your API key (starts with sk_)
Setup Instructions
Using Local MCP Server (npm package)
After installing via npm, configure your MCP client to use the local server:
Claude Desktop / Cursor / Windsurf (Local):
{
  "mcpServers": {
    "crawleo": {
      "command": "npx",
      "args": ["crawleo-mcp"],
      "env": {
        "CRAWLEO_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}
Or if installed globally:
{
  "mcpServers": {
    "crawleo": {
      "command": "crawleo-mcp",
      "env": {
        "CRAWLEO_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}
From cloned repository:
{
  "mcpServers": {
    "crawleo": {
      "command": "node",
      "args": ["/path/to/Crawleo-MCP/dist/index.js"],
      "env": {
        "CRAWLEO_API_KEY": "YOUR_API_KEY_HERE"
      }
    }
  }
}
Using Remote Server (Hosted)
1. Claude Desktop
Location of config file:
- macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
- Windows: %APPDATA%\Claude\claude_desktop_config.json
- Linux: ~/.config/Claude/claude_desktop_config.json
Configuration:
{
  "mcpServers": {
    "crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
Replace YOUR_API_KEY_HERE with your actual API key from crawleo.dev.
Steps:
- Open the config file in a text editor
- Add the Crawleo MCP configuration
- Save the file
- Restart Claude Desktop completely (quit and reopen)
- Start a new conversation and ask Claude to search the web!
Example usage:
"Search for the latest AI news and summarize the top 5 articles"
"Find Python web scraping tutorials and extract code examples"
2. Cursor IDE
Location of config file:
- macOS: ~/.cursor/config.json or ~/Library/Application Support/Cursor/config.json
- Windows: %APPDATA%\Cursor\config.json
- Linux: ~/.config/Cursor/config.json
Configuration:
{
  "mcpServers": {
    "crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
Steps:
- Locate and open your Cursor config file
- Add the Crawleo MCP configuration
- Save the file
- Restart Cursor
- The MCP tools will be available in your AI assistant
Example usage in Cursor:
"Search for React best practices and add them to my code comments"
"Find the latest documentation for this API endpoint"
3. Windsurf IDE
Location of config file:
- macOS: ~/Library/Application Support/Windsurf/config.json
- Windows: %APPDATA%\Windsurf\config.json
- Linux: ~/.config/Windsurf/config.json
Configuration:
{
  "mcpServers": {
    "crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
Steps:
- Open the Windsurf config file
- Add the Crawleo MCP server configuration
- Save and restart Windsurf
- Start using web search in your coding workflow
4. GitHub Copilot
Location of config file:
For GitHub Copilot in VS Code or compatible editors, you need to configure MCP servers.
Configuration:
Create or edit your MCP config file and add:
{
  "servers": {
    "Crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY_HERE"
      }
    }
  }
}
Steps:
- Open your GitHub Copilot MCP configuration
- Add the Crawleo server configuration
- Save the file
- Restart VS Code or your IDE
- GitHub Copilot can now use Crawleo for web searches!
Example usage:
Ask Copilot: "Search for the latest Python best practices"
Ask Copilot: "Find documentation for this library"
5. OpenAI Platform (Direct Integration)
OpenAI now supports MCP servers directly! Here's how to use Crawleo with OpenAI's API:
Python Example:
from openai import OpenAI

client = OpenAI()

response = client.responses.create(
    model="gpt-4",
    input=[
        {
            "role": "user",
            "content": [
                {
                    "type": "input_text",
                    "text": "search for latest news about openai models"
                }
            ]
        }
    ],
    text={
        "format": {"type": "text"},
        "verbosity": "medium"
    },
    reasoning={"effort": "medium"},
    tools=[
        {
            "type": "mcp",
            "server_label": "Crawleo",
            "server_url": "https://api.crawleo.dev/mcp",
            "server_description": "Crawleo MCP Server - Real-Time Web Knowledge for AI",
            "authorization": "YOUR_API_KEY_HERE",
            "allowed_tools": ["web.search", "web.crawl"],
            "require_approval": "always"
        }
    ],
    store=True,
    include=[
        "reasoning.encrypted_content",
        "web_search_call.action.sources"
    ]
)

print(response)
Key Parameters:
- server_url - Crawleo MCP endpoint
- authorization - Your Crawleo API key
- allowed_tools - Enable web.search and/or web.crawl
- require_approval - Set to "always", "never", or "conditional"
Node.js Example:
import OpenAI from 'openai';

const client = new OpenAI();

const response = await client.responses.create({
  model: 'gpt-4',
  input: [
    {
      role: 'user',
      content: [
        {
          type: 'input_text',
          text: 'search for latest AI developments'
        }
      ]
    }
  ],
  tools: [
    {
      type: 'mcp',
      server_label: 'Crawleo',
      server_url: 'https://api.crawleo.dev/mcp',
      server_description: 'Crawleo MCP Server - Real-Time Web Knowledge for AI',
      authorization: 'YOUR_API_KEY_HERE',
      allowed_tools: ['web.search', 'web.crawl'],
      require_approval: 'always'
    }
  ]
});

console.log(response);
Available Tools
web.search
Search the web in real-time with customizable parameters.
Parameters:
- query (required) - Search term
- max_pages - Number of result pages (default: 1)
- setLang - Language code (e.g., "en", "ar")
- cc - Country code (e.g., "US", "EG")
- device - Device type: "desktop", "mobile", "tablet" (default: "desktop")
- enhanced_html - Get clean HTML (default: true)
- raw_html - Get raw HTML (default: false)
- markdown - Get Markdown format (default: true)
- page_text - Get plain text (default: false)
- auto_crawling - Auto-crawl result URLs (default: false)
Example:
Ask your AI: "Search for 'Python web scraping' and return results in Markdown"
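For orientation, a web.search tool call built from the parameters above might carry arguments like the following. Only the field names come from the parameter list; the values are illustrative:

```json
{
  "query": "Python web scraping",
  "max_pages": 2,
  "setLang": "en",
  "cc": "US",
  "device": "desktop",
  "markdown": true,
  "auto_crawling": false
}
```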
web.crawl
Extract content from specific URLs.
Parameters:
- urls (required) - List of URLs to crawl
- rawHtml - Return raw HTML (default: false)
- markdown - Convert to Markdown (default: false)
- screenshot - Capture screenshot (optional)
- country - Geographic location
Example:
Ask your AI: "Crawl https://example.com and extract the main content in Markdown"
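A web.crawl call using the parameters above might look like this (field names from the parameter list; values illustrative):

```json
{
  "urls": ["https://example.com"],
  "markdown": true,
  "country": "US"
}
```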
Troubleshooting
MCP server not appearing
- Check config file location - Make sure you're editing the correct file
- Verify JSON syntax - Use a JSON validator to check for syntax errors
- Restart the application - Completely quit and reopen (not just reload)
- Check API key - Ensure your API key is valid
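For the JSON-syntax check, Python's standard library is enough. The sketch below validates an inline sample config; in practice, read your client's actual config file (e.g. claude_desktop_config.json) into config_text instead:

```python
import json

# Sample config text; substitute the contents of your client's
# actual MCP config file here.
config_text = """
{
  "mcpServers": {
    "crawleo": {
      "url": "https://api.crawleo.dev/mcp",
      "transport": "http",
      "headers": {"Authorization": "Bearer YOUR_API_KEY_HERE"}
    }
  }
}
"""

try:
    config = json.loads(config_text)
    print("Valid JSON; servers:", list(config["mcpServers"]))
except json.JSONDecodeError as e:
    # Reports the exact position of a syntax error, e.g. a trailing comma
    print(f"Syntax error at line {e.lineno}, column {e.colno}: {e.msg}")
```

A decode error here is the most common reason a server silently fails to appear after a restart.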
FAQ
- What is the Crawleo MCP server?
- Crawleo is a Model Context Protocol (MCP) server profile on explainx.ai. MCP lets AI hosts (e.g. Claude Desktop, Cursor) call tools and resources through a standard interface; this page summarizes categories, install hints, and community ratings.
- How do MCP servers relate to agent skills?
- Skills are reusable instruction packages (often SKILL.md); MCP servers expose live capabilities. Teams frequently combine both—skills for workflows, MCP for APIs and data. See explainx.ai/skills and explainx.ai/mcp-servers for parallel directories.
- How are reviews shown for Crawleo?
- This profile displays 10 aggregated ratings (sample rows for discoverability plus signed-in user reviews). Average score is about 4.5 out of 5—verify behavior in your own environment before production use.
Ratings
4.5 ★★★★★ · 10 reviews

- ★★★★★ Shikha Mishra · Oct 10, 2024
  Crawleo is among the better-indexed MCP projects we tried; the explainx.ai summary tracks the official description.
- ★★★★★ Piyush G · Sep 9, 2024
  We evaluated Crawleo against two servers with overlapping tools; this profile had the clearer scope statement.
- ★★★★★ Chaitanya Patil · Aug 8, 2024
  Useful MCP listing: Crawleo is the kind of server we cite when onboarding engineers to host + tool permissions.
- ★★★★★ Sakshi Patil · Jul 7, 2024
  Crawleo reduced integration guesswork — categories and install configs on the listing matched the upstream repo.
- ★★★★★ Ganesh Mohane · Jun 6, 2024
  I recommend Crawleo for teams standardizing on MCP; the explainx.ai page compares cleanly with sibling servers.
- ★★★★★ Oshnikdeep · May 5, 2024
  Strong directory entry: Crawleo surfaces stars and publisher context so we could sanity-check maintenance before adopting.
- ★★★★★ Dhruvi Jain · Apr 4, 2024
  Crawleo has been reliable for tool-calling workflows; the MCP profile page is a good permalink for internal docs.
- ★★★★★ Rahul Santra · Mar 3, 2024
  According to our notes, Crawleo benefits from clear Model Context Protocol framing — fewer ambiguous “AI plugin” claims.
- ★★★★★ Pratham Ware · Feb 2, 2024
  We wired Crawleo into a staging workspace; the listing’s GitHub and npm pointers saved time versus hunting across READMEs.
- ★★★★★ Yash Thakker · Jan 1, 2024
  Crawleo is a well-scoped MCP server in the explainx.ai directory — install snippets and categories matched our Claude Code setup.