Deep Research MCP

by u14app

Use any LLM for deep research. Performs multi-step web search, content analysis, and synthesis for comprehensive research reports. Supports SSE API and MCP server. 4,500+ GitHub stars.

github stars

4.5K

Local data processing for privacy · 4,500+ GitHub stars · Works with any LLM

best for

  • Researchers needing comprehensive topic analysis
  • Content creators gathering background information
  • Students working on research projects
  • Analysts preparing market or competitive intelligence

capabilities

  • Generate comprehensive research reports from web searches
  • Perform multi-step content analysis and synthesis
  • Use multiple AI models for research tasks
  • Search and analyze web content automatically
  • Create detailed reports with citations and sources

what it does

Performs multi-step web searches and uses AI models to generate comprehensive research reports in minutes. Processes and stores all data locally for privacy.

about

Deep Research MCP is a community-built MCP server published by u14app that provides AI assistants with tools and capabilities via the Model Context Protocol. It lets you use any LLM for deep research, performing multi-step web search, content analysis, and synthesis to produce comprehensive research reports. It is categorized under search-web and ai-ml.

how to install

You can install Deep Research MCP in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.
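For reference, a stdio MCP server is typically registered in a client such as Claude Desktop with a JSON entry like the sketch below. The `command`, `args`, and path here are illustrative placeholders, not the project's actual launch command — use the exact values from the install panel or the project README:

```json
{
  "mcpServers": {
    "deep-research": {
      "command": "node",
      "args": ["/path/to/deep-research-mcp/index.js"]
    }
  }
}
```

After saving the config, restart the client so it spawns the server process and discovers its tools.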

license

MIT

Deep Research MCP is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

readme

Deep Research

![GitHub deployments](https://img.shields.io/github/deployments/u14app/gemini-next-chat/Production) ![GitHub Release](https://img.shields.io/github/v/release/u14app/deep-research) ![Docker Image Size](https://img.shields.io/docker/image-size/xiangfa/deep-research/latest) ![Docker Pulls](https://img.shields.io/docker/pulls/xiangfa/deep-research) [![License: MIT](https://img.shields.io/badge/License-MIT-default.svg)](https://opensource.org/licenses/MIT) [![Gemini](https://img.shields.io/badge/Gemini-8E75B2?style=flat&logo=googlegemini&logoColor=white)](https://ai.google.dev/) [![Next](https://img.shields.io/badge/Next.js-111111?style=flat&logo=nextdotjs&logoColor=white)](https://nextjs.org/) [![Tailwind CSS](https://img.shields.io/badge/Tailwind%20CSS-06B6D4?style=flat&logo=tailwindcss&logoColor=white)](https://tailwindcss.com/) [![shadcn/ui](https://img.shields.io/badge/shadcn/ui-111111?style=flat&logo=shadcnui&logoColor=white)](https://ui.shadcn.com/) [![Vercel](https://img.shields.io/badge/Vercel-111111?style=flat&logo=vercel&logoColor=white)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fu14app%2Fdeep-research&project-name=deep-research&repository-name=deep-research) [![Cloudflare](https://img.shields.io/badge/Cloudflare-F69652?style=flat&logo=cloudflare&logoColor=white)](./docs/How-to-deploy-to-Cloudflare-Pages.md) [![PWA](https://img.shields.io/badge/PWA-blue?style=flat&logo=pwa&logoColor=white)](https://research.u14.app/) [![Ask DeepWiki](https://deepwiki.com/badge.svg)](https://deepwiki.com/u14app/deep-research)
**Lightning-Fast Deep Research Report**

Deep Research uses a variety of powerful AI models to generate in-depth research reports in just a few minutes. It leverages advanced "Thinking" and "Task" models, combined with an internet connection, to provide fast and insightful analysis on a variety of topics. **Your privacy is paramount - all data is processed and stored locally.**

## ✨ Features

- **Rapid Deep Research:** Generates comprehensive research reports in about 2 minutes, significantly accelerating your research process.
- **Multi-platform Support:** Supports rapid deployment to Vercel, Cloudflare, and other platforms.
- **Powered by AI:** Utilizes advanced AI models for accurate and insightful analysis.
- **Privacy-Focused:** Your data remains private and secure, as all data is stored locally in your browser.
- **Multi-LLM Support:** Supports a variety of mainstream large language models, including Gemini, OpenAI, Anthropic, Deepseek, Grok, Mistral, Azure OpenAI, any OpenAI-compatible LLM, OpenRouter, Ollama, etc.
- **Web Search Support:** Supports search engines such as Searxng, Tavily, Firecrawl, Exa, Bocha, Brave, etc., so that LLMs without built-in search can still use the web search function conveniently.
- **Thinking & Task Models:** Employs sophisticated "Thinking" and "Task" models to balance depth and speed, ensuring high-quality results quickly. Supports switching research models.
- **Further Research:** You can refine or adjust the research content at any stage of the project and re-run the research from that stage.
- **Local Knowledge Base:** Supports uploading and processing text, Office, PDF, and other resource files to build a local knowledge base.
- **Artifacts:** Supports editing research content with two editing modes: WYSIWYM and Markdown. You can adjust the reading level and article length, and translate the full text.
- **Knowledge Graph:** Supports one-click generation of a knowledge graph, giving you a systematic understanding of the report content.
- **Research History:** Supports saving research history, so you can review previous research results at any time and research them again in depth.
- **Local & Server API Support:** Offers flexibility with both local and server-side API calling options to suit your needs.
- **SaaS and MCP Support:** You can use this project as a deep research service (SaaS) through the SSE API, or use it in other AI services through the MCP server.
- **PWA Support:** With Progressive Web App (PWA) technology, you can use the project like native software.
- **Multi-Key Payload Support:** Supports multi-key payloads to improve API response efficiency.
- **Multi-language Support:** English, 简体中文, Español.
- **Built with Modern Technologies:** Developed using Next.js 15 and Shadcn UI, ensuring a modern, performant, and visually appealing user experience.
- **MIT Licensed:** Open-source and freely available for personal and commercial use under the MIT License.

## 🎯 Roadmap

- [x] Support preservation of research history
- [x] Support editing final report and search results
- [x] Support for other LLM models
- [x] Support file upload and local knowledge base
- [x] Support SSE API and MCP server

## 🚀 Getting Started

### Use Free Gemini (recommended)

1. Get a [Gemini API Key](https://aistudio.google.com/app/apikey)
2. Deploy the project with one click to Vercel or Cloudflare

   [![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fu14app%2Fdeep-research&project-name=deep-research&repository-name=deep-research)

   The project also supports deployment to Cloudflare; follow [How to deploy to Cloudflare Pages](./docs/How-to-deploy-to-Cloudflare-Pages.md).
3. Start using

### Use Other LLM

1. Deploy the project to Vercel or Cloudflare
2. Set the LLM API key
3. Set the LLM API base URL (optional)
4. Start using

## ⌨️ Development

Follow these steps to get Deep Research up and running in your local browser.

### Prerequisites

- [Node.js](https://nodejs.org/) (version 18.18.0 or later recommended)
- [pnpm](https://pnpm.io/), [npm](https://www.npmjs.com/), or [yarn](https://yarnpkg.com/)

### Installation

1. **Clone the repository:**

   ```bash
   git clone https://github.com/u14app/deep-research.git
   cd deep-research
   ```

2. **Install dependencies:**

   ```bash
   pnpm install # or npm install or yarn install
   ```

3. **Set up environment variables:** Copy `env.tpl` to `.env`, or create a `.env` file and add the variables yourself.

   ```bash
   # For development
   cp env.tpl .env.local
   # For production
   cp env.tpl .env
   ```

4. **Run the development server:**

   ```bash
   pnpm dev # or npm run dev or yarn dev
   ```

   Open your browser and visit [http://localhost:3000](http://localhost:3000) to access Deep Research.

### Custom Model List

The project allows a custom model list, but **only in proxy mode**. Add an environment variable named `NEXT_PUBLIC_MODEL_LIST` in the `.env` file or on the environment variables page. Custom model lists use `,` to separate multiple models. To disable a model, prefix its name with `-`, e.g. `-existing-model-name`. To allow only specific models, use `-all,+new-model-name`.

## 🚢 Deployment

### Vercel

[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/new/clone?repository-url=https%3A%2F%2Fgithub.com%2Fu14app%2Fdeep-research&project-name=deep-research&repository-name=deep-research)

### Cloudflare

Currently the project supports deployment to Cloudflare, but you need to follow [How to deploy to Cloudflare Pages](./docs/How-to-deploy-to-Cloudflare-Pages.md) to do it.
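Putting the custom model list syntax together, a `.env` entry might look like the sketch below. The model names are placeholders, not an endorsement of specific models; remember this only takes effect in proxy mode:

```bash
# .env — allow only a curated set of models (placeholder names):
# "-all" disables every model, then "+" re-enables the ones listed.
NEXT_PUBLIC_MODEL_LIST=-all,+model-a,+model-b

# Alternatively, hide a single model while keeping the rest:
# NEXT_PUBLIC_MODEL_LIST=-existing-model-name
```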
### Docker

> The Docker version needs to be 20 or above, otherwise it will prompt that the image cannot be found.
>
> ⚠️ Note: Most of the time, the Docker image lags behind the latest release by 1 to 2 days, so the "update exists" prompt may keep appearing after deployment; this is normal.

```bash
docker pull xiangfa/deep-research:latest
docker run -d --name deep-research -p 3333:3000 xiangfa/deep-research
```

You can also specify additional environment variables:

```bash
docker run -d --name deep-research \
  -p 3333:3000 \
  -e ACCESS_PASSWORD=your-password \
  -e GOOGLE_GENERATIVE_AI_API_KEY=AIzaSy... \
  xiangfa/deep-research
```

Or build your own Docker image:

```bash
docker build -t deep-research .
docker run -d --name deep-research -p 3333:3000 deep-research
```

If you need to specify other environment variables, add `-e key=value` to the command above.

Deploy using `docker-compose.yml`:

```yaml
version: '3.9'
services:
  deep-research:
    image: xiangfa/deep-research
    container_name: deep-research
    environment:
      - ACCESS_PASSWORD=your-password
      - GOOGLE_GENERATIVE_AI_API_KEY=AIzaSy...
    ports:
      - 3333:3000
```

Or build with Docker Compose:

```bash
docker compose -f docker-compose.yml build
```

### Static Deployment

You can also build a static version of the site and upload everything in the `out` directory to any hosting service that supports static pages, such as GitHub Pages, Cloudflare, Vercel, etc.

```bash
pnpm build:export
```

## ⚙️ Configuration

As mentioned in the "Getting Started" section, Deep Research uses environment variables for server-side API configuration. See [env.tpl](./env.tpl) for all available environment variables.

**Important notes on environment variables:**

- **Privacy reminder:** These environment variables are primarily used for **server-side API calls**. When using the **local API mode**, no API keys or server-side configuration are needed, further enhancing your privacy.
- **Multi-key support:** Multiple keys are supported, separated by `,`, e.g. `key1,key2,key3`.
- **Security setting:** Setting `ACCESS_PASSWORD` helps protect the server API.
- **Making variables take effect:** After adding or modifying an environment variable, redeploy the project for the changes to take effect.

## 📄 API documentation

Currently the project supports two form ---