ai-ml, developer-tools

LlamaIndex

by sammcj

LlamaIndex integrates LlamaIndexTS to deliver AI question answering, code generation, and documentation writing with top LLM providers.

Integrates with LlamaIndexTS to provide access to various LLM providers for code generation, documentation writing, and question answering tasks

github stars

77

  • Access to multiple LLM providers
  • Direct file writing capabilities
  • Built on LlamaIndexTS

best for

  • Developers needing AI code generation without switching tools
  • Teams automating documentation creation
  • Code review and explanation workflows

capabilities

  • Generate code based on natural language descriptions
  • Write code directly to specific files and line numbers
  • Generate documentation for existing code
  • Ask questions to various LLM providers

what it does

Provides access to multiple LLM providers through LlamaIndexTS for generating code, writing documentation, and answering questions directly from your MCP client.

about

LlamaIndex is a community-built MCP server published by sammcj that provides AI assistants with tools and capabilities via the Model Context Protocol. It integrates LlamaIndexTS to deliver AI question answering, code generation, and documentation writing with top LLM providers. It is categorized under AI/ML and developer tools.

how to install

You can install LlamaIndex in your AI client of choice. Use the install panel on this page to get one-click setup for Cursor, Claude Desktop, VS Code, and other MCP-compatible clients. This server runs locally on your machine via the stdio transport.

license

MIT

LlamaIndex is released under the MIT license. This is a permissive open-source license, meaning you can freely use, modify, and distribute the software.

readme

MCP LLM


An MCP server that provides access to LLMs using the LlamaIndexTS library.

I put some LLMs in your MCP for your LLMs

<a href="https://glama.ai/mcp/servers/i1gantlfrs"> <img width="380" height="200" src="https://glama.ai/mcp/servers/i1gantlfrs/badge" alt="mcp-llm MCP server" /> </a>

Features

This MCP server provides the following tools:

  • generate_code: Generate code based on a description
  • generate_code_to_file: Generate code and write it directly to a file at a specific line number
  • generate_documentation: Generate documentation for code
  • ask_question: Ask a question to the LLM

Screenshots: calling an LLM to generate code; calling a reasoning LLM to write some documentation.

Installation

Installing via Smithery

To install LLM Server for Claude Desktop automatically via Smithery:

npx -y @smithery/cli install @sammcj/mcp-llm --client claude

Manual Install From Source

  1. Clone the repository
  2. Install dependencies:
npm install
  3. Build the project:
npm run build
  4. Update your MCP configuration
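After building, a typical MCP configuration entry might look like the sketch below. The server key and build path are assumptions for illustration; substitute the actual path where you cloned the repository and check the repository for any required API-key environment variables.

```json
{
  "mcpServers": {
    "llm": {
      "command": "node",
      "args": ["/path/to/mcp-llm/build/index.js"]
    }
  }
}
```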

Using the Example Script

The repository includes an example script that demonstrates how to use the MCP server programmatically:

node examples/use-mcp-server.js

This script starts the MCP server and sends requests to it using curl commands.
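Requests to an MCP server over stdio follow the JSON-RPC 2.0 shape defined by the Model Context Protocol. As an illustrative sketch (not a transcript of the example script), a `tools/call` request for `generate_code` would look like:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "generate_code",
    "arguments": {
      "description": "Create a function that calculates the factorial of a number",
      "language": "JavaScript"
    }
  }
}
```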

Examples

Generate Code

{
  "description": "Create a function that calculates the factorial of a number",
  "language": "JavaScript"
}

Generate Code to File

{
  "description": "Create a function that calculates the factorial of a number",
  "language": "JavaScript",
  "filePath": "/path/to/factorial.js",
  "lineNumber": 10,
  "replaceLines": 0
}

The generate_code_to_file tool supports both relative and absolute file paths. If a relative path is provided, it will be resolved relative to the current working directory of the MCP server.

Generate Documentation

{
  "code": "function factorial(n) {\n  if (n <= 1) return 1;\n  return n * factorial(n - 1);\n}",
  "language": "JavaScript",
  "format": "JSDoc"
}
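For reference, JSDoc-formatted documentation for the snippet above would typically look like the following. This is an illustration of the JSDoc format, not actual output from the tool:

```javascript
/**
 * Computes the factorial of a non-negative integer recursively.
 * @param {number} n - The number to compute the factorial of.
 * @returns {number} The factorial of n (n!).
 */
function factorial(n) {
  if (n <= 1) return 1;
  return n * factorial(n - 1);
}

console.log(factorial(5)); // 120
```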

Ask Question

{
  "question": "What is the difference between var, let, and const in JavaScript?",
  "context": "I'm a beginner learning JavaScript and confused about variable declarations."
}

License

MIT