Metadata-Version: 2.4
Name: mcp-deepthinking
Version: 0.1.1
Summary: MCP server providing a 'deepthinking' tool powered by Groq LLMs for complex reasoning.
Author: jeongsk
License-Expression: MIT
Keywords: deepthinking,llm,mcp
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Requires-Python: >=3.11
Requires-Dist: fastmcp>=0.4.1
Requires-Dist: langchain-groq>=0.3.2
Requires-Dist: mcp[cli]>=1.6.0
Requires-Dist: python-dotenv>=1.1.0
Description-Content-Type: text/markdown

# mcp-deepthinking

An MCP (Model Context Protocol) server that provides a **deep thinking** tool powered by Groq Large Language Models (LLMs). This server enables complex reasoning, multi-step problem solving, and advanced planning capabilities through a simple API interface.

---

## Features

- **Deep Thinking Tool**: Exposes a tool named `deepthinking` for complex reasoning tasks.
- **Groq LLM Integration**: Utilizes Groq's high-performance language models via the Groq API.
- **Configurable Models**: Supports multiple Groq models, configurable via environment variables.
- **Async & Streaming Support**: Designed for efficient, asynchronous operation.
- **Easy Integration**: Compatible with any MCP client.

---

## Requirements

- **Python** >= 3.11
- **Groq API Key** (sign up at [https://console.groq.com/](https://console.groq.com/))
- Supported Groq models:
  - `deepseek-r1-distill-llama-70b` (default)
  - `deepseek-r1-distill-qwen-32b`
  - `qwen-qwq-32b`

---

## Installation

You can install dependencies using either `uv` (recommended) or `pip`.

### Using uv (recommended)

```bash
uv pip install -r requirements.txt
# or install the dependencies directly (quote the specifiers so the shell
# does not interpret `>=` or `[]`)
uv pip install "fastmcp>=0.4.1" "langchain-groq>=0.3.2" "mcp[cli]>=1.6.0" "python-dotenv>=1.1.0"
```

### Using pip

```bash
pip install "fastmcp>=0.4.1" "langchain-groq>=0.3.2" "mcp[cli]>=1.6.0" "python-dotenv>=1.1.0"
```

---

## Configuration

Create a `.env` file in the project root with the following content:

```env
GROQ_API_KEY=your_groq_api_key_here
# Optional; defaults to deepseek-r1-distill-llama-70b
MODEL_ID=deepseek-r1-distill-llama-70b
```

- `GROQ_API_KEY` (required): Your Groq API key.
- `MODEL_ID` (optional): One of the supported model IDs.
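
Inside the server, model selection might be resolved along these lines. This is an illustrative sketch, not the actual `server.py` code: the helper `resolve_model` and the `SUPPORTED_MODELS` set are assumptions, and in practice `python-dotenv`'s `load_dotenv()` would populate the environment first.

```python
# Hypothetical sketch of how MODEL_ID could be validated and defaulted.
DEFAULT_MODEL = "deepseek-r1-distill-llama-70b"
SUPPORTED_MODELS = {
    "deepseek-r1-distill-llama-70b",
    "deepseek-r1-distill-qwen-32b",
    "qwen-qwq-32b",
}

def resolve_model(env: dict) -> str:
    """Pick the model ID from the environment, falling back to the default."""
    model = env.get("MODEL_ID", DEFAULT_MODEL)
    if model not in SUPPORTED_MODELS:
        raise ValueError(f"Unsupported MODEL_ID: {model}")
    return model
```

In the server itself you would pass `os.environ` (after `load_dotenv()`) instead of a plain dict.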

---

## Usage

Run the MCP server:

```bash
python server.py
```

This will start the server using **stdio** transport.
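
Many MCP clients launch stdio servers from a JSON configuration file. A hypothetical entry might look like the following; the server name, command, and path are illustrative, so adjust them to your setup:

```json
{
  "mcpServers": {
    "deepthinking": {
      "command": "python",
      "args": ["/path/to/server.py"],
      "env": {
        "GROQ_API_KEY": "your_groq_api_key_here"
      }
    }
  }
}
```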

### Invoking the `deepthinking` tool

From an MCP-compatible client, you can invoke the `deepthinking` tool with a query string. With the official `mcp` Python SDK, for example, the call inside an initialized `ClientSession` looks like:

```python
result = await session.call_tool("deepthinking", {"query": "Explain the theory of relativity step by step."})
print(result.content)
```

The tool will return a detailed reasoning response generated by the Groq LLM.

---

## Tool Details

### `deepthinking(query: str) -> str`

Performs a deep reasoning pass over the input query using the configured Groq model. Useful for complex problem solving, planning, multi-step reasoning, and similar tasks.

- **Arguments:**
  - `query` (str): The input prompt or question.
- **Returns:**
  - `str`: The generated reasoning response, ending with `</think>`.
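
Since the response ends with `</think>`, a client may want to separate the reasoning text from the surrounding tags. A minimal sketch, assuming the model wraps its reasoning in `<think>…</think>` tags (the helper name is illustrative, not part of this package):

```python
THINK_CLOSE = "</think>"

def extract_reasoning(response: str) -> str:
    """Return the reasoning text from a deepthinking response.

    Assumes the response ends with '</think>' (per the tool contract)
    and may begin with an opening '<think>' tag.
    """
    if response.endswith(THINK_CLOSE):
        response = response[: -len(THINK_CLOSE)]
    return response.removeprefix("<think>").strip()
```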

---

## License

MIT License

---

## Contact

For questions or support, please contact the maintainer.
