# ZIN-MCP-CLIENT (Part of Zin's MCP Suite)

⚡ Lightweight, fast, simple, cross-platform MCP client for STDIO MCP Servers, with CLI, Web UI, and Open Web UI front ends, built to fill the gap and provide a bridge between your local LLMs running on Ollama and MCP Servers.

![GitHub contributors](https://img.shields.io/github/contributors/zinja-coder/zin-mcp-client) ![GitHub all releases](https://img.shields.io/github/downloads/zinja-coder/zin-mcp-client/total) ![GitHub release (latest by SemVer)](https://img.shields.io/github/downloads/zinja-coder/zin-mcp-client/latest/total) ![Latest release](https://img.shields.io/github/release/zinja-coder/zin-mcp-client.svg) ![Python 3.10+](https://img.shields.io/badge/python-3%2E10%2B-blue) [![License](http://img.shields.io/:license-apache-blue.svg)](http://www.apache.org/licenses/LICENSE-2.0.html)
*Banner image generated using AI tools.*


## 🤖 What is ZIN MCP Client?

A powerful yet lightweight, fast, simple and flexible client for interacting with MCP (Model Context Protocol) servers through local LLMs. This tool allows you to connect to and utilize various tools from multiple MCP servers through an easy-to-use command-line interface.

Watch the demos!

- Rich CLI interaction

  https://github.com/user-attachments/assets/fad3d994-8113-47df-b10c-54541a5c3aec

- Perform code review to find vulnerabilities locally

  https://github.com/user-attachments/assets/4cd26715-b5e6-4b4b-95e4-054de6789f42

- Plug-and-play support for Open Web UI

  https://github.com/user-attachments/assets/94e1ff33-0c88-40a8-8447-d9c3278a1d50

  https://github.com/user-attachments/assets/759f7138-1cc5-400f-b7e2-a6a7f0dca894

- Lightweight Web UI

  https://github.com/user-attachments/assets/704c214b-0ebb-4da9-971c-5f04446e9646

## Features

- **Multi-Server Support**: Connect to multiple MCP servers simultaneously
- **Local LLM Integration**: Use local LLMs via Ollama for privacy and control
- **Interactive CLI**: Clean, intuitive command-line interface with rich formatting
- **Minimal, Lightweight Web UI**: Clean, minimal web UI for ease of use
- **Open Web UI Integration**: Plug-and-play integration with Open Web UI, pairing its rich features with a capable MCP client
- **Comprehensive Logging**: Detailed logs for debugging and troubleshooting
- **ReAct Agent Framework**: Utilizes LangChain's ReAct agent pattern to intelligently invoke tools
- **Cross Platform**: Works the same across operating systems
- **Simple, Fast, Lightweight**: Minimal footprint with quick startup
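The ReAct agent pattern mentioned above can be pictured as a loop in which the LLM alternates between reasoning, invoking a tool, and observing the result until it produces a final answer. The sketch below is a hypothetical minimal illustration of that loop, not the client's actual LangChain-based implementation; the `call_llm` and `call_tool` helpers are stand-ins for the model and the MCP tool layer.

```python
def react_loop(question, call_llm, call_tool, max_steps=5):
    """Minimal ReAct-style loop (illustrative sketch only).

    call_llm(transcript)  -> next LLM step, e.g. "Action: list_files args"
    call_tool(name, args) -> tool result string (stand-in for an MCP call)
    """
    transcript = f"Question: {question}\n"
    for _ in range(max_steps):
        step = call_llm(transcript)
        transcript += step + "\n"
        if step.startswith("Final Answer:"):
            # The model decided it has enough information to answer.
            return step.removeprefix("Final Answer:").strip()
        if step.startswith("Action:"):
            # The model asked for a tool; run it and feed back the observation.
            name, _, args = step.removeprefix("Action:").strip().partition(" ")
            observation = call_tool(name, args)
            transcript += f"Observation: {observation}\n"
    return None  # gave up after max_steps without a final answer
```

In the real client, LangChain drives this loop and the tool calls go over STDIO to the configured MCP servers; the sketch only shows the control flow.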

Check the Zin MCP Suite

> [!NOTE]
> This project is in active development. Expect breaking changes with releases. Review the release changelog before updating.

> [!NOTE]
> This project is primarily built to be used with a local LLM for personal, small-scale use only. Exposing it on a network may pose a security risk and is thus not encouraged.

> [!NOTE]
> This project is mainly developed for the Zin MCP Servers mentioned below, but other MCP Servers are supported as well and have been tested, for example with Ghidra.

## 🛠️ Getting Started

1. Download from Releases: https://github.com/zinja-coder/zin-mcp-client/releases

```bash
# 1. Unzip the downloaded release
unzip zin-mcp-client-<version>.zip
```

The unzipped directory looks like this:

```
zin-mcp-client/
├── zin_mcp_client.py
├── web_client.py
├── mcp_proxy.py
├── static/
├── src/
├── mcp-config.json
├── README.md
└── LICENSE
```

```bash
# 2. Navigate to the zin-mcp-client directory
cd zin-mcp-client

# 3. This project uses uv (recommended) - https://github.com/astral-sh/uv -
#    instead of pip for dependency management.
## a. Install uv (if you don't have it yet) - the only required step
curl -LsSf https://astral.sh/uv/install.sh | sh

# The steps below are not required.
## b. OPTIONAL: if for any reason you get dependency errors in zin-mcp-client,
##    set up the environment manually
uv venv
source .venv/bin/activate  # or .venv\Scripts\activate on Windows
## c. OPTIONAL: install dependencies
uv pip install -r requirements.txt

# 4. Not recommended, but you can also use pip for this:
pip install -r requirements.txt
# or
pip install -r requirements.txt --break-system-packages

# The setup for zin-mcp-client is done.
```

## 🤖 2. Ollama Setup

1. Download and install Ollama: https://ollama.com/download

If you are on Linux, you can directly run the command below to install it:

```bash
curl -fsSL https://ollama.com/install.sh | sh
```

2. Download and run any LLM that is capable of invoking tools.

For example, Llama 3.1 is capable of invoking tools.

You can run it using the following command:

```bash
ollama run llama3.1:8b
```

[Note]: Kindly note the above command fetches the model with 8B parameters. If you have stronger hardware, kindly fetch a higher-parameter model for better performance.

3. Serve the Ollama API using the following command:

```bash
ollama serve
```

This serves the Ollama API on port 11434. You can confirm it is running using `curl` as follows:

```bash
curl http://localhost:11434/
```

`Ollama is running`
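If you prefer checking from code rather than `curl`, the small sketch below probes the same default endpoint and returns a boolean. This is an illustrative helper, not part of the project; adjust the URL if you changed Ollama's port.

```python
import urllib.request
import urllib.error

def ollama_is_running(url="http://localhost:11434/"):
    """Return True if an HTTP server answers at `url`.

    A running Ollama instance replies on this endpoint with the
    plain-text body "Ollama is running" and status 200.
    """
    try:
        with urllib.request.urlopen(url, timeout=3) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused / timeout: nothing is listening there.
        return False
```

Calling `ollama_is_running()` after `ollama serve` has started should return `True`; against a closed port it returns `False`.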

## ⚙️ 3. Config File Setup

The config file is the MCP Server configuration file that tells the Zin MCP Client how to start the MCP Servers.

It follows the same style as Claude's configuration file.

Below is a sample configuration file for the Zin MCP Suite servers:

```json
{
    "mcpServers": {
        "jadx-mcp-server": {
            "command": "/path/to/uv",
            "args": [
                "--directory",
                "/path/to/jadx-mcp-server/",
                "run",
                "jadx_mcp_server.py"
            ]
        },
        "apktool-mcp-server": {
            "command": "/path/to/uv",
            "args": [
                "--directory",
                "/path/to/apktool-mcp-server/",
                "run",
                "apktool_mcp_server.py"
            ]
        }
    }
}
```
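After editing the config, a quick sanity check before launching the client is to parse it and confirm each server entry has the shape shown above. The validator below is a hypothetical helper for illustration, not part of the project:

```python
import json

def validate_mcp_config(text):
    """Parse an mcp-config.json string and check its basic structure.

    Raises ValueError on problems; returns the sorted server names
    on success. Illustrative only - the real client may validate
    the file differently.
    """
    config = json.loads(text)
    servers = config.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        raise ValueError("config must contain a non-empty 'mcpServers' object")
    for name, entry in servers.items():
        if "command" not in entry:
            raise ValueError(f"server '{name}' is missing 'command'")
        if not isinstance(entry.get("args", []), list):
            raise ValueError(f"server '{name}': 'args' must be a list")
    return sorted(servers)
```

Running it on the sample config above should return `["apktool-mcp-server", "jadx-mcp-server"]`; a missing `command` or an empty `mcpServers` object raises a `ValueError` instead of failing later at launch.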

Replace:

- `/path/to/uv` with the actual path to your `uv` executable
- `/path/to/jadx-mcp-server` with the absolute path to where you have stored the jadx-mcp-server (and likewise for apktool-mcp-server)

> [!NOTE]
> The default config file is `mcp-config.json` inside the zin-mcp-client directory; however, you can provide the path to your own config file using the `--config` option, for example:

```bash
uv run zin_mcp_client.py --server jadx-mcp-server --model llama3.1:8b --config /home/zinjacoder/mcp-config.json
```

## Give it a shot

1. Run zin_mcp_client.py:

```bash
uv run zin_mcp_client.py
```

2. Use the `--server` option to specify the server of your choice, the `--config` option to provide the path to your config file, the `--model` option to use a specific model, and `--debug` to enable verbose output.
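As an illustration of how such a flag set is typically parsed, the sketch below builds an `argparse` parser mirroring the documented options. It is a hypothetical reconstruction, not the project's actual argument parser; defaults shown (model tag, config filename) are taken from the examples in this README.

```python
import argparse

# Hypothetical parser mirroring the documented flags; the real
# zin_mcp_client.py may define them differently.
parser = argparse.ArgumentParser(prog="zin_mcp_client.py")
parser.add_argument("--server", help="MCP server name from the config file")
parser.add_argument("--model", default="llama3.1:8b", help="Ollama model tag")
parser.add_argument("--config", default="mcp-config.json",
                    help="path to the MCP config file")
parser.add_argument("--debug", action="store_true",
                    help="print verbose output on the console")

# Example invocation: unspecified flags fall back to their defaults.
args = parser.parse_args(["--server", "jadx-mcp-server", "--debug"])
print(args.server, args.model, args.config, args.debug)
```

With only `--server` and `--debug` given, `args.model` and `args.config` keep the defaults, which matches how the README describes omitting options.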

## If Something Went Wrong - Debugging and Troubleshooting

> [!NOTE]
> For low-spec systems, use only one server at a time to avoid LLM hallucinations.

1. Check the logs:
   - All logs, debug information, and raw traffic and interactions are stored in the logs in an easy-to-read way. If anything goes wrong, check the logs.
2. Debug mode:
   - You can also use the `--debug` flag to print every detail on the console at runtime to help you find the issue.

   https://github.com/user-attachments/assets/ee478917-c4f5-46fb-9f0e-ad31d7c33ee0

3. Open an issue:
   - If you cannot resolve the error on your own, take the logs and debug mode's output and provide them to us by opening an issue at https://github.com/zinja-coder/zin-mcp-client/issues

## Web Client

1. Complete the configuration mentioned above.
2. Run the web client using:

```bash
uv run web_client.py
```

3. A demo of the web client is shown in the video at the top.

## Setting up with Open Web UI

1. Complete the configuration mentioned above.
2. Run the MCP Proxy using:

```bash
uv run mcp_proxy.py
```

3. Go to the Open Web UI portal.
4. In Open Web UI -> Settings -> Connection -> add a new connection as shown in the image below.

   image

   Add the URL for the MCP Proxy running on port 8000 with localhost if running locally, as shown in the image. You can put anything in the API key field.

5. Now prompt away and utilize the MCP client - a plug-and-play setup.

   image


## Current State of Local LLMs and MCPs

Currently, proprietary API-based models like Anthropic's Claude tend to be more proficient at tool calling.

However, the open-source world is advancing rapidly! Models specifically fine-tuned on function-calling datasets are becoming increasingly available through Ollama. Researching models tagged with function calling or tool use on platforms like Hugging Face, or checking discussions in communities like r/LocalLLaMA, is key to finding capable local options.

## Tool Information

- **Author**: zinja-coder
- **Project Added On**: June 30, 2025
- **License**: Open Source (Apache 2.0)
- **Tags**: ai, cli-app, cybersecurity-tools, ethical-hacking-tools, fastmcp, llm, mcp, mcp-client, model-context-protocol, model-context-protocol-client, python, rich-console, vapt