Tuesday, May 13, 2025

Implementing an LLM Agent with Tool Access Using MCP-Use

MCP-Use is an open-source library that lets you connect any LLM to any MCP server, giving your agents tool access such as web browsing and file operations, all without relying on closed-source clients. In this tutorial, we'll use langchain-groq and MCP-Use's built-in conversation memory to build a simple chatbot that can interact with tools via MCP.

Installing the uv package manager

We'll first set up our environment, starting with installing the uv package manager. For Mac or Linux:

curl -LsSf https://astral.sh/uv/install.sh | sh

For Windows (PowerShell):

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

Creating a new directory and activating a virtual environment

We'll then create a new project directory and initialize it with uv:

uv init mcp-use-demo
cd mcp-use-demo

We can now create and activate a virtual environment. For Mac or Linux:

uv venv
source .venv/bin/activate

For Windows:

uv venv
.venv\Scripts\activate

Installing Python dependencies

We'll now install the required dependencies:

uv add mcp-use langchain-groq python-dotenv

Groq API Key

To use Groq's LLMs:

  1. Go to the Groq Console and generate an API key.
  2. Create a .env file in your project directory and add the following line:

GROQ_API_KEY=<your-api-key>

Replace <your-api-key> with the key you just generated.
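The load_dotenv() call used later in app.py reads this file for you. Conceptually, it does something like the sketch below, a simplified stand-in using only the standard library, with a made-up sample key for illustration:

```python
import os

# A simplified stand-in for what python-dotenv's load_dotenv() does:
# read KEY=VALUE lines from a .env file and copy them into os.environ.
env_text = "GROQ_API_KEY=gsk_example_not_a_real_key"  # sample .env content

for line in env_text.splitlines():
    line = line.strip()
    # Skip blanks and comment lines, then split on the first '='.
    if line and not line.startswith("#") and "=" in line:
        key, _, value = line.partition("=")
        os.environ[key.strip()] = value.strip()

print(os.environ["GROQ_API_KEY"])
```

Note that the real load_dotenv() is more careful than this sketch: by default it will not overwrite variables that are already set in the environment.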

Brave Search API Key

This tutorial uses the Brave Search MCP Server.

  1. Get your Brave Search API key from: Brave Search API
  2. Create a file named mcp.json in the project root with the following content:
{
  "mcpServers": {
    "brave-search": {
      "command": "npx",
      "args": (
        "-y",
        "@modelcontextprotocol/server-brave-search"
      ),
      "env": {
        "BRAVE_API_KEY": ""
      }
    }
  }
}

Replace the empty string with your actual Brave API key.
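Because a malformed mcp.json is a common source of startup errors, a quick sanity check with the standard library can catch typos before you run the agent. The snippet below parses an inline copy of the config above; point json.load at the real file in practice:

```python
import json

# An inline copy of mcp.json; use json.load(open("mcp.json")) for the real file.
config_text = """
{
  "mcpServers": {
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {"BRAVE_API_KEY": ""}
    }
  }
}
"""

# json.loads raises ValueError on syntax errors such as stray parentheses.
config = json.loads(config_text)
for name, server in config["mcpServers"].items():
    print(f"server: {name}, command: {server['command']}")
```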

Node.js

Some MCP servers (including Brave Search) require npx, which ships with Node.js.

  • Download the latest version of Node.js from nodejs.org
  • Run the installer.
  • Leave all settings at their defaults and complete the installation.

Using other servers

If you'd like to use a different MCP server, simply replace the contents of mcp.json with the configuration for that server.
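For example, the official filesystem server can be configured the same way; this is a sketch, and the directory path below is a placeholder you should change to a folder you want to expose:

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/path/to/allowed/dir"
      ]
    }
  }
}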

Create an app.py file in the directory and add the following content:

Importing the libraries

from dotenv import load_dotenv
from langchain_groq import ChatGroq
from mcp_use import MCPAgent, MCPClient
import os
import sys
import warnings

warnings.filterwarnings("ignore", category=ResourceWarning)

This section imports the modules required for LangChain, MCP-Use, Groq, and environment-variable loading. It also suppresses ResourceWarning messages for cleaner output.

Setting up the chatbot

async def run_chatbot():
    """Run a chat session using MCPAgent's built-in conversation memory."""
    load_dotenv()
    os.environ["GROQ_API_KEY"] = os.getenv("GROQ_API_KEY")

    configFile = "mcp.json"
    print("Starting chatbot...")

    # Creating the MCP client and LLM instance
    client = MCPClient.from_config_file(configFile)
    llm = ChatGroq(model="llama-3.1-8b-instant")

    # Creating an agent with memory enabled
    agent = MCPAgent(
        llm=llm,
        client=client,
        max_steps=15,
        memory_enabled=True,
        verbose=False
    )

This section loads the Groq API key from the .env file and initializes the MCP client using the configuration provided in mcp.json. It then sets up the LangChain Groq LLM and creates a memory-enabled agent to handle conversations.

Implementing the chatbot

# Add this inside the run_chatbot function
    print("\n-----Interactive MCP Chat----")
    print("Type 'exit' or 'quit' to end the conversation")
    print("Type 'clear' to clear conversation history")

    try:
        while True:
            user_input = input("\nYou: ")

            if user_input.lower() in ("exit", "quit"):
                print("Ending conversation....")
                break

            if user_input.lower() == "clear":
                agent.clear_conversation_history()
                print("Conversation history cleared....")
                continue

            print("\nAssistant: ", end="", flush=True)

            try:
                response = await agent.run(user_input)
                print(response)

            except Exception as e:
                print(f"\nError: {e}")

    finally:
        if client and client.sessions:
            await client.close_all_sessions()

This section enables interactive chatting, allowing the user to enter queries and receive responses from the assistant. It also supports clearing the chat history on request. The assistant's responses are displayed in real time, and the finally block ensures that all MCP sessions are closed cleanly when the conversation ends or is interrupted.

Running the app

if __name__ == "__main__":
    import asyncio
    try:
        asyncio.run(run_chatbot())
    except KeyboardInterrupt:
        print("Session interrupted. Goodbye!")

    finally:
        sys.stderr = open(os.devnull, "w")

This section runs the asynchronous chatbot loop, managing continuous interaction with the user. It also handles keyboard interrupts gracefully, ensuring the program exits without errors when the user terminates the session.
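Stripped of the chatbot logic, the entrypoint above is just the standard asyncio.run() pattern wrapped in exception handling; the coroutine body here is a trivial placeholder standing in for run_chatbot():

```python
import asyncio

async def main():
    # Placeholder for the chatbot loop; a real app would await agent calls here.
    await asyncio.sleep(0)
    return "done"

try:
    result = asyncio.run(main())  # drives the coroutine to completion
    print(result)
except KeyboardInterrupt:
    # Ctrl+C inside the loop lands here, so the program exits cleanly.
    print("Session interrupted. Goodbye!")
```

asyncio.run() creates the event loop, runs the coroutine until it finishes, and closes the loop, which is why it should be called exactly once at the top level.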

You can find the complete code here.

To run the app, run the following command:

uv run app.py

This will start the app, and you can interact with the chatbot and use the server for the session.


I'm a Civil Engineering graduate (2022) from Jamia Millia Islamia, New Delhi, and I have a keen interest in Data Science, especially Neural Networks and their application in various areas.
