
The Model Context Protocol (MCP) is an emerging standard for structuring and managing contextual information provided to Large Language Models (LLMs), enabling them to operate more intelligently and consistently across sessions, tools, and multi-agent systems. Rather than relying solely on raw prompts, MCP introduces a structured format that includes user identity, task goals, memory state, available tools, and environmental data—allowing models to behave more like persistent, context-aware agents. By formalizing how context is encoded and shared, MCP facilitates more robust interactions, seamless tool integration, and stateful behavior in LLM-driven applications and workflows.
In this article, we'll walk through running a sample MCP server built with FastMCP. The server is backed by an SQLite database and lets you interact with it using natural language. You'll learn how the protocol works, explore the server's architecture, and use tools like Claude Desktop to query and operate on the database.
Get Started Right Away
The code for the MCP server and the sample SQLite database is available in this GitHub repository. To set up the server locally, follow the steps below in your system terminal.
1. Clone the GitHub repository.

   ```console
   $ git clone https://github.com/vultr-marketing/code-samples.git
   ```

2. Navigate into the `sample_sqlite_mcp` directory.

   ```console
   $ cd sample_sqlite_mcp
   ```

3. Create a `.env` file.

   ```console
   $ nano .env
   ```

4. Add the following variables in the file.

   ```
   DB_PATH=sample.db
   MCP_PORT=8080
   READ_ONLY=true
   ```

   - `DB_PATH`: Path to your SQLite database file.
   - `MCP_PORT`: Port used by the MCP server (not used in stdio mode, but good to define).
   - `READ_ONLY`: Set to `true` to block insert/update/delete queries for safety.

5. Save and close the file.

6. Install dependencies.

   ```console
   $ pip3 install -r requirements.txt
   ```
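To give a sense of how these settings reach the server, the sketch below shows one common way to load them with the python-dotenv package. The exact loading code in the repository's `server.py` may differ; the variable names mirror the `.env` file above.

```python
# Minimal sketch of environment-based configuration (the actual server.py may differ).
import os

from dotenv import load_dotenv  # provided by the python-dotenv package

load_dotenv()  # read variables from the .env file into the process environment

DB_PATH = os.getenv("DB_PATH", "sample.db")                    # SQLite database file
MCP_PORT = int(os.getenv("MCP_PORT", "8080"))                  # unused in stdio mode, kept for completeness
READ_ONLY = os.getenv("READ_ONLY", "true").lower() == "true"   # block writes when true
```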
Understand the Sample MCP Server
The server script launches a Model Context Protocol (MCP) server using FastMCP and connects it to a local SQLite database. It enables structured interaction with the database using natural language, typically via an LLM interface.
Key Features
- Environment-Based Configuration: Uses `.env` for settings like `DB_PATH`, `MCP_PORT`, and `READ_ONLY`.
- Async Database Access: Handles SQLite operations asynchronously with `aiosqlite`.
- Sample DB Initialization: Creates sample `vultr_products` and `vultr_product_pricing` tables if the database doesn't exist.
- Query Validation: Prevents dangerous SQL operations, especially when `READ_ONLY` is enabled.
- Structured Logging: Provides detailed startup and runtime logs for debugging.
Transport Layer: stdio
The server uses the `stdio` transport, allowing it to communicate over standard input and output. This is especially useful when integrating with desktop agents like Claude Desktop, enabling lightweight inter-process communication without a web server.
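To make the transport concrete, here is a minimal sketch of a FastMCP server running over stdio. It assumes the `FastMCP` class from the MCP Python SDK (`mcp.server.fastmcp`) and a placeholder `ping` tool; the repository's `server.py` may use the standalone `fastmcp` package or structure things differently.

```python
# Minimal sketch of a FastMCP server using the stdio transport (the sample server.py may differ).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("sqlite")  # server name shown to connecting clients


@mcp.tool()
def ping() -> str:
    """Trivial example tool so the server exposes at least one capability."""
    return "pong"


if __name__ == "__main__":
    # stdio transport: the client (e.g. Claude Desktop) launches this process
    # and exchanges MCP messages over stdin/stdout, with no web server involved.
    mcp.run(transport="stdio")
```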
Available Tools in the Sample MCP Server
Tools are structured, callable functions that a language model can use to interact with external systems like databases, APIs, or services. Each tool has a defined input schema and a predictable output format, enabling the model to perform specific operations safely and effectively—like querying data, listing tables, or modifying records.
Tools extend the capabilities of LLMs beyond simple text generation, allowing them to reason over structured data, fetch real-time information, or take actions based on user intent.
| Tool Name | Description |
|---|---|
| `execute_query` | Executes a validated SQL SELECT query and returns the result set. |
| `list_tables` | Lists all user-defined tables in the SQLite database. |
| `describe_table` | Returns the schema (columns, types, constraints) of a specified table. |
| `count_rows` | Counts and returns the number of rows in a specified table. |
| `insert_sample_data` | Inserts predefined Vultr-related data into tables (only if not read-only). |
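As an illustration of how such a tool can be wired up, the sketch below shows an `execute_query`-style tool that rejects non-SELECT statements when read-only mode is on and runs the query asynchronously with `aiosqlite`. The constants and validation logic here are assumptions for the example; the actual implementations in the repository may differ.

```python
# Sketch of an execute_query-style tool; the repository's implementation may differ.
import aiosqlite
from mcp.server.fastmcp import FastMCP

DB_PATH = "sample.db"   # normally taken from the .env configuration
READ_ONLY = True        # normally taken from the .env configuration

mcp = FastMCP("sqlite")


@mcp.tool()
async def execute_query(sql: str) -> list[dict]:
    """Run a validated SQL query against the SQLite database and return rows as dicts."""
    statement = sql.strip().rstrip(";")

    # Basic validation: in read-only mode, allow only SELECT statements.
    if READ_ONLY and not statement.lower().startswith("select"):
        raise ValueError("Read-only mode: only SELECT queries are allowed.")

    async with aiosqlite.connect(DB_PATH) as db:
        db.row_factory = aiosqlite.Row          # access columns by name
        async with db.execute(statement) as cursor:
            rows = await cursor.fetchall()
            return [dict(row) for row in rows]
```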
Use Claude Desktop to Interact
Download the Claude Desktop application. It automatically detects MCP servers defined in its configuration file and makes them accessible for use within any conversation.
1. Start the Claude Desktop application.

2. Locate the settings option in the application.

3. Select Developer settings.

4. Click Edit Config.

5. Add the following configuration snippet to your Claude Desktop config file (a filled-in example is shown after these steps).

   ```json
   {
     "mcpServers": {
       "sqlite": {
         "command": "<path to uv executable>",
         "args": [
           "--directory",
           "<path to directory containing server.py>",
           "run",
           "server.py"
         ]
       }
     }
   }
   ```

6. Once the new configuration is in place, restart the Claude Desktop application.

7. To confirm the server is running, go to the Developer settings in Claude Desktop and check that the status next to the JSON file shows as running.

8. Make sure the MCP server tools are available to the application.
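For orientation only, a filled-in configuration might look like the snippet below. The paths are hypothetical examples; replace them with the actual locations of the `uv` executable and the cloned `sample_sqlite_mcp` directory on your machine.

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "/Users/example/.local/bin/uv",
      "args": [
        "--directory",
        "/Users/example/code-samples/sample_sqlite_mcp",
        "run",
        "server.py"
      ]
    }
  }
}
```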
Ask Questions and Perform Operations
You can run interactive queries by chatting naturally. Using the available tools, the LLM translates your requests into SQL queries, executes them on the sample database, and responds in plain language. For example, asking "How many tables are in the database?" would prompt the model to call the `list_tables` tool and summarize the result.
Future Scope
- Expanded Toolset: As MCP evolves, more sophisticated and diverse tools will be added to enhance the capabilities of LLMs, allowing for deeper integration with various systems, APIs, and databases.
- Better Context Management: The future of MCP will likely see improvements in how context is managed across sessions, enabling more personalized and consistent interactions between LLMs and users.
- Multi-Model Interoperability: MCP could facilitate seamless interaction between different AI models or even between AI and traditional software systems, opening the door for more complex workflows.
- Still Evolving: MCP is an emerging technology, and as it continues to grow, we can expect to see significant improvements in both its functionality and adoption across different industries and applications.
Conclusion
In this article, we've explored how the Model Context Protocol (MCP) offers a framework for managing context and interacting with data in a more structured and intuitive way. By setting up a sample MCP server using FastMCP and SQLite, we've demonstrated how the protocol enables natural language interactions with databases through LLMs.