XiYan MCP Server
A Model Context Protocol (MCP) server that enables natural language queries to databases
powered by XiYan-SQL, SOTA of text-to-sql on open benchmarks
XiYan-mcp-server |
XiYan-SQL |
Arxiv |
PapersWithCode
English | 中文
Features
- Fetch data by natural language through XiYanSQL
- List available MySQL tables as resources
- Read table contents
Tool Preview
- The tool `get_data_via_natural_language` provides a natural language interface for retrieving data from a database. The server converts the input natural language into SQL using a built-in model, queries the database, and returns the results.
- The `mysql://{table_name}` resource returns a portion of sample data from the specified table, for model reference.
- The `mysql://` resource lists the names of the current databases.
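Under the hood, MCP tools are invoked with JSON-RPC 2.0 messages. As an illustration of the wire format only (the argument key `query` is a hypothetical name, not taken from this README; check the server's tool schema for the real one), a `tools/call` request for the tool above could be built like this:

```python
import json

def build_tool_call(request_id: int, question: str) -> str:
    """Sketch of an MCP JSON-RPC 2.0 `tools/call` request.

    NOTE: the argument key "query" is an assumption for illustration;
    the server's advertised tool schema defines the real parameter name.
    """
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "get_data_via_natural_language",
            "arguments": {"query": question},
        },
    }
    return json.dumps(request)

print(build_tool_call(1, "How many users signed up last week?"))
```

In practice an MCP client (e.g. Claude Desktop, configured below) sends such messages over the server's stdio transport, so you never construct them by hand.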
Installation
Installing from pip
Python 3.11+ is required. You can install the server through pip; this installs the latest version:
```bash
pip install xiyan-mcp-server
```
After that, you can run the server directly:

```bash
python -m xiyan_mcp_server
```
However, it provides no functionality until you complete the following configuration, which yields a yml file. You can then run the server with:

```bash
env YML=path/to/yml python -m xiyan_mcp_server
```
Installing from Smithery.ai
See @XGenerationLab/xiyan_mcp_server
Not fully tested.
Configuration
You need a yml config file to configure the server. A default config file is provided in config_demo.yml and looks like this:
```yaml
model:
  name: "pre-xiyan_multi_dialect_v3"
  key: ""
  url: "https://poc-dashscope.aliyuncs.com/compatible-mode/v1"
database:
  host: "localhost"
  port: 3306
  user: "root"
  password: ""
  database: ""
```
LLM Configuration
`name` is the name of the model to use, `key` is the API key for the model, and `url` is the model's API endpoint. We support the following models.
General LLMs
If you want to use a general LLM, e.g. GPT-3.5, configure it like this:
```yaml
model:
  name: "gpt-3.5-turbo"
  key: "YOUR KEY"
  url: "https://api.openai.com/v1/chat/completions"
database:
```
If you want to use Qwen from Alibaba, e.g. Qwen-Max:
```yaml
model:
  name: "qwen-max"
  key: "YOUR KEY"
  url: "https://dashscope.aliyuncs.com/compatible-mode/v1"
database:
```
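Both of these endpoints speak the OpenAI-compatible chat-completion protocol. As a rough sketch of how the three config fields map onto an HTTP request (an assumption about the standard OpenAI wire format for illustration, not the server's actual code), using only the Python standard library:

```python
import json
import urllib.request

def chat_completion_request(name: str, key: str, url: str,
                            prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request
    from the `name`/`key`/`url` fields of the yml config."""
    payload = {
        "model": name,  # config: model.name
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url,  # config: model.url
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {key}",  # config: model.key
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = chat_completion_request(
    "gpt-3.5-turbo", "YOUR KEY",
    "https://api.openai.com/v1/chat/completions", "hello")
print(req.full_url)
```

The point is simply that `name`, `key`, and `url` fill the model field, the bearer token, and the request target respectively; any provider exposing this format should slot into the config the same way.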
Text-to-SQL SOTA model
Finally, we recommend XiYanSQL-qwencoder-32B (https://github.com/XGenerationLab/XiYanSQL-QwenCoder), the SOTA model for text-to-SQL.
We have deployed the model on Alibaba Cloud DashScope, so you need to set the following configuration. Contact us (godot.lzl@alibaba-inc.com) to get a key.
```yaml
model:
  name: "pre-xiyan_multi_dialect_v3"
  key: "KEY"
  url: "https://poc-dashscope.aliyuncs.com/compatible-mode/v1"
database:
```
Alternatively, you can deploy the model (XiYanSQL-qwencoder-32B) on your own server.
Local LLMs
To be supported in the future.
Database Configuration
`host`, `port`, `user`, `password`, and `database` are the connection parameters of the MySQL database.
You can use a local or any remote database. Currently we support MySQL (more dialects coming soon).
```yaml
database:
  host: "localhost"
  port: 3306
  user: "root"
  password: ""
  database: ""
```
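For intuition, these five fields combine into a standard MySQL connection URI, the same shape as the `mysql://` resource names above. A minimal sketch of that convention (not the server's internal code):

```python
def mysql_uri(host: str, port: int, user: str,
              password: str, database: str) -> str:
    """Assemble the yml database fields into a mysql:// connection URI."""
    return f"mysql://{user}:{password}@{host}:{port}/{database}"

print(mysql_uri("localhost", 3306, "root", "secret", "sales"))
# -> mysql://root:secret@localhost:3306/sales
```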
Launch
Claude desktop
Add this to your Claude Desktop config file:
```json
{
  "mcpServers": {
    "xiyan-mcp-server": {
      "command": "python",
      "args": ["-m", "xiyan_mcp_server"],
      "env": {
        "YML": "PATH/TO/YML"
      }
    }
  }
}
```
Citation
If you find our work helpful, feel free to cite us:
```bib
@article{xiyansql,
  title={A Preview of XiYan-SQL: A Multi-Generator Ensemble Framework for Text-to-SQL},
  author={Yingqi Gao and Yifu Liu and Xiaoxia Li and Xiaorong Shi and Yin Zhu and Yiming Wang and Shiqi Li and Wei Li and Yuntao Hong and Zhiling Luo and Jinyang Gao and Liyu Mou and Yu Li},
  year={2024},
  journal={arXiv preprint arXiv:2411.08599},
  url={https://arxiv.org/abs/2411.08599},
  primaryClass={cs.AI}
}
```