Demo • QuickStart • Alerts And Anomalies • Knowledge And Tools • FAQ • Community • Contributors
🦾 Build your personal database administrator (D-Bot) 🧑‍💻, which is good at reading documents, using various tools, and writing analysis reports!
On the online website (http://dbgpt.dbmind.cn), you can browse all the historical diagnosis results, the metrics used, and the detailed diagnosis processes.
- Upgrade the LLM-based diagnosis mechanism:
    - Task Dispatching -> Concurrent Diagnosis -> Cross Review -> Report Generation (downloadable)
- Add typical anomalies and alerts (Pigsty) 🔗 link
- An end-to-end framework is available! 🔗 link
- Support monitoring and optimization tools at multiple levels 🔗 link
    - Monitoring metrics (Prometheus)
    - Diagnosis knowledge retrieval (dbmind)
    - Logical query transformations (Calcite)
    - Index optimization algorithms (for PostgreSQL)
    - Physical operator hints (for PostgreSQL)
    - Backup and Point-in-time Recovery (Pigsty)
- Our vision papers are released (continuously updated):
    - LLM As DBA. [paper] [Chinese interpretation] [twitter] [slides]
    - DB-GPT: Large Language Model Meets Database. [paper]
This project is evolving with new features.
Don't forget to star ⭐ and watch 👀 to stay up to date :)
.
├── multiagents
│   ├── agent_conf                   # Settings of each agent
│   ├── agents                       # Implementation of different agent types
│   ├── environments                 # E.g., chat orders / chat update / terminal conditions
│   ├── knowledge                    # Diagnosis experience from documents
│   ├── llms                         # Supported models
│   ├── memory                       # The content and summary of chat history
│   ├── reasoning_algorithms         # Available algorithms for single-LLM reasoning
│   ├── response_formalize_scripts   # Remove useless content from model responses
│   ├── tools                        # External monitoring/optimization tools for models
│   └── utils                        # Other functions (e.g., database/json/yaml operations)
├── web_service                      # Web services to view diagnostic reports
│   ├── backend                      # Web service backend
│   └── frontend                     # Web service frontend
└── webhook                          # Webhook that saves alert results to a file
We provide a local website to browse historical diagnosis reports and procedures. You can easily launch it with:
- Install dependencies for the first run.
# install frontend environment
cd web_service/frontend
rm -rf node_modules/
rm -f package-lock.json
# install dependencies for the first run (nodejs ^16.13.1 is recommended)
npm install --legacy-peer-deps
- Run the service:
# cd service directory
cd web_service
# launch the local server and open the website
sh run_service.sh
Modify the "python app.py" command within run_service.sh if multiple versions of Python are installed.
After successfully launching the local server, visit http://127.0.0.1:8025/ to browse the diagnosis reports.
- PostgreSQL v12 or higher
Make sure your database supports remote connections (link).
Additionally, install extensions like pg_stat_statements (tracks frequent queries), pg_hint_plan (optimizes physical operators), and hypopg (creates hypothetical indexes).
Note that pg_stat_statements continuously accumulates query statistics over time, so you need to clear them from time to time: 1) to discard all statistics, execute "SELECT pg_stat_statements_reset();"; 2) to discard the statistics of a specific query, execute "SELECT pg_stat_statements_reset(userid, dbid, queryid);".
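The extension setup above might look like the following sketch (exact paths and versions depend on your installation; note that pg_stat_statements, and typically pg_hint_plan, must be preloaded via shared_preload_libraries before they can be used):

```
# postgresql.conf -- preload shared libraries, then restart PostgreSQL
shared_preload_libraries = 'pg_stat_statements,pg_hint_plan'

-- afterwards, inside the target database:
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;
CREATE EXTENSION IF NOT EXISTS pg_hint_plan;
CREATE EXTENSION IF NOT EXISTS hypopg;
```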
- Enable slow query log in PostgreSQL (link)
(1) For "systemctl restart postgresql", the service name may differ (e.g., postgresql-12.service);
(2) Use an absolute log path, e.g., "log_directory = '/var/lib/pgsql/12/data/log'";
(3) Set "log_line_prefix = '%m [%p] [%d]'" in postgresql.conf (to record the database names of different queries).
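Taken together, the slow-query-log settings above might look like this in postgresql.conf (a sketch; the path and the log_min_duration_statement threshold are placeholders to adapt to your setup):

```
# postgresql.conf -- example slow query log settings
logging_collector = on
log_directory = '/var/lib/pgsql/12/data/log'   # absolute path (note 2)
log_line_prefix = '%m [%p] [%d]'               # record the database name (note 3)
log_min_duration_statement = 1000              # log statements slower than 1s (in ms)
```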
- Prometheus
Check prometheus.md for detailed installation guides.
Step 1: Install python packages.
pip install -r requirements.txt
Step 2: Configure environment variables.
- Export your OpenAI API key
# macos
export OPENAI_API_KEY="your_api_key_here"
# windows
set OPENAI_API_KEY=your_api_key_here
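As a quick sanity check before running the scripts, a small helper like the following (hypothetical, not part of the project) can fail fast with a clear message when the key is missing:

```python
import os
import sys

def require_api_key(var: str = "OPENAI_API_KEY") -> str:
    """Return the API key from the environment, or exit with a clear message."""
    key = os.environ.get(var, "").strip()
    if not key:
        sys.exit(f"{var} is not set; export it before running D-Bot.")
    return key

# usage: key = require_api_key()
```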
Step 3: Add database/anomaly/Prometheus settings into tool_config_example.yaml and rename it to tool_config.yaml:
```yaml
POSTGRESQL:
  host: 182.92.xxx.x
  port: 5432
  user: xxxx
  password: xxxxx
  dbname: postgres
DATABASESERVER:
  server_address: 182.92.xxx.x
  username: root
  password: xxxxx
  remote_directory: /var/lib/pgsql/12/data/log
PROMETHEUS:
  api_url: http://8.131.xxx.xx:9090/
  postgresql_exporter_instance: 172.27.xx.xx:9187
  node_exporter_instance: 172.27.xx.xx:9100
```
remote_directory in the DATABASESERVER setting indicates where the slow query log file is located (link).
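To catch missing fields early, a check along these lines could be run after loading tool_config.yaml (hypothetical helper, not part of the project; REQUIRED_KEYS mirrors the example sections above, and the config is assumed to be parsed into a dict, e.g., with PyYAML):

```python
# Required sections/keys, mirroring the tool_config.yaml example above.
REQUIRED_KEYS = {
    "POSTGRESQL": {"host", "port", "user", "password", "dbname"},
    "DATABASESERVER": {"server_address", "username", "password", "remote_directory"},
    "PROMETHEUS": {"api_url", "postgresql_exporter_instance", "node_exporter_instance"},
}

def missing_settings(config: dict) -> list:
    """Return 'SECTION.key' entries that are absent from the parsed config."""
    missing = []
    for section, keys in REQUIRED_KEYS.items():
        present = set(config.get(section) or {})
        missing += sorted(f"{section}.{k}" for k in keys - present)
    return missing
```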
- If accessing the OpenAI service via VPN, execute this command:
# macos
export https_proxy=http://127.0.0.1:7890 http_proxy=http://127.0.0.1:7890 all_proxy=socks5://127.0.0.1:7890
- Test your openai key
cd others
python openai_test.py
- Test a single case
python main.py
- Test in batch
python batch_main.py
We support AlertManager for Prometheus. You can find more information about how to configure AlertManager in alertmanager.md.
- We provide AlertManager-related configuration files, including alertmanager.yml, node_rules.yml, and pgsql_rules.yml, in the config folder in the root directory; you can deploy them to your Prometheus server to capture the associated exceptions.
- We also provide a webhook server that receives alerts. It lives in the webhook folder in the root directory; deploy it to your server to receive and store Prometheus alerts. The diagnostic model periodically fetches alert information from this server, and the stored file is retrieved over SSH, so you need to configure your server information in tool_config.yaml in the config folder.
- node_rules.yml and pgsql_rules.yml are adapted from the open-source project https://github.com/Vonng/pigsty, whose monitoring is done very well; we thank them for their effort.
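The webhook's job of persisting alerts can be sketched roughly as follows (hypothetical code, not the project's implementation; it assumes the standard Alertmanager webhook JSON payload, which carries an "alerts" array whose entries have a "status" field):

```python
import json
from pathlib import Path

def store_alerts(payload: str, out_file: str) -> int:
    """Append each firing alert from an Alertmanager webhook payload to
    out_file (one JSON object per line); return how many were stored."""
    body = json.loads(payload)
    firing = [a for a in body.get("alerts", []) if a.get("status") == "firing"]
    with Path(out_file).open("a", encoding="utf-8") as f:
        for alert in firing:
            f.write(json.dumps(alert) + "\n")
    return len(firing)
```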
We offer scripts that can trigger typical anomalies. Check out the different anomaly cases at http://dbgpt.dbmind.cn
Click to check 29 typical anomalies together with expert analysis (supported by the DBMind team)
Step 1: Rename doc2knowledge/config_template.json to doc2knowledge/config.json, and add the value for "api_key" ("organization" is optional).
GPT-4 is currently necessary to utilize the function-calling feature; we will try to remove this limitation.
Step 2: Split the document into separate section files by section index (e.g., section1, section1.1, section2 ...), and copy the section files into docs/<your_document_name>/raw/. For example:
.
└── docs
    └── report_example
        └── raw
            ├── 1 title.txt
            └── 1.1 category.txt
It is laborious work, and it is hard to find a better way than manually splitting the given document.
You can skip this step and directly run the report_example case.
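For simple documents whose headings start with a numeric index, a rough heuristic like this can pre-split the text into the layout shown above (hypothetical sketch; as noted, manual splitting is still more reliable, and body lines that happen to start with a number would be mis-detected):

```python
import re
from pathlib import Path

# Matches heading lines like "1 title" or "1.1 category".
SECTION_RE = re.compile(r"^(\d+(?:\.\d+)*) (.+)$", re.MULTILINE)

def split_sections(text: str, out_dir: str) -> list:
    """Split text at numbered heading lines and write each section to
    '<index> <title>.txt' under out_dir; return the file names written."""
    matches = list(SECTION_RE.finditer(text))
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    names = []
    for i, m in enumerate(matches):
        end = matches[i + 1].start() if i + 1 < len(matches) else len(text)
        name = f"{m.group(1)} {m.group(2).strip()}.txt"
        (out / name).write_text(text[m.start():end], encoding="utf-8")
        names.append(name)
    return names
```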
Step 3. Modify the arguments in doc2knowledge.py script and run the script:
python doc2knowledge.py
Summaries of the same document sections are cached. You can delete the cache file if you do not want to reuse previous results.
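The caching behavior can be pictured as follows (hypothetical sketch keyed by a content hash; the script's real cache layout may differ):

```python
import hashlib
import json
from pathlib import Path

def cached_summary(section_text: str, summarize, cache_file: str) -> str:
    """Return a cached summary for section_text, calling summarize() only on
    a cache miss; deleting cache_file forces re-summarization."""
    key = hashlib.sha256(section_text.encode("utf-8")).hexdigest()
    path = Path(cache_file)
    cache = json.loads(path.read_text()) if path.exists() else {}
    if key not in cache:
        cache[key] = summarize(section_text)
        path.write_text(json.dumps(cache))
    return cache[key]
```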
- Tool APIs (for optimization)

| Module | Functions |
| --- | --- |
| index_selection (equipped) | heuristic algorithm |
| query_rewrite (equipped) | 45 rules |
| physical_hint (equipped) | 15 parameters |

For functions within [query_rewrite, physical_hint], you can use the api_test.py script to verify their effectiveness. If a function actually works, append it to the api.py of the corresponding module.
🤨 The '.sh' script command cannot be executed on Windows.
Switch the shell to *git bash* or use *git bash* to execute the '.sh' script.

🤨 "No module named 'xxx'" on Windows.
This error is caused by issues with the Python runtime environment path. Perform the following steps:
Step 1: Check environment variables. Make sure the Python "Scripts" directory is included in your environment variables.
Step 2: Check IDE settings. For VS Code, install the Python extension. For PyCharm, specify the Python version for the current project.
- Project cleaning
- Support more anomalies
- Strictly constrain the LLM outputs (excessive irrelevant information) based on the matched knowledge
- Query log option (potential to take up disk space; needs careful consideration)
- Add more communication mechanisms
- Support more knowledge sources
- Localized model that reaches D-Bot (GPT-4)'s capability
- Support other databases (e.g., MySQL/Redis)
https://github.com/OpenBMB/AgentVerse
https://github.com/Vonng/pigsty
Feel free to cite us if you like this project.
@misc{zhou2023llm4diag,
      title={LLM As DBA},
      author={Xuanhe Zhou and Guoliang Li and Zhiyuan Liu},
      year={2023},
      eprint={2308.05481},
      archivePrefix={arXiv},
      primaryClass={cs.DB}
}
@misc{zhou2023dbgpt,
      title={DB-GPT: Large Language Model Meets Database},
      author={Xuanhe Zhou and Zhaoyan Sun and Guoliang Li},
      year={2023},
      journal={Data Science and Engineering}
}
Other Collaborators: Wei Zhou, Kunyi Li.
We thank all the contributors to this project. Do not hesitate if you would like to get involved or contribute!