🌐️ English | 中文
Traditional tabular data processing technologies, such as distributed databases, data warehouses, and data lakes, have been continuously evolving and gradually maturing. In comparison, graph-based data processing technologies (graph databases, graph computing engines) provide new ideas and methods, but also face issues such as low ecological maturity and high barriers to product usage. With the rise of large language models (LLMs), how to effectively combine artificial intelligence technology with graph computing technology (Graph + AI) will be a very worthwhile direction to explore. On one hand, we can leverage cutting-edge technologies like LLMs and agents to lower the barriers to using graph computing products and enhance the user experience with graphs. On the other hand, graph computing technology can fully utilize the performance and interpretability advantages of graph data structures in relational analysis scenarios, assisting LLMs and intelligent agents in improving reasoning capabilities and generation quality.
Chat2Graph builds a multi-agent system (MAS) on top of a graph database to deliver intelligent capabilities in research and development, operations and maintenance, Q&A, generation, and more. It helps users, developers, product managers, solution architects, operations engineers, and others use graph databases efficiently, lowers the barriers to adopting graphs, and accelerates content generation, enabling "dialogue with graphs". At the same time, by leveraging the inherent advantages of graph data structures in relationship modeling and interpretability, it enhances key agent capabilities such as reasoning, planning, memory, and tool use, achieving a deep integration of graph computing technology and artificial intelligence technology.
[Demo video: Chat2Graph.mp4]
Chat2Graph currently provides basic capabilities of intelligent agent systems, but there are still many features that need to be improved together with the community.
- Reasoning && Planning
  - One-Active-Many-Passive hybrid multi-agent architecture.
  - Dual-LLM reasoning machine combining fast & slow thinking.
  - Chain of agents (CoA) oriented task decomposition and graph planner.
  - Workflow auto-generation.
  - Action recommendation in operator.
  - Structured agent role management.
  - Agent task compiler.
- Memory && Knowledge
  - Hierarchical memory system.
  - Vector and graph knowledge base.
  - Knowledge refinement mechanism.
  - Environment management.
- Tool && System
  - Toolkit knowledge graph.
  - Toolkit graph optimizer.
  - Rich toolkit/MCP integration.
  - Unified resource manager.
  - Tracing and control capabilities.
  - Benchmark.
- Product && Ecosystem
  - Concise intelligent agent SDK.
  - Web service and interaction.
  - One-click configuration of agents.
  - Multimodal capabilities.
  - Production enhancement.
  - Integration with open-source ecosystems.
Prepare the required versions of Python and NodeJS.
- Install Python: Python == 3.10 recommended.
- Install NodeJS: NodeJS >= v16 recommended.
You can also use tools like conda to set up the Python environment.
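As a small convenience (not part of the Chat2Graph repository), the following sketch checks that the recommended toolchain is in place before building:

```python
# Sanity-check the local toolchain against the recommended versions.
# This helper script is an illustrative sketch, not part of Chat2Graph.
import shutil
import sys


def check_versions() -> list:
    """Return a list of warning strings for missing or unexpected tooling."""
    warnings = []
    if sys.version_info[:2] != (3, 10):
        found = ".".join(map(str, sys.version_info[:3]))
        warnings.append(f"Python {found} found; 3.10 is recommended")
    if shutil.which("node") is None:
        warnings.append("NodeJS not found on PATH; v16 or later is recommended")
    return warnings


if __name__ == "__main__":
    for warning in check_versions():
        print("WARNING:", warning)
```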
Build Chat2Graph as follows.

```bash
git clone https://github.com/TuGraph-family/chat2graph.git
cd chat2graph
./bin/build.sh
```
Then configure environment variables (e.g., LLM parameters) based on .env.template, and start Chat2Graph.

```bash
cp .env.template .env && vim .env
./bin/start.sh
```
When you see the following log:

```
Starting server...
Web resources location: /Users/florian/code/chat2graph/app/server/web
System database url: sqlite:////Users/florian/.chat2graph/system/chat2graph.db
Loading AgenticService from app/core/sdk/chat2graph.yml with encoding utf-8
Init application: Chat2Graph
Init the Leader agent
Init the Expert agents

  ____ _           _   ____   ____                 _
 / ___| |__   __ _| |_|___ \ / ___|_ __ __ _ _ __ | |__
| |   | '_ \ / _` | __| __) | |  _| '__/ _` | '_ \| '_ \
| |___| | | | (_| | |_ / __/| |_| | | | (_| | |_) | | | |
 \____|_| |_|\__,_|\__|_____|\____|_|  \__,_| .__/|_| |_|
                                            |_|

 * Serving Flask app 'bootstrap'
 * Debug mode: off
WARNING: This is a development server. Do not use it in a production deployment. Use a production WSGI server instead.
 * Running on all addresses (0.0.0.0)
 * Running on http://127.0.0.1:5010
 * Running on http://192.168.1.1:5010
Chat2Graph server started success ! (pid: 16483)
```
You can access Chat2Graph in the browser at http://localhost:5010/:
After registering the graph database to Chat2Graph in "Backend Manager", you can experience the complete "chat to graph" capabilities.
The graph databases currently supported (deployable via Docker) are:
- Neo4j

  ```bash
  docker pull neo4j:latest
  docker run -d -p 7474:7474 -p 7687:7687 --name neo4j-server --env NEO4J_AUTH=none \
      --env NEO4J_PLUGINS='["apoc", "graph-data-science"]' neo4j:latest
  ```
- TuGraph-DB

  Note: TuGraph-DB connectivity will be supported in the future.

  ```bash
  docker pull tugraph/tugraph-runtime-centos7:4.5.1
  docker run -d -p 7070:7070 -p 7687:7687 -p 9090:9090 --name tugraph-server \
      tugraph/tugraph-runtime-centos7:4.5.1 lgraph_server -d run --enable_plugin true
  ```
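Before registering the database in "Backend Manager", it can be useful to confirm the Bolt port is actually reachable. A minimal sketch (not part of Chat2Graph; host and port are the defaults from the docker commands above):

```python
# Probe whether a TCP port is accepting connections, e.g. the Bolt
# port (7687) exposed by the Neo4j / TuGraph-DB containers above.
# Illustrative helper only, not part of the Chat2Graph codebase.
import socket


def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


if __name__ == "__main__":
    print("Bolt port reachable:", port_open("127.0.0.1", 7687))
```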
The Chat2Graph SDK provides a clear and concise API, allowing you to easily build and access your own agentic system.
You can quickly converse with the built-in Chat2Graph in the following ways.
```python
# Import paths follow the repository layout under app/core.
from app.core.common.system_env import SystemEnv
from app.core.model.message import TextMessage
from app.core.sdk.agentic_service import AgenticService

# Configure the LLM before loading the service.
SystemEnv.LLM_NAME = "gemini-2.0-flash"  # or gemini-2.5-flash-preview-04-17 recommended
SystemEnv.LLM_ENDPOINT = "https://generativelanguage.googleapis.com/v1beta/openai/"
SystemEnv.LLM_APIKEY = "<YOUR-GEMINI-API-KEY>"

mas = AgenticService.load()
question = TextMessage(payload="What is TuGraph?")
answer = mas.execute(question).get_payload()
```
At the same time, the SDK also provides asynchronous dialogue capabilities.
```python
job = mas.session().submit(question)
answer = job.wait().get_payload()
```
Of course, customizing your own intelligent agent is also allowed.
```python
mas = AgenticService("Chat2Graph")
mas.expert(name="Design Expert").workflow(
    (analysis_operator, concept_modeling_operator)
).build()
```
To facilitate the rapid configuration of agents, you can describe the details of the agent using a YAML file and load it directly.
```python
mas = AgenticService.load("app/core/sdk/chat2graph.yml")
```
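The actual schema is defined by the bundled app/core/sdk/chat2graph.yml. As a purely hypothetical illustration (the field names below are invented for the example, mirroring the SDK calls above, and may not match the real schema), such a file might look like:

```yaml
# Hypothetical sketch only -- consult app/core/sdk/chat2graph.yml
# in the repository for the actual schema.
app:
  name: "Chat2Graph"

experts:
  - profile:
      name: "Design Expert"
    workflow:
      - [analysis_operator, concept_modeling_operator]
```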
For details, please refer to User Manual.
You can refer to the Contributing document and submit GitHub Issues/PRs to provide feedback and suggest improvements for Chat2Graph.
TuGraph establishes a clear Architecture and Roles for the community, and will invite outstanding contributors to join SIGs.
You can contact us directly through the TuGraph Discord and WeChat group provided below.
- Discord: https://discord.gg/KBCFbNFj
- WeChat: