Commit c429388: mcp update
1 parent 6fc5ac3

6 files changed: +50 -17 lines changed

docs/Integration-MCP.md (50 additions & 17 deletions)
@@ -9,8 +9,8 @@

In some cases, you may have a database, but neither the APIs nor the logic. GenAI-Logic API Logic Server can **mcp-ify existing databases** by:

-3. Creating JSON:APIs for existing databases with a single CLI command
-4. Enabling you to [declare business logic](Logic.md), which can be used via the APIs in MCP executipn flows.
+1. Creating JSON:APIs for existing databases with a single CLI command
+2. Enabling you to [declare business logic](Logic.md), which can be used via the APIs in MCP execution flows.
@@ -96,10 +96,15 @@ Discovery uses a config file `integration/mcp/mcp_server_discovery.json` to disc
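The discovery file's actual contents are not shown in this diff. As a rough sketch only, such a config might pair a server base URL with the resources and fields to expose; every key, resource name, and URL below is a hypothetical assumption, not the real file format:

```python
import json

# Hypothetical discovery config: resources to expose, plus the server base URL.
# The real integration/mcp/mcp_server_discovery.json may use different keys.
discovery_config = json.loads("""
{
    "base_url": "http://localhost:5656/api",
    "resources": {
        "Customer": {"fields": ["id", "name", "email", "email_opt_out"]},
        "Order":    {"fields": ["id", "customer_id", "amount_total", "date_shipped"]}
    }
}
""")

# You would typically trim this to only the desired data, reducing prompt size.
prompt_schema = {name: spec["fields"]
                 for name, spec in discovery_config["resources"].items()}
print(prompt_schema["Customer"])  # fields the LLM is allowed to see
```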
#### 2 - Tool Context from LLM

-We call the LLM, providing the NL Query and the schema returned above. It returns a Tool Context completion (response), with the steps to call in the MCP Server Executor, which here is the logic-enabled API Logic Server JSON:API.
+We call the LLM, providing the NL Query and the discovered schema returned above. Note the schema includes:

-![2-tool-context-from-LLM](images/integration/mcp/2-tool-context-from-LLM.png)
+1. resources: the schema itself
+    * You would typically edit this file to expose only the desired data, and to reduce the size of the prompt
+2. instructions on how to format the `expected_response` (e.g., `query_params`)
+3. how to use the **Request Pattern** to send email, subject to logic (see Logic, below)

+![2-tool-context-from-LLM](images/integration/mcp/2-tool-context-from-LLM.png) The LLM returns a Tool Context completion (response), with the steps to call the MCP Server Executor, which here is the logic-enabled API Logic Server JSON:API.
+![2-tool-context-from-LLM-d](images/integration/mcp/2-tool-context-from-LLM-d.png)
#### 3 - Invoke MCP Server

The calls include GET, and a POST for each returned row.
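The exact step format of the Tool Context is not shown in this diff. As a minimal sketch of the flow it describes (the Client Executor issues a GET, then a POST for each returned row), the step fields, resource names, and stubbed GET results below are all hypothetical:

```python
# Hypothetical Tool Context: the LLM's completion, listing the steps the
# Client Executor should run against the JSON:API. Field names are illustrative.
tool_context = {
    "steps": [
        {"method": "GET", "resource": "Customer",
         "query_params": {"filter[email_opt_out]": "false"}},
        {"method": "POST", "resource": "SysEmail",
         "body_template": {"subject": "Discount offer"}},
    ]
}

def stub_get(resource, query_params):
    """Stand-in for the real JSON:API GET call to the MCP Server Executor."""
    return [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]

posts = []  # collected here instead of actually POSTing to the server

get_step, post_step = tool_context["steps"]
rows = stub_get(get_step["resource"], get_step["query_params"])
for row in rows:  # "a POST for each returned row"
    body = dict(post_step["body_template"], customer_id=row["id"])
    posts.append((post_step["resource"], body))

print(len(posts))  # one POST per returned row
```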
@@ -111,36 +116,64 @@ The calls include GET, and a POST for each returned row.

MCP is capable of executing email directly, but we have business policies providing for email opt-outs. We must respect this logic.

-As shown below, a common logic pattern is a `Request Object`: you insert it, it's business logic runs. Here, the logic checks the opt-out, and sends the mail:
+As shown below, a common logic pattern is a `Request Object`: you insert a row, triggering its business logic. Here, the logic (an *after_flush* event) checks the opt-out, and sends the mail (stubbed):

![3a-email-logic](images/integration/mcp/3a-email-logic.png)
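The screenshot above shows the project's actual rule. Framework aside, the Request Object pattern itself can be sketched in plain Python: inserting a row *is* the request, and the row's logic decides whether to act. The `SysEmail` name, the opt-out field, and the event hook below are assumptions for illustration, not the project's real code:

```python
from dataclasses import dataclass

# Hypothetical customer table, including the opt-out business policy flag.
CUSTOMERS = {1: {"name": "Alice", "email_opt_out": False},
             2: {"name": "Bob",   "email_opt_out": True}}

sent = []  # stub outbox: real logic would call a mail service here

@dataclass
class SysEmail:
    """Request Object: inserting a row is the request to send an email."""
    customer_id: int
    subject: str

def send_email_if_opted_in(row: SysEmail):
    """Plays the role of the after_flush event: logic runs on insert."""
    if not CUSTOMERS[row.customer_id]["email_opt_out"]:
        sent.append((row.customer_id, row.subject))  # stubbed send

def insert(row: SysEmail):
    """Stand-in for a JSON:API POST: persist the row, then fire its logic."""
    send_email_if_opted_in(row)

insert(SysEmail(customer_id=1, subject="Discount offer"))
insert(SysEmail(customer_id=2, subject="Discount offer"))
print(sent)  # Bob opted out, so only Alice's email is sent
```

The point of the pattern is that the caller (here, the MCP Client Executor) never needs to know the opt-out rule; it simply inserts rows, and the logic enforces the policy.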

120-
### Status: Work In Progress
125+
## Appendix: Status - Work In Progress
121126

122-
This is an initial experiment, with a number of ToDo's: real filtering, update, etc. That said, it might be an excellent way to explore MCP.
127+
This is an initial experiment, with significant ToDo's (security, etc). That said, it might be an excellent way to explore MCP.
123128

124129
We welcome participation in this exploration. Please contact us via [discord](https://discord.gg/HcGxbBsgRF).
125130

126-
This exploration is changing rapidly. For updates, replace `integration/mcp` from [integration/msp](https://github.com/ApiLogicServer/ApiLogicServer-src/tree/main/api_logic_server_cli/prototypes/nw_no_cust/integration/mcp){:target="_blank" rel="noopener"}.
127-
128131
 
## Appendix: MCP Background

For more information:

-- [see MCP Introduction](https://modelcontextprotocol.io/introduction)
+1. [see MCP Introduction](https://modelcontextprotocol.io/introduction)
+2. [and here](https://apilogicserver.github.io/Docs/Integration-MCP/)
+3. [and here](https://www.youtube.com/watch?v=1bUy-1hGZpI&t=72s)
+4. and this [N8N link](https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/?utm_source=n8n_app&utm_medium=node_settings_modal-credential_link&utm_campaign=%40n8n%2Fn8n-nodes-langchain.mcpTriggerlangchain.mcpTriggerlangchain.mcpTrigger)
+5. and this [python sdk](https://github.com/modelcontextprotocol/python-sdk)
+6. and [this video](https://www.youtube.com/shorts/xdMVgZfZ1yg)

+## Appendix: Key Clarifications
+
+MCP is a new technology. During my learning curve, I found much of the information somewhat vague and, in some cases, misleading. The sections below identify key clarifications to incorrect assumptions I had made, in hopes they can help you.
-- [and here](https://apilogicserver.github.io/Docs/Integration-MCP/)
+### App-Specific Client Executor
+
+Several articles described the Client Executor as a "host such as Claude". That led me to believe that common chat apps could call MCPs.
+
+Later, I discovered that most chat apps cannot make http calls, and so cannot directly call MCPs. The Client Executor is analogous to a chat app, but is written specifically for MCP use.
+
+### Client Executor (not LLM) calls the MCP
+
+I saw several diagrams with arrows from the LLM to the MCP. That led me to believe that the LLM *calls* the MCP.
+
+Later, I realized that the LLM just prepares the Tool Context; the Client Executor uses it to invoke the MCP. I now think of the arrow as "knows how to include it in the Tool Context".

-- [and here](https://www.youtube.com/watch?v=1bUy-1hGZpI&t=72s)
+### Server Executor == *logic-enabled* APIs

-- and this [N8N link](https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/?utm_source=n8n_app&utm_medium=node_settings_modal-credential_link&utm_campaign=%40n8n%2Fn8n-nodes-langchain.mcpTriggerlangchain.mcpTriggerlangchain.mcpTrigger)
+Finally, I presumed that the Client Executor would pass the Tool Context to the LLM. This was simply a bad guess.

-- and this [python sdk](https://github.com/modelcontextprotocol/python-sdk)
+The key idea is that a specific Server Executor is not aware it is part of an orchestration. In the case of database APIs, the Server Executor is the set of logic-enabled endpoints identified in the discovery schema.

-- and [this video](https://www.youtube.com/shorts/xdMVgZfZ1yg)
+Note the logic here is critical. The Client Executor cannot (and should not) "reach in" and be aware of logic specific to each involved database.