
Commit 5897d9e

Update all prompts to use the extractor naming convention
1 parent 6609cfa commit 5897d9e

File tree

13 files changed (+94 −61 lines)


functions/docker_scout_tag_recommendation/init.clj (+13 −2)

@@ -1,14 +1,25 @@
 (ns init
   (:require
    [babashka.fs :as fs]
-   [babashka.process :as process]
+   [babashka.http-client :as http]
    [cheshire.core :as json]
    [clojure.string :as string]))
 
+(def language-gateway-endpoint "https://api.scout.docker.com/v1/language-gateway")
+
+(defn- recommendation-request [image]
+  (http/post
+   (format "%s/%s" language-gateway-endpoint "image")
+   {:body (json/generate-string {:image image})}))
+
+(defn- recommendation-response [response]
+  response)
+
 (defn -command [& args]
   (try
     (let [repository (:repository (json/parse-string (second args) true))]
-      (println "22-slim"))
+      (println
+       ((comp recommendation-response recommendation-request) repository)))
     (catch Throwable t
       (binding [*out* *err*]
         (println t))
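The new `init.clj` posts the repository name to the Scout language gateway and, for now, prints the raw response via the identity-like `recommendation-response`. A hedged sketch of what a later version of that function might do, assuming the gateway returns a JSON body (the parsing step is an illustration, not part of this commit):

```clojure
;; sketch only: recommendation-response currently returns the raw response map.
;; babashka.http-client returns the body as a string under :body, so if the
;; gateway responds with JSON, extraction could look like this.
(defn- recommendation-response [response]
  (-> response
      :body
      (json/parse-string true)))  ;; keywordize keys, as elsewhere in this file
```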

prompts/clj-kondo/README.md (+5 −16)

@@ -1,6 +1,6 @@
 ---
 extractors:
-  - name: go-linguist
+  - name: linguist
   - name: project-facts
 model: gpt-4
 stream: true
@@ -23,38 +23,27 @@ functions:
 
 Ask about violations in a project that contains clojure code.
 
-## Running the tool
+## How to run
 
 ```sh
+# docker:command=clj-kondo
 docker run --rm \
   -it \
   -v /var/run/docker.sock:/var/run/docker.sock \
   --mount type=volume,source=docker-prompts,target=/prompts \
   --mount type=bind,source=$HOME/.openai-api-key,target=/root/.openai-api-key \
-  --mount type=bind,source=/Users/slim/docker/labs-ai-tools-for-devs/prompts,target=/app/local \
+  --mount type=bind,source=$PWD,target=/app/local \
   --workdir /app \
   vonwig/prompts:local run \
   --host-dir /Users/slim/docker/labs-ai-tools-for-devs \
   --user jimclark106 \
   --platform "$(uname -o)" \
-  --prompts local/clj-kondo \
+  --prompts local \
   --pat "$(cat ~/.secrets/dockerhub-pat-ai-tools-for-devs.txt)" \
   --thread-id "clj-kondo"
 ```
 
-```sh
-bb -m prompts run \
-  --host-dir /Users/slim/docker/labs-ai-tools-for-devs \
-  --user jimclark106 \
-  --platform "$(uname -o)" \
-  --prompts-dir prompts/clj-kondo \
-  --url http://localhost:11434/v1/chat/completions \
-  --model "llama3-groq-tool-use:latest" \
-  --thread-id "clj-kondo"
-```
-
 ## TODO
 
 - [ ] the clj-kondo function is downloading into an `\?/.m2` repository in the project root. We can't have this.
 
-
prompts/curl/README.md (+9 −9)

@@ -1,6 +1,6 @@
 ---
 extractors:
-  - name: go-linguist
+  - name: linguist
   - name: docker-lsp
 model: gpt-4
 stream: true
@@ -28,20 +28,20 @@ Also, what about defining outcomes and having the tool verify that we actually r
 
 At the end, we should report the command line, and the version of curl that we used.
 
-## Running the tool
+## How to run
 
 ```sh
-DIR=$PWD
+# docker:command=curl
 docker run --rm -it \
   -v /var/run/docker.sock:/var/run/docker.sock \
   --mount type=volume,source=docker-prompts,target=/prompts \
   --mount type=bind,source=$HOME/.openai-api-key,target=/root/.openai-api-key \
-  --mount type=bind,source=/Users/slim/docker/labs-make-runbook/prompts,target=/my-prompts \
-  --workdir /my-prompts \
+  --mount type=bind,source=$PWD,target=/app/local \
+  --workdir /app \
   vonwig/prompts:latest run \
-  $DIR \
-  $USER \
-  "$(uname -o)" \
-  project_type
+  --host-dir $PWD \
+  --user $USER \
+  --platform "$(uname -o)" \
+  --prompts-dir local
   # "github:docker/labs-make-runbook?ref=main&path=prompts/curl"
 ```

prompts/docker/020_user_prompt.md (+6 −6)

@@ -18,32 +18,32 @@ I'm logged in to Docker Hub as {{username}}
 
 My project has the following Dockerfiles:
 
-{{#project.dockerfiles}}
+{{#project-facts.dockerfiles}}
 --- Dockerfile ---
 Dockerfile at `./{{path}}` contains:
 
 ```dockerfile
 {{content}}
 ```
 
-{{/project.dockerfiles}}
+{{/project-facts.dockerfiles}}
 
 --- Docker Compose Files ---
 
-{{#project.composefiles}}
+{{#project-facts.composefiles}}
 --- Compose File ---
 Compose file at `./{{path}}` contains:
 
 ```composefile
 {{content}}
 ```
 
-{{/project.composefiles}}
-{{^project.composefiles}}
+{{/project-facts.composefiles}}
+{{^project-facts.composefiles}}
 
 I am not using Docker Compose in this project.
 
-{{/project.composefiles}}
+{{/project-facts.composefiles}}
 
 My project uses the following languages:
 
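The renamed mustache sections above assume the extractor's output is now nested under a `project-facts` key in the template context. A hypothetical sketch of the shape those sections would render against (field values are invented for illustration; only the key layout follows this commit):

```clojure
;; hypothetical context data - values are made up for illustration
{:platform "Darwin"
 :username "jimclark106"
 :project-facts {:files ["deps.edn"]
                 :dockerfiles [{:path "Dockerfile" :content "FROM node:22-slim"}]
                 :composefiles []}}
```

With this shape, `{{#project-facts.dockerfiles}}` iterates the vector under `[:project-facts :dockerfiles]`, and the inverted section `{{^project-facts.composefiles}}` fires when that vector is empty.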

prompts/docker/README.md (+18 −4)

@@ -3,10 +3,24 @@ extractors:
 - name: project-facts
 ---
 
-## Description
+# Background
 
-The prompts for docker rely only on the classic lsp project extraction function.
+Generate a docker runbook for a project.
 
-The output of running this container is a json document that will be merged into the
-context that is provided to the moustache template based prompts.
+## How to run
+
+```sh
+# docker:command=curl
+docker run --rm -it \
+  -v /var/run/docker.sock:/var/run/docker.sock \
+  --mount type=volume,source=docker-prompts,target=/prompts \
+  --mount type=bind,source=$HOME/.openai-api-key,target=/root/.openai-api-key \
+  --mount type=bind,source=$PWD,target=/app/local \
+  --workdir /app \
+  vonwig/prompts:latest run \
+  --host-dir ~/docker/lsp \
+  --user $USER \
+  --platform "$(uname -o)" \
+  --prompts-dir local
+```
 
prompts/dockerfiles/README.md (+1 −1)

@@ -1,6 +1,6 @@
 ---
 extractors:
-  - name: go-linguist
+  - name: linguist
 tool_choice: auto
 model: gpt-4
 stream: true

prompts/dockerfiles_llama3.1/README.md (+1 −1)

@@ -1,6 +1,6 @@
 ---
 extractors:
-  - name: go-linguist
+  - name: linguist
 tool_choice: auto
 model: gpt-4
 stream: true

prompts/dockerfiles_mistral-nemo/README.md (+1 −1)

@@ -1,6 +1,6 @@
 ---
 extractors:
-  - name: go-linguist
+  - name: linguist
 tool_choice: auto
 model: gpt-4
 stream: true
(unnamed file) (+1 −1)

@@ -1 +1 @@
-What is the recommended tag to use for the `hazy` repository? Use the output from the recommendation function to answer this question.
+What is the recommended tag to use for the `node` repository? Use the output from the recommendation function to answer this question.

prompts/recommended_tags/README.md (+4 −1)

@@ -8,7 +8,7 @@ functions:
 - name: docker_scout_tag_recommendation
 ---
 
-# Background
+# How to Run
 
 ```sh
 # docker:command=recommended-tags
@@ -20,3 +20,6 @@ bb -m prompts run \
   --nostream
 ```
 
+```clj
+(core.println "hey")
+```

runbook.md (+9 −0)

@@ -24,6 +24,15 @@ bb -m prompts --host-dir /Users/slim/docker/labs-ai-tools-for-devs \
   --pretty-print-prompts
 ```
 
+```sh
+bb -m prompts --host-dir /Users/slim/docker/labs-ai-tools-for-devs \
+   --user jimclark106 \
+   --platform darwin \
+   --prompts-dir prompts/project_type/ \
+   --pretty-print-prompts
+```
+
+
 ### Running prompts/dockerfiles Conversation Loops
 
 #### test prompts/project_type

src/prompts.clj (+23 −16)

@@ -29,10 +29,10 @@
 (medley/deep-merge
  {:platform platform
   :username user
-  :project {:files (-> project-facts :project/files)
-            :dockerfiles (-> project-facts :project/dockerfiles)
-            :composefiles (-> project-facts :project/composefiles)
-            :languages (-> project-facts :github/lingust)}
+  :project-facts {:files (-> project-facts :project-facts :project/files)
+                  :dockerfiles (-> project-facts :project-facts :project/dockerfiles)
+                  :composefiles (-> project-facts :project-facts :project/composefiles)
+                  :languages (-> project-facts :project-facts :github/lingust)}
  :languages (->> project-facts
                  :github/linguist
                  keys
@@ -68,9 +68,14 @@
 (-> container-definition
     (assoc :host-dir dir)))]
 (when (= 0 exit-code)
-  (case (:output-handler container-definition)
-    "linguist" (->> (json/parse-string pty-output keyword) vals (into []) (assoc {} :linguist))
-    (json/parse-string pty-output keyword)))))
+  (let [context
+        (case (:output-handler container-definition)
+          ;; we have one output-handler registered right now - it extracts the vals from a map
+          "linguist" (->> (json/parse-string pty-output keyword) vals (into []))
+          (json/parse-string pty-output keyword))]
+    (if-let [extractor-name (:name container-definition)]
+      {(keyword extractor-name) context}
+      context)))))
 (catch Throwable ex
   (warn
    "unable to run extractors \n```\n{{ container-definition }}\n```\n - {{ exception }}"
@@ -85,7 +90,8 @@
 (map (fn [m] (merge (registry/get-extractor m) m))))]
 (if (seq extractors)
   extractors
-  [{:image "docker/lsp:latest"
+  [{:name "project-facts"
+    :image "docker/lsp:latest"
     :entrypoint "/app/result/bin/docker-lsp"
     :command ["project-facts"
               "--vs-machine-id" "none"
@@ -172,7 +178,7 @@
 (jsonrpc/notify :message {:content (format "## (%s) sub-prompt" (:ref definition))})
 (let [{:keys [messages _finish-reason] :as m}
       (async/<!! (conversation-loop
-                  (assoc opts :prompts-dir (git/prompt-dir (:ref definition)))))]
+                   (assoc opts :prompts-dir (git/prompt-dir (:ref definition)))))]
 (jsonrpc/notify :message {:content (format "## (%s) end sub-prompt" (:ref definition))})
 (resolve (->> messages
               (filter #(= "assistant" (:role %)))
@@ -201,11 +207,12 @@
 (try
  (openai/openai
   (merge
-   m
-   {:messages prompts :stream stream}
+    m
+    {:messages prompts}
    (when (seq functions) {:tools functions})
    (when url {:url url})
-   (when model {:model model})) h)
+    (when model {:model model})
+    (when (and stream (nil? (:stream m))) {:stream stream})) h)
 (catch ConnectException _
  ;; when the conversation-loop can not connect to an openai compatible endpoint
  (async/>!! c {:messages [{:role "assistant" :content "I cannot connect to an openai compatible endpoint."}]
@@ -276,11 +283,11 @@
 [nil "--thread-id THREAD_ID" "use this thread-id for the next conversation"
  :assoc-fn (fn [m k v] (assoc m k v :save-thread-volume true))]
 [nil "--model MODEL" "use this model on the openai compatible endpoint"]
-[nil "--stream" "stream responses"
+ [nil "--stream" "stream responses"
  :id :stream
  :default true
  :assoc-fn (fn [m k _] (assoc m k true))]
-[nil "--nostream" "disable streaming responses"
+ [nil "--nostream" "disable streaming responses"
  :id :stream
  :assoc-fn (fn [m k _] (assoc m k false))]
 [nil "--debug" "add debug logging"]
@@ -377,8 +384,8 @@
 (System/exit 0))
 (let [cmd (apply command options arguments)]
 (alter-var-root
-  #'jsonrpc/notify
-  (fn [_] (partial (if (:jsonrpc options) jsonrpc/-notify jsonrpc/-println) options)))
+   #'jsonrpc/notify
+   (fn [_] (partial (if (:jsonrpc options) jsonrpc/-notify jsonrpc/-println) options)))
 ((if (:pretty-print-prompts options) output-prompts output-handler)
  (cmd))))))
 (catch Throwable t

test/prompts_t.clj (+3 −3)

@@ -24,7 +24,7 @@
 (->>
  (prompts/fact-reducer "/Users/slim/docker/labs-make-runbook"
                        {}
-                       {:name "go-linguist"
+                       {:name "linguist"
                         :image "vonwig/go-linguist:latest"
                         :command ["-json"]
                         :output-handler "linguist"
@@ -40,11 +40,11 @@
 (t/is
  (=
   (prompts/collect-extractors "prompts/docker")
-  '({:name "project-facts", :image "docker/lsp:latest", :entrypoint "/app/result/bin/docker-lsp", :command ["project-facts" "--vs-machine-id" "none" "--workspace" "/docker"]})))
+  '({:name "project-facts", :image "docker/lsp:latest", :entrypoint "/app/result/bin/docker-lsp", :command ["project-facts" "--vs-machine-id" "none" "--workspace" "/project"]})))
 (t/is
  (=
   (prompts/collect-extractors "prompts/dockerfiles")
-  '({:name "go-linguist", :image "vonwig/go-linguist:latest", :command ["-json"], :output-handler "linguist"}))))
+  '({:name "linguist", :image "vonwig/go-linguist:latest", :command ["-json"], :output-handler "linguist"}))))
 
 (comment
  (prompts/run-extractors
