Commit 3cc0dd0: Update markUnsafe.md (1 parent: b28940a)

File tree: 1 file changed (+2 additions, -2 deletions)


docs/markUnsafe.md

````diff
@@ -5,7 +5,7 @@ To flag input as unsafe, you can use the `markUnsafe` function. This is useful w
 ```js
 import Zen from "@aikidosec/firewall";
 import OpenAI from "openai";
-import { promises } from "fs/promises";
+import { readFile } from "fs/promises";
 
 const openai = new OpenAI();
 
@@ -46,7 +46,7 @@ Zen.markUnsafe(filepath);
 
 // This will be blocked if the LLM tries to perform path traversal
 // e.g. if filepath is "../../../etc/passwd"
-await fs.readFile(filepath);
+await readFile(filepath);
 ```
 
 This example shows how to protect against path traversal attacks when using OpenAI's function calling feature. The LLM might try to access sensitive files using path traversal (e.g., `../../../etc/passwd`), but Zen will detect and block these attempts.
````
