fix: max_tokens_per_request error when pushing many documents at once
Resolves timescale#481
kolaente committed Feb 14, 2025
1 parent 073fa5b commit 1e83b35
Showing 1 changed file with 8 additions and 0 deletions.
projects/pgai/pgai/vectorizer/embedders/openai.py
@@ -124,6 +124,14 @@ async def embed(
             if not isinstance(msg, str):
                 raise e
 
+            error_type: Any = body["type"]
+            if isinstance(error_type, str) and error_type == "max_tokens_per_request":
+                mid = len(documents) // 2
+
+                await self.embed(documents[: mid + 1])
+
+                return await self.embed(documents[mid + 1 :])
+
             m = openai_token_length_regex.match(msg)
             if not m:
                 raise e
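
For context: the fix handles OpenAI's "max_tokens_per_request" error by splitting the document batch in half and retrying each half, recursing until every request fits under the per-request token cap. Below is a minimal standalone sketch of that split-and-retry pattern, assuming the official openai Python client; embed_batch, the model name, and the concatenation of the two halves are illustrative assumptions, not pgai's actual code.

import openai


async def embed_batch(
    client: openai.AsyncOpenAI, documents: list[str]
) -> list[list[float]]:
    # Sketch only: recursively halve the batch when the API reports
    # that the combined inputs exceed the per-request token limit.
    try:
        response = await client.embeddings.create(
            model="text-embedding-3-small", input=documents
        )
        return [item.embedding for item in response.data]
    except openai.BadRequestError as e:
        body = e.body
        too_large = (
            isinstance(body, dict)
            and body.get("type") == "max_tokens_per_request"
        )
        if too_large and len(documents) > 1:
            mid = len(documents) // 2
            # Embed each half separately; recursion keeps splitting
            # until every sub-batch fits under the cap. A single
            # document that is still too large re-raises below.
            first = await embed_batch(client, documents[:mid])
            second = await embed_batch(client, documents[mid:])
            return first + second
        raise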