
Commit 6e81b12

Implement asyncio for db control plane (#442)
## Problem

Previous work implemented asyncio for the db data plane, and now we want to roll out a similar approach for the db control plane and inference as well.

## Solution

- Extract the request construction logic out of `Pinecone` and move it into a request factory (an illustrative sketch of this split follows the commit message)
- Implement `PineconeAsyncio` using the request factory so that most of the method-specific logic stays the same
- Add new integration tests that exercise the asyncio code path; these are mostly adapted from the existing serverless integration tests
- Update tests for the asyncio index client to reflect the new setup steps
- Refactor async context management to address log warnings emitted by aiohttp

## Usage

The async version of the client has some async setup/teardown related to the underlying aiohttp library. You can either use the `async with` syntax to have the async context managed for you automatically, or, if you prefer, take responsibility for closing the async context yourself by calling `close()`.

#### Context management option 1: Using `async with`

```python
import asyncio
from pinecone import (
    PineconeAsyncio,
    ServerlessSpec,
    CloudProvider,
    AwsRegion,
)

async def main():
    async with PineconeAsyncio(api_key="key") as pc:
        await pc.create_index(
            name="my-index",
            metric="cosine",
            spec=ServerlessSpec(
                cloud=CloudProvider.AWS,
                region=AwsRegion.US_EAST_1
            ),
        )

asyncio.run(main())
```

#### Context management option 2: Manually `close()`

```python
import asyncio
from pinecone import (
    PineconeAsyncio,
    ServerlessSpec,
    CloudProvider,
    AwsRegion,
)

async def main():
    pc = PineconeAsyncio(api_key="key")

    await pc.create_index(
        name="my-index",
        metric="cosine",
        spec=ServerlessSpec(
            cloud=CloudProvider.AWS,
            region=AwsRegion.US_EAST_1
        ),
    )

    await pc.close()  # <-- Don't forget to close the client when you are done making network calls

asyncio.run(main())
```

#### Sparse index example

```python
import asyncio
import random
from pinecone import (
    PineconeAsyncio,
    ServerlessSpec,
    CloudProvider,
    AwsRegion,
    Metric,
    VectorType,
    Vector,
    SparseValues,
)

async def main():
    async with PineconeAsyncio() as pc:
        # Create a sparse index
        index_name = "my-index2"
        if not await pc.has_index(index_name):
            await pc.create_index(
                name=index_name,
                metric=Metric.DOTPRODUCT,
                spec=ServerlessSpec(
                    cloud=CloudProvider.AWS,
                    region=AwsRegion.US_EAST_1
                ),
                vector_type=VectorType.SPARSE,
                tags={
                    "env": "testing",
                }
            )

        # Get the index host
        description = await pc.describe_index(name=index_name)

        # Make an index client
        async with pc.Index(host=description.host) as idx:
            # Upsert some sparse vectors
            await idx.upsert(
                vectors=[
                    Vector(
                        id=str(i),
                        sparse_values=SparseValues(
                            indices=[j for j in range(100)],
                            values=[random.random() for _ in range(100)]
                        )
                    )
                    for i in range(50)
                ]
            )

            # Query the index
            query_results = await idx.query(
                top_k=5,
                sparse_vector=SparseValues(
                    indices=[5, 10, 20],
                    values=[0.5, 0.5, 0.5]
                ),
            )
            print(query_results)

asyncio.run(main())
```

## Type of Change

- [x] New feature (non-breaking change which adds functionality)
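To make the request-factory idea concrete, here is a minimal, hypothetical sketch of the split: request construction is shared, and the sync and async clients differ only in how they execute the request. All class and method names below are illustrative stand-ins, not the SDK's actual internals.

```python
# A minimal, hypothetical sketch of the "request factory" idea described above.
# Names here are illustrative, not the real pinecone-python-client internals.
import asyncio
from dataclasses import dataclass


@dataclass
class CreateIndexRequest:
    name: str
    metric: str
    spec: dict


class RequestFactory:
    """Builds request objects; performs no network I/O."""

    @staticmethod
    def create_index(name: str, metric: str, spec: dict) -> CreateIndexRequest:
        # Shared validation and defaulting would live here.
        return CreateIndexRequest(name=name, metric=metric, spec=spec)


class SyncClient:
    def create_index(self, name, metric, spec):
        req = RequestFactory.create_index(name, metric, spec)
        return f"POST /indexes {req}"  # stand-in for a blocking HTTP call


class AsyncClient:
    async def create_index(self, name, metric, spec):
        req = RequestFactory.create_index(name, metric, spec)
        await asyncio.sleep(0)  # stand-in for an aiohttp-based call
        return f"POST /indexes {req}"


print(SyncClient().create_index("my-index", "cosine", {"serverless": {}}))
print(asyncio.run(AsyncClient().create_index("my-index", "cosine", {"serverless": {}})))
```

The point of the split, as described above, is that both clients can share the method-specific construction logic while swapping only the transport.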
1 parent 25d3d4b commit 6e81b12


46 files changed (+2676, -562 lines)

.github/workflows/testing-integration.yaml (+47, -27)
```diff
@@ -3,50 +3,41 @@ name: "Integration Tests"
   workflow_call: {}
 
 jobs:
-  data-plane-serverless:
+  db-data-serverless:
     name: Data plane serverless integration tests
     runs-on: ubuntu-latest
     strategy:
       fail-fast: false
       matrix:
         python_version: [3.9, 3.12]
         use_grpc: [true, false]
-        metric:
-          - cosine
-          # - euclidean
-          # - dotproduct
-        spec:
-          - '{ "serverless": { "region": "us-west-2", "cloud": "aws" }}'
     steps:
       - uses: actions/checkout@v4
       - uses: ./.github/actions/test-data-plane
         with:
-          DATADOG_API_KEY: '${{ secrets.DATADOG_API_KEY }}'
           python_version: '${{ matrix.python_version }}'
           use_grpc: '${{ matrix.use_grpc }}'
-          metric: '${{ matrix.metric }}'
-          spec: '${{ matrix.spec }}'
+          metric: 'cosine'
+          spec: '{ "serverless": { "region": "us-west-2", "cloud": "aws" }}'
           PINECONE_API_KEY: '${{ secrets.PINECONE_API_KEY }}'
           freshness_timeout_seconds: 600
           skip_weird_id_tests: 'true'
 
-  test-asyncio:
+  db-data-asyncio:
     name: Data plane asyncio
     runs-on: ubuntu-latest
     strategy:
       fail-fast: false
       matrix:
         python_version: [3.9, 3.12]
         use_grpc: [false, true]
-        spec:
-          - '{ "serverless": { "region": "us-west-2", "cloud": "aws" }}'
     steps:
       - uses: actions/checkout@v4
       - uses: ./.github/actions/test-asyncio
         with:
           python_version: '${{ matrix.python_version }}'
           use_grpc: '${{ matrix.use_grpc }}'
-          spec: '${{ matrix.spec }}'
+          spec: '{ "serverless": { "region": "us-west-2", "cloud": "aws" }}'
           PINECONE_API_KEY: '${{ secrets.PINECONE_API_KEY }}'
           freshness_timeout_seconds: 600
 
@@ -112,13 +103,11 @@
           DIMENSION: 10
           METRIC: 'cosine'
 
-  control-rest-serverless:
+  db-control-rest-serverless:
     name: control plane serverless
     runs-on: ubuntu-latest
     strategy:
       matrix:
-        pineconeEnv:
-          - prod
         testConfig:
           - python-version: 3.9 # Do one test run with 3.9 for sanity check
             pod: { environment: 'us-east1-gcp'}
@@ -136,21 +125,52 @@
       - name: Setup Poetry
         uses: ./.github/actions/setup-poetry
       - name: 'Run integration tests (REST, prod)'
-        if: matrix.pineconeEnv == 'prod'
         run: poetry run pytest tests/integration/control/serverless -s -vv
         env:
           PINECONE_DEBUG_CURL: 'true'
           PINECONE_API_KEY: '${{ secrets.PINECONE_API_KEY }}'
-          GITHUB_BUILD_NUMBER: '${{ github.run_number }}-p-${{ matrix.testConfig.python-version}}'
           SERVERLESS_CLOUD: '${{ matrix.testConfig.serverless.cloud }}'
           SERVERLESS_REGION: '${{ matrix.testConfig.serverless.region }}'
-      - name: 'Run integration tests (REST, staging)'
-        if: matrix.pineconeEnv == 'staging'
-        run: poetry run pytest tests/integration/control/serverless -s -vv
+
+  db-control-asyncio:
+    name: db control asyncio
+    runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        python_version:
+          - 3.9
+          - 3.12
+      fail-fast: false
+    steps:
+      - uses: actions/checkout@v4
+      - name: 'Set up Python ${{ matrix.python_version }}'
+        uses: actions/setup-python@v5
+        with:
+          python-version: '${{ matrix.python_version }}'
+      - name: Setup Poetry
+        uses: ./.github/actions/setup-poetry
+      - name: 'Run integration tests (asyncio, prod)'
+        run: poetry run pytest tests/integration/control_asyncio -s -vv
         env:
           PINECONE_DEBUG_CURL: 'true'
-          PINECONE_CONTROLLER_HOST: 'https://api-staging.pinecone.io'
-          PINECONE_API_KEY: '${{ secrets.PINECONE_API_KEY_STAGING }}'
-          GITHUB_BUILD_NUMBER: '${{ github.run_number }}-s-${{ matrix.testConfig.python-version}}'
-          SERVERLESS_CLOUD: '${{ matrix.testConfig.serverless.cloud }}'
-          SERVERLESS_REGION: '${{ matrix.testConfig.serverless.region }}'
+          PINECONE_API_KEY: '${{ secrets.PINECONE_API_KEY }}'
+
+  inference:
+    name: Inference tests
+    runs-on: ubuntu-latest
+    strategy:
+      matrix:
+        python_version: [3.9, 3.12]
+    steps:
+      - uses: actions/checkout@v4
+      - name: 'Set up Python ${{ matrix.python_version }}'
+        uses: actions/setup-python@v5
+        with:
+          python-version: '${{ matrix.python_version }}'
+      - name: Setup Poetry
+        uses: ./.github/actions/setup-poetry
+      - name: 'Run integration tests'
+        run: poetry run pytest tests/integration/inference -s -vv
+        env:
+          PINECONE_DEBUG_CURL: 'true'
+          PINECONE_API_KEY: '${{ secrets.PINECONE_API_KEY }}'
```
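The new `db-control-asyncio` job runs `pytest tests/integration/control_asyncio`. For orientation, a test in that style could look roughly like the sketch below; it uses pytest-asyncio's `@pytest.mark.asyncio` marker and only the `PineconeAsyncio` methods shown in the commit message, and is not taken from the actual test suite.

```python
# Hypothetical example of an asyncio control-plane test; the real suite lives under
# tests/integration/control_asyncio and is more elaborate than this sketch.
import os

import pytest

from pinecone import PineconeAsyncio, ServerlessSpec, CloudProvider, AwsRegion


@pytest.mark.asyncio
async def test_create_and_describe_index():
    async with PineconeAsyncio(api_key=os.environ["PINECONE_API_KEY"]) as pc:
        index_name = "asyncio-smoke-test"  # hypothetical index name
        if not await pc.has_index(index_name):
            await pc.create_index(
                name=index_name,
                metric="cosine",
                spec=ServerlessSpec(cloud=CloudProvider.AWS, region=AwsRegion.US_EAST_1),
            )
        description = await pc.describe_index(name=index_name)
        assert description.host  # the index should expose a host for data-plane calls
```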

pinecone/control/__init__.py (+1)
```diff
@@ -1,4 +1,5 @@
 from .pinecone import Pinecone
+from .pinecone_asyncio import PineconeAsyncio
 
 from .repr_overrides import install_repr_overrides
 
```
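For completeness, this one-line export is what makes the async client importable next to the sync one from `pinecone.control` (and, as the usage examples in the commit message assume, from the top-level `pinecone` package). A trivial check, assuming a package version that includes this commit:

```python
# Both client classes now resolve from the control subpackage after this change.
from pinecone.control import Pinecone, PineconeAsyncio

print(Pinecone.__name__, PineconeAsyncio.__name__)  # -> Pinecone PineconeAsyncio
```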
