Experiment with allowing standard schemas to validate endpoint values #4864
FWIW, I really like where your head is at in the second example you proposed.
Input/request validation is probably out of scope for a data fetching solution anyway. If inaccurate types are an issue in someone's application, it should probably be addressed at the source of the problem, not as a last-ditch effort right before the request is placed. The CPU overhead of validating all inputs of all requests probably isn't great either, especially if it occurs on the main thread. I definitely prefer the terminology …
true -
Here is a link to my initial idea pitch just so there is a paper trail: Will respond directly to this PR from now on for brainstorming.
It might be overkill, but I'm really enticed by the idea of performing schema validation in a web worker (off the main thread). I have seen brief main-thread lag when validating massive payloads from the server (10MB+). One of the projects I work on professionally is a Tableau-esque data viewer where users can view and interact with up to 500k geospatial results. Attached below is my rough attempt; the web worker was usable via a promise. Usage inside the API slice:
workerPromise.ts
workerLogic.ts
Getting a web worker to work with a variety of build tooling could be a pain though, particularly for something like this. My example above is obviously for a different use case (transforming a response instead of validating a payload), but I figure it's worth sharing.
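For a rough idea of the shape being described, here is a minimal sketch of a promise-wrapped worker, assuming a simple one-message-in/one-message-out protocol; `runInWorker` and the message shape are made up for illustration (the actual `workerPromise.ts`/`workerLogic.ts` may differ):

```ts
// Sketch of a promise wrapper around a one-shot worker exchange, so expensive
// work (transforming or validating a large payload) can be awaited from
// transformResponse or a queryFn without blocking the main thread.
export function runInWorker<TInput, TOutput>(
  worker: Worker,
  input: TInput,
): Promise<TOutput> {
  return new Promise<TOutput>((resolve, reject) => {
    const handleMessage = (event: MessageEvent<TOutput>) => {
      worker.removeEventListener('error', handleError)
      resolve(event.data)
    }
    const handleError = (event: ErrorEvent) => {
      worker.removeEventListener('message', handleMessage)
      reject(event.error ?? new Error(event.message))
    }
    worker.addEventListener('message', handleMessage, { once: true })
    worker.addEventListener('error', handleError, { once: true })
    worker.postMessage(input)
  })
}

// Inside the worker itself, the counterpart is roughly:
// self.onmessage = (event) => {
//   self.postMessage(doExpensiveTransform(event.data))
// }
```

An endpoint could then do something like `transformResponse: (raw) => runInWorker(worker, raw)`, keeping the heavy lifting off the main thread.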
As for the question about whether validation would occur before `transformResponse` or after: "I would imagine `responseSchema` validating what data came back from the server pre-`transformResponse`. If a `transformResponse` function is provided, perhaps the RTKQ endpoint will use the return type of their provided `transformResponse` function. If no `transformResponse` function is provided, the RTKQ endpoint return type will be whatever was provided as `responseSchema`." If …
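For what it's worth, in pipeline terms the ordering being discussed comes out roughly like this; a sketch of the intended flow rather than actual RTKQ internals, with `validate` standing in for running a schema's `~standard.validate` and throwing on failure:

```ts
// Illustrative declarations only, so the sketch type-checks on its own.
declare const baseQuery: (arg: unknown) => Promise<unknown>
declare const transformResponse: ((raw: unknown) => Promise<unknown>) | undefined
declare const rawResponseSchema: unknown
declare const responseSchema: unknown
declare function validate(schema: unknown, value: unknown): Promise<void>

async function runEndpoint(arg: unknown) {
  const raw = await baseQuery(arg)          // value straight from the server
  await validate(rawResponseSchema, raw)    // checks the untransformed value
  const data = transformResponse ? await transformResponse(raw) : raw
  await validate(responseSchema, data)      // checks the possibly transformed value
  return data
}
```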
Definitely not something that makes sense for us to look into, but we would just accept any standard schema compliant value, so you're welcome to make your own:

```ts
declare function validateInWorker(input: unknown): Promise<StandardSchemaV1.Result<string>>

const workerSchema: StandardSchemaV1<string> = {
  '~standard': {
    version: 1,
    vendor: "custom",
    validate: validateInWorker
  }
}
```

😄
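To flesh that out a little, here is a sketch of what the declared `validateInWorker` might do; the worker file name, message shape, and use of zod inside the worker are assumptions for illustration, not part of this PR:

```ts
import type { StandardSchemaV1 } from '@standard-schema/spec'

// Main thread: post the value to a dedicated worker and resolve with whatever
// standard-schema-shaped result it sends back. (Not concurrency-safe as written;
// a real version would correlate requests with ids.)
const validationWorker = new Worker(
  new URL('./validation.worker.ts', import.meta.url),
  { type: 'module' },
)

function validateInWorker(input: unknown): Promise<StandardSchemaV1.Result<string>> {
  return new Promise((resolve) => {
    validationWorker.addEventListener(
      'message',
      (event: MessageEvent<StandardSchemaV1.Result<string>>) => resolve(event.data),
      { once: true },
    )
    validationWorker.postMessage(input)
  })
}

// validation.worker.ts: run the real schema off the main thread and reply with
// { value } on success or { issues } on failure, matching StandardSchemaV1.Result.
// import { z } from 'zod'
// const schema = z.string()
// self.onmessage = (event) => {
//   const parsed = schema.safeParse(event.data)
//   self.postMessage(
//     parsed.success ? { value: parsed.data } : { issues: parsed.error.issues },
//   )
// }
```

The resulting `workerSchema` could then be passed anywhere an endpoint accepts a schema, e.g. `responseSchema: workerSchema`.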
All schemas are optional - the difficult thing is how `responseSchema` interacts with `transformResponse`. Currently I have it as `rawResponseSchema` checking the value straight from the base query and `responseSchema` checking the final (possibly transformed) value. The difficulty with that is that nothing stops you from providing a `rawResponseSchema` without a `transformResponse`:

```ts
build.query({
  query: ({ id }: { id: number }) => `posts/${id}`,
  rawResponseSchema: postSchema
})
```

however, this doesn't seem like a huge issue to me, as an ideal state of affairs would be

```ts
build.query({
  query: ({ id }: { id: number }) => `posts/${id}`,
  rawResponseSchema: postSchema,
  transformResponse: (res /* inferred from rawResponseSchema */) => transformed // provides final data type
})

// or no transformResponse
build.query({
  query: ({ id }: { id: number }) => `posts/${id}`,
  responseSchema: postSchema,
})
```

but this isn't currently possible (the inference part at least). The closest you'd get is manually annotating:

```ts
build.query({
  query: ({ id }: { id: number }) => `posts/${id}`,
  rawResponseSchema: postSchema,
  transformResponse: (res: Infer<typeof postSchema>) => transformed // provides final data type
})
```
cool, that wasn't actually too bad 😄

```ts
// without transformResponse
build.query({
  query,
  responseSchema
})

// with transformResponse
build.query({
  query,
  rawResponseSchema,
  transformResponse
})
```

We could use a union to enforce this, but honestly the types are complicated enough and I think some people may still want to use both.
Is it not possible due to limitations in the standard schema specification, the way RTKQ is currently structured, or a limitation in the TypeScript language itself?
Due to how it's currently structured - I've just pushed a change adding the inference :)
Haven't thought too deeply about it, but agreed about not using a union. Adding the ability to provide both `rawResponseSchema` and a regular `responseSchema` may not be a common use case, but probably wouldn't hurt to have.
The ability to provide a global callback for situations where schema validation fails would be super valuable btw. It would allow for a warning toast, logging to an external service, etc. The callback would contain information about the request such as the endpoint name, request payload, etc. The ability to enable/disable schema validation on a global and per-endpoint basis may be nice too, similar to how RTKQ has a global …
Added - callback parameters are …
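As a sketch of how that could look from the consumer side; the option name `onSchemaFailure` and the fields on the callback's second argument are assumptions for illustration, not necessarily the final API:

```ts
import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react'
import { z } from 'zod'

const postSchema = z.object({ id: z.number(), title: z.string() })

const api = createApi({
  baseQuery: fetchBaseQuery({ baseUrl: '/api' }),
  // Hypothetical global hook: fires whenever any endpoint's schema validation fails.
  onSchemaFailure: (error, info) => {
    console.warn(`Schema validation failed for ${info.endpoint}`, error)
    // e.g. show a warning toast or report { endpoint: info.endpoint, arg: info.arg } upstream
  },
  endpoints: (build) => ({
    getPost: build.query({
      query: ({ id }: { id: number }) => `posts/${id}`,
      responseSchema: postSchema,
    }),
  }),
})
```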
Type-wise, schemas are required to match the type of the value they're validating - they can't change a string to a number for example. Transformations that preserve the correct type are allowed, however.
Schemas can also be used as a source of inference, for example to derive an endpoint's types from the schemas provided (see the sketch below).
Schema parse failures are treated as unhandled errors, i.e. they're thrown and serialized rather than being returned. This is mainly because we have no idea what the base query error shape should look like, so we can't match it.
For transformable values, `raw<value>Schema` checks the value before transformation, and the `<value>Schema` checks the possibly transformed value.
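For instance, with a standard-schema-compliant library such as valibot, the schemas can drive the endpoint's types instead of spelling them out by hand; a sketch only, and the `argSchema` option and endpoint shown here are illustrative:

```ts
import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react'
import * as v from 'valibot'

const postSchema = v.object({ id: v.number(), title: v.string() })

const api = createApi({
  baseQuery: fetchBaseQuery({ baseUrl: '/api' }),
  endpoints: (build) => ({
    getPost: build.query({
      // The arg type (number) and result type ({ id: number; title: string })
      // are inferred from the schemas rather than from explicit type parameters.
      query: (id) => `posts/${id}`,
      argSchema: v.number(),
      responseSchema: postSchema,
    }),
  }),
})
```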