|
---
title: Introducing xsAI,<br />a < 6KB Vercel AI SDK alternative
date: 2025-03-03
author: 藍+85CD, Neko Ayaka
tags:
  - Announcements
---

## Why another AI SDK?

The [Vercel AI SDK](https://sdk.vercel.ai/) is way too big: it includes unnecessary dependencies.

[pkg-size.dev report for ai@4.1.47](https://pkg-size.dev/ai@4.1.47)

For example, the Vercel AI SDK ships with non-optional request and response validation, non-optional
[OpenTelemetry](https://opentelemetry.io/) dependencies, binds you to [zod](https://zod.dev/) (you don't
get to choose), and so much more...

This makes it hard to build small yet decent AI applications & CLI tools with a lean bundle and the
controllable, atomic capabilities users truly need.

But it doesn't need to be like this, does it?

### So how small is xsAI?

Without further ado, let's look:

[pkg-size.dev report for xsai@0.1.0-beta.9](https://pkg-size.dev/xsai@0.1.0-beta.9)

It's roughly a hundred times smaller than the Vercel AI SDK (by install size) and has most of its features.

It's also just 5.7KB gzipped, so the title is not wrong.

## Getting started

You can install the `xsai` package, which contains all the core utils.

```bash
npm i xsai
```

Or install the corresponding packages separately, according to the features you need:

```bash
npm i @xsai/generate-text @xsai/embed @xsai/model
```

### Generating Text

Let's start with a simple example.

```ts
import { generateText } from '@xsai/generate-text'
import { env } from 'node:process'

const { text } = await generateText({
  apiKey: env.OPENAI_API_KEY!,
  baseURL: 'https://api.openai.com/v1/',
  model: 'gpt-4o',
  messages: [{
    role: 'user',
    content: 'Why is the sky blue?',
  }],
})
```

xsAI does not use provider functions [like Vercel does](https://sdk.vercel.ai/docs/foundations/providers-and-models) by default;
we simplified them into three shared fields: `apiKey`, `baseURL` and `model`.

- `apiKey`: Provider API key
- `baseURL`: Provider base URL (merged with the path of the corresponding util, e.g. `new URL('chat/completions', 'https://api.openai.com/v1/')`)
- `model`: Name of the model to use
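Since the endpoint is built with the standard WHATWG `URL` constructor, the trailing slash on `baseURL` matters. A quick sketch of the resolution behavior:

```ts
// Endpoint resolution as described above: the util's path is resolved
// against the provider base URL with the WHATWG URL constructor.
const withSlash = new URL('chat/completions', 'https://api.openai.com/v1/')
console.log(withSlash.href) // https://api.openai.com/v1/chat/completions

// Without the trailing slash, the last path segment is replaced:
const withoutSlash = new URL('chat/completions', 'https://api.openai.com/v1')
console.log(withoutSlash.href) // https://api.openai.com/chat/completions
```

So always keep the trailing slash on your `baseURL`.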

> Don't worry if you need to support a non-OpenAI-compatible API provider, such as [Claude](https://claude.ai/): we left you the possibility of overriding
> [`fetch(...)`](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch), where you can customize how the request is made
> and how the response is handled.

This allows xsAI to support any OpenAI-compatible API without having to create provider packages.
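As an illustration of the idea, a custom `fetch` can inject provider-specific headers before delegating to the real one. This is a minimal sketch, not xsAI's actual wiring; the helper name and the `x-api-key` header are assumptions for the example:

```ts
// A fetch-compatible wrapper that injects extra headers before delegating
// to a base fetch implementation (the global fetch by default).
const makeHeaderFetch = (
  extra: Record<string, string>,
  base: typeof fetch = fetch,
): typeof fetch =>
  (input, init) => {
    const headers = new Headers(init?.headers)
    for (const [key, value] of Object.entries(extra))
      headers.set(key, value)
    return base(input, { ...init, headers })
  }

// e.g. an override that adds an Anthropic-style API key header:
const anthropicFetch = makeHeaderFetch({ 'x-api-key': 'sk-ant-...' })
```

You would then pass such a function as the `fetch` override so every request goes through it.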

### Generating Text w/ Tool Calling

Continuing with the example above, let's now add tools.

```ts
import { generateText } from '@xsai/generate-text'
import { tool } from '@xsai/tool'
import { env } from 'node:process'
import * as z from 'zod'

const weather = await tool({
  name: 'weather',
  description: 'Get the weather in a location',
  parameters: z.object({
    location: z.string().describe('The location to get the weather for'),
  }),
  execute: async ({ location }) => ({
    location,
    temperature: 72 + Math.floor(Math.random() * 21) - 10,
  }),
})

const { text } = await generateText({
  apiKey: env.OPENAI_API_KEY!,
  baseURL: 'https://api.openai.com/v1/',
  model: 'gpt-4o',
  messages: [{
    role: 'user',
    content: 'What is the weather in San Francisco?',
  }],
  tools: [weather],
})
```
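For context, a tool definition like this ultimately has to be expressed in the OpenAI chat completions wire format. Written by hand for illustration (this is the OpenAI API shape, not necessarily xsAI's exact output), the `weather` tool corresponds roughly to:

```ts
// OpenAI chat-completions `tools` entry corresponding to the weather tool
// above; the schema library is what generates the `parameters` JSON Schema.
const weatherToolPayload = {
  type: 'function',
  function: {
    name: 'weather',
    description: 'Get the weather in a location',
    parameters: {
      type: 'object',
      properties: {
        location: {
          type: 'string',
          description: 'The location to get the weather for',
        },
      },
      required: ['location'],
    },
  },
}
```

This is why any schema library that can be converted to JSON Schema is a candidate for `parameters`.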

Wait, [`zod`](https://zod.dev) isn't great for tree shaking, and it can be annoying. Can we use [`valibot`](https://valibot.dev) instead? **Of course!**

```ts
import { tool } from '@xsai/tool'
import { description, object, pipe, string } from 'valibot'

const weather = await tool({
  name: 'weather',
  description: 'Get the weather in a location',
  parameters: object({
    location: pipe(
      string(),
      description('The location to get the weather for'),
    ),
  }),
  execute: async ({ location }) => ({
    location,
    temperature: 72 + Math.floor(Math.random() * 21) - 10,
  }),
})
```

We can even use [`arktype`](https://arktype.io), and the list of compatible libraries will keep growing:

```ts
import { tool } from '@xsai/tool'
import { type } from 'arktype'

const weather = await tool({
  name: 'weather',
  description: 'Get the weather in a location',
  parameters: type({
    location: 'string',
  }),
  execute: async ({ location }) => ({
    location,
    temperature: 72 + Math.floor(Math.random() * 21) - 10,
  }),
})
```

> xsAI doesn't limit your choices to [`zod`](https://zod.dev), [`valibot`](https://valibot.dev), or [`arktype`](https://arktype.io): with
> the power of [Standard Schema](https://github.com/standard-schema/standard-schema), you can use any schema library it supports.

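For the curious, a Standard Schema-compliant validator is just an object exposing a `~standard` property. The shape below follows the published Standard Schema v1 spec and is purely an illustration of the interface; a hand-rolled schema like this still lacks the JSON Schema conversion that tool calling needs, so don't expect to pass it to `tool()` as-is:

```ts
// A hand-rolled Standard Schema v1 validator for a non-empty string
// (interface shape per the Standard Schema spec; for illustration only).
const nonEmptyString = {
  '~standard': {
    version: 1,
    vendor: 'hand-rolled',
    validate: (value: unknown) =>
      typeof value === 'string' && value.length > 0
        ? { value }
        : { issues: [{ message: 'expected a non-empty string' }] },
  },
}
```

Because zod, valibot, and arktype all expose this same `~standard` interface, libraries like xsAI can accept any of them through one code path.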
### Easy migration

Are you already using the Vercel AI SDK? Here's how to migrate to xsAI:

```diff
- import { openai } from '@ai-sdk/openai'
- import { generateText, tool } from 'ai'
+ import { generateText, tool } from 'xsai'
+ import { env } from 'node:process'
  import * as z from 'zod'

  const { text } = await generateText({
+   apiKey: env.OPENAI_API_KEY!,
+   baseURL: 'https://api.openai.com/v1/',
-   model: openai('gpt-4o'),
+   model: 'gpt-4o',
    messages: [{
      role: 'user',
      content: 'What is the weather in San Francisco?',
    }],
-   tools: {
+   tools: [
-     weather: tool({
+     await tool({
+       name: 'weather',
        description: 'Get the weather in a location',
        parameters: z.object({
          location: z.string().describe('The location to get the weather for'),
        }),
        execute: async ({ location }) => ({
          location,
          temperature: 72 + Math.floor(Math.random() * 21) - 10,
        }),
      })
-   },
+   ],
  })
```

That's it!

## Next steps

### Big fan of [Anthropic's MCP](https://www.anthropic.com/news/model-context-protocol)?

We are working on [Model Context Protocol](https://modelcontextprotocol.io/introduction) support: [#84](https://github.com/moeru-ai/xsai/pull/84)

### Don't like any of the cloud providers?

We are working on a [🤗 Transformers.js](https://huggingface.co/docs/transformers.js/index) provider that lets you run LLMs and any
🤗 Transformers.js-supported model directly in the browser, with the power of WebGPU!

You can track the progress here: [#41](https://github.com/moeru-ai/xsai/issues/41). It is really cool and fun to run embedding, speech,
and transcription models directly in the browser, so stay tuned!

### Need framework bindings?

We will do this in v0.2. See you next time!

## Documentation

Since this is just an introduction article, it only covers `generate-text` and `tool`.

`xsai` [has more utils](https://github.com/moeru-ai/xsai/blob/main/packages/xsai/src/index.ts):

```ts
export * from '@xsai/embed'
export * from '@xsai/generate-object'
export * from '@xsai/generate-speech'
export * from '@xsai/generate-text'
export * from '@xsai/generate-transcription'
export * from '@xsai/model'
export * from '@xsai/shared-chat'
export * from '@xsai/stream-object'
export * from '@xsai/stream-text'
export * from '@xsai/tool'
export * from '@xsai/utils-chat'
export * from '@xsai/utils-stream'
```

If you are interested, head to the documentation at <https://xsai.js.org/docs> to get started!

Besides xsAI, we've made loads of other cool stuff too! Check out our [`moeru-ai` GitHub organization](https://github.com/moeru-ai)!