docs: fix links in index.mdx and refactor simple example page #442

Merged 7 commits on Jun 14, 2024

docs/pages/config/simple_example.mdx (89 additions, 28 deletions)

import { Callout } from 'nextra/components';

Here you can find the most common `open-next.config.ts` examples, which you can use as a starting point for your own configuration.

## Streaming with lambda

<Callout type="warning" emoji="⚠️">
AWS has several different implementations of streaming in production. You
might be lucky and get a fully working one, but you might also get one that is
not suitable for production. Be aware that there is nothing you can do to
prevent AWS from breaking your deployment (the same code and the same runtime
might break from one day to the next). See [Thread
1](https://discord.com/channels/983865673656705025/1230482660913184800) and
[Thread
2](https://discord.com/channels/983865673656705025/1249368592558985247).
<br /> <br /> On some AWS accounts the response can hang if the body is empty.
OpenNext 3.0.3+ has a workaround: set the environment variable
`OPEN_NEXT_FORCE_NON_EMPTY_RESPONSE` to `"true"` to ensure that the stream
body is never empty. <br /> <br />
If you have an issue with streaming, send a message on [Discord](https://sst.dev/discord)
and contact AWS support to let them know about the issue.
</Callout>

```ts
import type { OpenNextConfig } from 'open-next/types/open-next';
const config = {
  default: {
    override: {
      wrapper: 'aws-lambda-streaming', // This is necessary to enable lambda streaming
    },
  },
} satisfies OpenNextConfig;

export default config;
```
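
The `OPEN_NEXT_FORCE_NON_EMPTY_RESPONSE` workaround mentioned in the callout is an environment variable on the deployed server function, not part of `open-next.config.ts`. Below is a minimal sketch of setting it, assuming a deployment with SST v2's `NextjsSite` construct (an assumption; other IaC tools expose Lambda environment variables through their own settings):

```ts
import { NextjsSite, StackContext } from 'sst/constructs';

export function Web({ stack }: StackContext) {
  // Hypothetical stack definition; the only OpenNext-specific piece is the
  // environment variable used by the streaming workaround (OpenNext 3.0.3+).
  new NextjsSite(stack, 'site', {
    environment: {
      OPEN_NEXT_FORCE_NON_EMPTY_RESPONSE: 'true',
    },
  });
}
```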

## Splitting the server

```ts
import type { OpenNextConfig } from 'open-next/types/open-next';
const config = {
  // This is the default server, similar to the server-function in open-next v2
  // In this case we are not providing any override, so it will generate a normal lambda (i.e. no streaming)
  default: {},
  functions: {
    myFn: {
      patterns: ['route1', 'route2', 'route3'],
      // For app dir, you need to include route|page, no need to include layout or loading
      // It needs to be prepended with app/ or pages/ depending on the directory used
      routes: ['app/route1/page', 'app/route2/page', 'pages/route3'],
      override: {
        wrapper: 'aws-lambda-streaming',
      },
    },
  },
} satisfies OpenNextConfig;

export default config;
```

## Use aws4fetch instead of the AWS SDK

<Callout type="info" emoji="ℹ️">
By default we use S3, DynamoDB and SQS for handling ISR/SSG and the fetch
cache, and we interact with them using the AWS SDK v3. This can add a lot to
the cold start. There is a built-in option to use
[aws4fetch](https://github.com/mhart/aws4fetch) instead of the AWS SDK that
can reduce cold starts by up to 300ms. Requires `OpenNext v3.0.3+`. Here is
how you enable it:
</Callout>

```ts
import type { OpenNextConfig } from 'open-next/types/open-next';
const config = {
  default: {
    override: {
      tagCache: 'dynamodb-lite',
      incrementalCache: 's3-lite',
      queue: 'sqs-lite',
    },
  },
} satisfies OpenNextConfig;

export default config;
```
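
The lite overrides compose with the other options on this page. As a sketch not taken from the original docs, and assuming streaming works reliably on your account, they can be combined with the streaming wrapper from the first example in a single `default` block:

```ts
import type { OpenNextConfig } from 'open-next/types/open-next';
const config = {
  default: {
    override: {
      wrapper: 'aws-lambda-streaming', // streaming, as in the first example
      tagCache: 'dynamodb-lite', // aws4fetch-based clients instead of the AWS SDK v3
      incrementalCache: 's3-lite',
      queue: 'sqs-lite',
    },
  },
} satisfies OpenNextConfig;

export default config;
```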

## Running in Lambda@Edge

```ts
import type { OpenNextConfig } from 'open-next/types/open-next';
const config = {
  default: {
    // Deploy the default server function globally, i.e. as Lambda@Edge
    placement: 'global',
    override: {
      // Converts CloudFront (Lambda@Edge) events to and from the internal request format
      converter: 'aws-cloudfront',
    },
  },
} satisfies OpenNextConfig;

export default config;
```

## Running in edge runtime

This will generate 2 server functions, the default one and the edge one. The edge one will still be deployed as a Lambda function, but it will use the edge runtime.

Edge runtime functions have shorter cold starts, but you can only deploy one route per function. They also do not have the middleware bundled in the function, so you need to use external middleware if you need it in front of the edge function.

```ts
import type { OpenNextConfig } from 'open-next/types/open-next';
const config = {
  default: {},
  functions: {
    edge: {
      runtime: 'edge',
      // placement: 'global', // Uncomment this line if you want the function deployed globally (i.e. Lambda@Edge). Otherwise it will be deployed in the region specified in the stack.
      routes: ['app/api/test/route'],
      patterns: ['api/test'],
    },
  },
} satisfies OpenNextConfig;

export default config;
```

## External middleware

In some cases (edge runtime, function splitting with some middleware rewrites, etc.) you might want to use external middleware.
With the default middleware configuration, it is bundled for deployment in Lambda@Edge.
This is how you can do it:

```ts
import type { OpenNextConfig } from 'open-next/types/open-next';
const config = {
  default: {},
  functions: {
    myFn: {
      patterns: ['route1', 'route2', 'route3'],
      routes: ['app/route1/page', 'app/route2/page', 'pages/route3'],
      override: {
        wrapper: 'aws-lambda-streaming',
      },
    },
  },
  middleware: {
    external: true,
  },
} satisfies OpenNextConfig;

export default config;
```
docs/pages/index.mdx (8 additions, 7 deletions)

import { SITE } from '../config';
import { Callout } from 'nextra/components';

<Callout>
These docs are for V3 of OpenNext. If you are looking for the V2 docs, you can find them [here](/v2).

If you're migrating from V2 to V3, you can find the migration guide [here](/migration#from-opennext-v2).

</Callout>

### Open source Next.js adapter

---


OpenNext takes the Next.js build output and converts it into packages that can be deployed across a variety of environments.
Natively, OpenNext supports AWS Lambda and a classic Node server. It also offers partial support for the `edge` runtime in Cloudflare Workers.

One notable feature of OpenNext is its ability to split the Next.js output, enabling selective deployment to different targets such as AWS Lambda, Cloudflare Workers, or Amazon ECS. This facilitates a tailored deployment strategy that aligns with the specific needs of your application.
---

We need your help keeping it up to date and feature complete. Make sure to [**join us on Discord**](https://sst.dev/discord) and [**star us on GitHub**](https://github.com/sst/open-next).

## Features

OpenNext aims to support all Next.js 14 features. Some features are work in progress. Please open a [new issue](https://github.com/sst/open-next/issues/new) to let us know!
- [x] Dynamic routes
- [x] Static site generation (SSG)
- [x] Server-side rendering (SSR)
- [x] [Incremental static regeneration (ISR)](/inner_workings/caching)
- [x] Middleware
- [x] Image optimization
- [x] [NextAuth.js](https://next-auth.js.org)
- [x] [Running in lambda@edge](/config/simple_example#running-in-lambdaedge)
- [x] [No cold start](/inner_workings/components/warmer)
- [x] Experimental streaming support

## Who is using OpenNext?