The Dashboard is the interface for managing all Fleek platform services, including site deployments, functions, and storage.
- 🎮 Environment Setup
- 🤖 Install
- 👷‍♀️ Development
- ⚡️ Performance
- 💍 Tests
- 🛠️ Generators
- 🖍️ Component Library
- 🕷️ Migration processes
- 🚀 Release to Production
- 📖 Docs
- 🙏 Contributing
- ⏱️ Changelog
- Node.js as the runtime
- NPM or Yarn to install it as a client, or PNPM for development
- Familiarity with text-based user interfaces and the command-line interface (CLI)
- Ports: UI (declared as NEXT_DEV_SERVER_PORT), Storybook (6006).
You'll also need to set up the development environment.
For developers looking to contribute to the User Dashboard, clone the repository and follow the contribution guide.
Once cloned, set up the local development environment to access the source code, iterate, run tests, and much more.
We use Node.js as the runtime and PNPM as the package manager.
Create a new file named .env.development in the root directory of your project. This file will store environment variables needed for local development.
touch .env.development
Open the .env.development file in a text editor and add the following:
NEXT_DEV_SERVER_PORT="3001"
NEXT_PUBLIC_SDK__AUTHENTICATION_URL="https://graphql.service.fleek.xyz/graphql"
NEXT_PUBLIC_UI_FLEEK_REST_API_URL="https://api.fleek.xyz"
NEXT_PUBLIC_UI__DYNAMIC_ENVIRONMENT_ID="de23a5f0-aaa5-412e-8212-4fb056a3b30d"
NEXT_PUBLIC_UI__GTM_ID="GTM-5RC2N5H"
NEXT_PUBLIC_UI__POSTHOG_HOST="https://us.i.posthog.com"
NEXT_PUBLIC_UI__POSTHOG_KEY="phc_RJhBMFHIZxwd361q6q9LZxDvSAta0F56mXQo3An307y"
NEXT_PUBLIC_UI__SITE_SLUG_DOMAIN="on-fleek.app"
NEXT_PUBLIC_UI__STRIPE_PUBLIC_KEY="dummy"
NEXT_PUBLIC_UI__ZENDESK_PROXY_API="https://support-prod-eu-lon-1-01.flkservices.io"
NEXT_PUBLIC_ZENDESK_PROXY_HOSTNAME="support-prod-eu-lon-1-01.flkservices.io"
NEXT_PUBLIC_UI__UPLOAD_PROXY_API_URL="https://uploads.service.fleek.xyz"
NEXT_PUBLIC_UI__INTERNAL_IPFS_STORAGE_HOSTNAME="storage-ipfs.internal.fleek.xyz"
NEXT_PUBLIC_WEBSITE_URL="https://fleek.xyz"
NEXT_PUBLIC_DASHBOARD_BASE_PATH="/dashboard"
NEXT_PUBLIC_AGENTS_AI_PATH="/eliza"
NEXT_PUBLIC_ALLOW_LANDING_PAGE_LOGIN="true"
NEXT_PUBLIC_BILLING_FREE_PLAN_DEPRECATION_DATE="2025-04-17"
NEXT_PUBLIC_UI__COMMIT_HASH="dev.hash"
💡 The variables above point to our production environment, the same one you interact with as an end user. The internal development team has access to private environments. Because the environment most users have access to is production, which mismatches the filename .env.development, you can rename the file to .env if that's sounder to you.
Warning
Set the NODE_ENV variable to select the corresponding environment file (.env*), e.g. NODE_ENV="production" would read the file .env.production. Keep it simple: name the file after the corresponding environment, as in .env.<NODE_ENV>. The test runner ignores .env.local.*
Important
The build process requires the environment variables to be populated. If you're building the project locally, you should create a .env.production file; otherwise, the build will fail to locate the environment variables.
Test-specific environment variables must be set up in the location .tests/.env:
NEXT_DEV_SERVER_PORT=3001
UI_TEST_HTTP_SERVER_PORT=3001
UI_TEST_DEV_SERVER_MODE=""
UI_TEST_DEV_SERVER_STATIC_PATH="out"
UI_TEST_DEV_SERVER_HOSTNAME="localhost"
To make the E2E tests use the static build, set UI_TEST_DEV_SERVER_MODE to "build". Leave it empty to default to the Next.js dev server.
The "build" mode can be useful for users on lower-specification machines, e.g. when running into test timeouts. Unfortunately, the Next.js dev server consumes a lot of resources.
When using the "build" mode, a new build has to be processed for every source-code change. Otherwise, you'll be testing the wrong source-code output version.
Consequently, due to the amount of time the build process takes to complete, it's not suitable for testing continuous contributions.
UI_TEST_DEV_SERVER_MODE="build"
It's recommended to use the default port 3001, because network calls are intercepted (mocked) and the recorded call data is reused across contributors. For this reason, when opting for "build" mode, set UI_TEST_HTTP_SERVER_PORT to 3001. You can use a different port for the Next.js dev server, e.g. 1234.
UI_TEST_HTTP_SERVER_PORT=3001
UI_TEST_DEV_SERVER_MODE="build"
NEXT_DEV_SERVER_PORT=1234
Warning
To go back to the default Next.js dev server for testing, you must update NEXT_DEV_SERVER_PORT to the recommended default port number 3001, disable the "build" mode, and restart the servers.
Note
Because the "build" processing time is long, builds have to be executed manually. Learn how to build here
When the UI test static server is running, there might be cases where you want to shut it down, e.g. to free up port 3001 for some other process.
Terminate the UI test static server gracefully by running:
pnpm test:terminate_http_server
Now that you've learned to set up the development environment, you can proceed to install the project dependencies.
Start by installing the project dependencies:
pnpm i
Run a local development server by executing the command:
pnpm dev
Note
The project is built with Next.js, which might be familiar to you.
It'll try to start the development server. Once ready, you should get a local address in the output.
For example, let's say it was bound to the default port 3000, you'd get:
- Local: http://localhost:3000
Warning
Mind the configured port declared as NEXT_DEV_SERVER_PORT. If port 3000 is not free at execution time, a different port is utilized. Check the output for the correct address, please!
Open the address http://localhost:3000 in your favourite development browser.
Build the project by executing the command:
pnpm run build
Our build process outputs static files to:
out
Warning
The build process requires the environment variables to be populated. If you're building the project locally, you should create a .env.production file; otherwise, the build will fail to locate the environment variables.
The output directory is where all public files are stored and published.
Optionally, if you need to override the values defined from environment variables, create a defined_overrides.json file in the public directory.
touch public/defined_overrides.json
Declare any overrides for src/defined.ts as key/value pairs, e.g.:
{ "NEXT_PUBLIC_WEBSITE_URL": "https://custom.hostname" }
You may only find this useful for controlling the prebuilt distribution package, e.g. when hosting the application alongside others, such as a website, where you'd need to control the environment to play along. You'd also have to include the file in the distribution files.
To preview the build locally, you can run the command:
pnpm preview
An HTTP server will serve the build output.
Formatting and linting are facilitated by BiomeJS. Configuration details can be found in:
biome.json
To format source code and apply changes directly in the file:
pnpm format
For checking source code formatting only:
pnpm format:check
To lint and apply changes directly in the file:
pnpm lint
For lint checks only:
pnpm lint:check
To both format and lint source code (with writes):
pnpm format:unsafe
Manage the versioning of changelog entries.
Declare an intent to release by executing the command and answering the wizard's questions:
pnpm changeset:add
npm link is a command-line tool for symlinking a local package as a dependency during development, commonly used for testing packages before publishing them. However, it's known to cause confusion and unexpected behaviour.
Instead of using pnpm link for local package testing, use the following command, which is closer to a release install:
pnpm generate:local_package
Once successful, the console will display an install command that you can copy and run in your project.
Here's an example that uses npm:
npm i --no-save <GENERATED_FILE_PATH>
Warning
Remove the conflicting package name from package.json, e.g. @fleek-platform/dashboard. The local install doesn't save or modify package.json; the package.json and lockfiles are only for existing registry versions. You might have to run the clean command to remove any conflicting packages from node_modules, locks, etc.
Alternatively, if you're using an npm-compatible package manager like pnpm, avoid saving or modifying the lock file, e.g.:
npm_config_save=false npm_config_lockfile=false pnpm i <GENERATED_FILE_PATH>
Another option is to use the GitHub registry to push any packages you might want to test. Find more about it here.
Regression in software testing refers to when a previously working feature stops working after new changes are made. When contributing, make sure your changes are healthy and don't cause regressions.
Learn how to run and write tests here.
This table shows the corresponding deployment URLs for each branch. Note that the availability and state of each URL depend on successful CI/CD builds and deployments.
| Branch | URL |
| --- | --- |
| develop (latest) | https://fleek-dashboard-staging.fleeksandbox.xyz |
| develop (storybook) | https://fleek-dashboard-storybook.on-fleek.app |
| main | https://fleek-dashboard-production.on-fleek.app |
To enable support for redirects, single-page applications, and custom 404 pages on IPFS-backed hosting, we use the _redirects file.
Also, set the environment variable NEXT_PUBLIC_BASE_PATH as the base forward path in _redirects:
/* /dashboard/index.html 200
The following performance tools help identify rendering bottlenecks and component inefficiencies.
We use react-scan to help detect performance issues in the dashboard (a React application). The process helps identify unnecessary renders that cause the application to become slow. Learn more here.
To spin up an interactive and isolated browser instance, run the following command:
pnpm run perf:scan
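Alternatively, react-scan can be enabled programmatically during development. The following is a minimal sketch, assuming react-scan's scan() entry point as documented upstream; the option set shown is illustrative:

```ts
// Dev-only setup sketch; react-scan should be imported before React.
import { scan } from 'react-scan';

if (process.env.NODE_ENV === 'development') {
  // Highlights components as they re-render, surfacing unnecessary renders.
  scan({ enabled: true });
}
```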
The project has two categories of tests:
- End-to-end (E2E) tests, built on Playwright, to facilitate testing the UI/UX interface. The network calls are mocked to enable rapid development and focus on the interface;
- Unit tests, which assert pure functions, e.g. data transformations, calculations, etc., a separate concern from the presentation;
You can run all tests by executing the command:
pnpm run test
Alternatively, you can inspect the available tests in the package.json scripts section.
For example, you can launch the end-to-end tests on a Chrome browser:
pnpm run test:ui
Run the E2E test suite
pnpm run test:e2e
Run the unit tests
pnpm run test:unit
Run the component function tests
pnpm run test:component
Here are some recommendations when writing tests.
- Block any unnecessary network requests, e.g., third-party services, such as analytics, etc
- Use headless mode for fast feedback
- Prepare and clean data or state for each test
- Prioritize user flows, e.g. prefer role selection instead of specifying implementation details such as ID, Class, or element names (DOM locators)
- Prefer tools provided by Playwright
- Tests should be independent and isolated, e.g. avoid sharing state across test cases or depending on other tests
- Use the Browser/UI mode to see the tests in action, observe how the application renders, inspect network logs, etc.
- Tests should have the ability to run concurrently, e.g., you'd rather have 5 workers computing instead of waiting sequentially, to save you time
- Mock API calls, providing the expected data structure in the response, to prioritize testing the interface, e.g., you're testing that the user interface corresponds to the user-journey goals, NOT the service-side availability (see the sketch after this list)
- Handle exceptions gracefully, e.g., test cases where the API response fails, etc
- Avoid writing tests for static elements that are useless to the end user
- Avoid placing timeouts in the test
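To illustrate several of the recommendations above (mocking network calls, blocking third-party requests, and role-based locators), here's a minimal Playwright sketch. The GraphQL endpoint, response payload, and page route are hypothetical placeholders, not the dashboard's actual API:

```ts
import { expect, test } from '@playwright/test';

test('shows the user projects list', async ({ page }) => {
  // Mock the API call with the expected data structure; the endpoint and
  // payload here are hypothetical. We test the interface, not the service side.
  await page.route('**/graphql', (route) =>
    route.fulfill({
      status: 200,
      contentType: 'application/json',
      body: JSON.stringify({
        data: { projects: [{ id: '1', name: 'my-project' }] },
      }),
    }),
  );

  // Block unnecessary third-party requests, e.g. analytics.
  await page.route('**/*.posthog.com/**', (route) => route.abort());

  await page.goto('/projects');

  // Prefer role-based locators over implementation details (IDs, classes).
  await expect(page.getByRole('link', { name: 'my-project' })).toBeVisible();
});
```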
Unit tests should be small tests that check individual parts of code for correctness (a sketch follows the list below). Ideally, functions tested with unit tests should be:
- Pure, meaning their output depends only on their inputs, with no side effects. This makes them predictable and easier to test
- Each test should run independently, without relying on other tests
- Test One Thing Per Test: Focus each test on verifying a single behavior or functionality
- Replace external dependencies (e.g., databases) with mock objects
- Write simple, concise tests for easier debugging and maintenance, e.g. prefer storytelling
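As a minimal sketch of these points, here's a unit test for a pure function. The formatBytes helper and the Vitest runner are assumptions for illustration; check the package.json scripts for the project's actual test runner:

```ts
import { describe, expect, it } from 'vitest';

// A pure function: output depends only on the input, with no side effects.
const formatBytes = (bytes: number): string =>
  bytes < 1024 ? `${bytes} B` : `${(bytes / 1024).toFixed(1)} KB`;

describe('formatBytes', () => {
  // One behavior per test, told as a small story.
  it('formats values under a kilobyte as bytes', () => {
    expect(formatBytes(512)).toBe('512 B');
  });

  it('formats larger values as kilobytes', () => {
    expect(formatBytes(2048)).toBe('2.0 KB');
  });
});
```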
The terminology "Component Function tests" is used to describe unit tests for components. You can refer to it as unit-tests, but when communicating component functionality, it's our preference to refer to it as "Component Functional tests". It avoids confusion.
Warning
Do not confuse this with Component testing as promoted by Storybook's team. Component Function tests at Fleek's frontend have a very particular meaning in the contexts they're used and serve to communicate efficiently.
Ideally, components should be tested in isolation (as units). Events or inputs should reproduce the expected behaviour and output. Here's a breakdown (a sketch follows the list):
- Focus on Component Behavior
- Tests must be user-oriented, e.g. validate a component feature from a user's perspective
- Ensure a component works as intended before integration with other components or the application
- Avoid testing internal implementation details like state or private functions
- Identify and prioritize testing the most common and critical component behaviors
- Ensure all major user interactions and outcomes are tested
- Props should be as realistic as possible to reflect how the component will be used in production
- Test interactions (clicks, typing, form submissions) as closely as possible to real user actions
- Simulate user actions or lifecycle changes
- Ensure visual changes like showing/hiding elements are validated
- Test only one behavior or aspect per test case, e.g. avoid long tests that cover multiple, unrelated scenarios
- Do not rely on snapshots for dynamic or interactive components
- Ensure tests are fast enough for regular execution
- Update tests when refactoring or introducing new features
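To make the breakdown concrete, here's a minimal sketch of a Component Function test. The CopyButton component and the Testing Library/Vitest setup are illustrative assumptions, not necessarily the project's actual stack:

```tsx
import { fireEvent, render, screen } from '@testing-library/react';
import { expect, it, vi } from 'vitest';

// A hypothetical component under test.
const CopyButton = ({ onCopy }: { onCopy: () => void }) => (
  <button type="button" onClick={onCopy}>
    Copy
  </button>
);

it('calls onCopy when the user clicks the button', () => {
  const onCopy = vi.fn();
  render(<CopyButton onCopy={onCopy} />);

  // User-oriented: locate by role and interact like a real user would.
  fireEvent.click(screen.getByRole('button', { name: 'Copy' }));

  expect(onCopy).toHaveBeenCalledTimes(1);
});
```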
On CI/CD runners, low specs can cause inconsistent runs. To mitigate any inconsistency, Playwright's team recommends running with a single worker on CI. Note that launching a large runner, e.g. macos-latest-xlarge, is more efficient but increases cost dramatically.
Generate a sitemap by executing the command:
pnpm run generate:sitemap
The component library provides a collection of ready-to-use components. We use Storybook to showcase and document our components.
Start the storybook development server:
pnpm storybook
To build a static version:
pnpm storybook:build
When committing to develop branch, a new build is deployed to https://fleek-dashboard-storybook.on-app.xyz.
To assist with migrating from an outdated repository and updating our workflows, we've outlined the following processes.
The "sync from monorepo" process automates the transfer of source code files from the deprecated monorepo to the new repository. It only copies files created or changed in the range of commit hash from to the latest HEAD version. By using this process, you reduce the number of files to synchronize to only what has been changed. Reducing the number of files to verify, due to cases where the current repository might have progressed containing changes that do not exist in the source repository.
Before you start, check the original repository commit hash you want to pick changes from. The commit hash you choose is exclusive: it marks a point in history but doesn't include its files. To sync a commit's files, use the previous commit hash. This creates a range for selective file inclusion, allowing you to choose exactly which changes to apply.
pnpm run migrate:sync_from_monorepo
You can release to production following a linear strategy. This assumes that, by convention, the "main" branch has a linear history and is a subset of the "develop" branch commit history. For example, the team is happy to have "develop" be where the latest version of the project exists, and "main" shouldn't diverge and should only contain commits from "develop".
Use-case examples:
- The team has merged some feature branches into develop, identified as commit hash "abc123", and wants to release up to the commit history hash "abc123" onto "main". By doing this, they expect the build process to occur and deploy onto the Fleek Platform
- The team has merged several feature branches into develop, identified as commit hashes commitFeat1, commitFeat2 and commitFeat3, in this historical order. It's decided to release everything in commit history until commitFeat1, but not commitFeat2 and commitFeat3. Although, it'd be wiser to keep the feature branches in a pending state, as "develop" should always be in a ready state for testing and release, since the team may want to release some quick hotfixes, etc
To release to production open the actions tab here.
Select the "🚀 Release by develop hash" job in the left sidebar. Next, select the "Run workflow" drop-down and provide the required details.
This section guides you through the process of contributing to our open-source project. From creating a feature branch to submitting a pull request, get started by:
- Fork the project here
- Create your feature branch using our branching strategy, e.g.
git checkout -b feat/my-new-feature
- Run the tests:
pnpm test
- Commit your changes by following our commit conventions, e.g.
git commit -m 'chore: 🤖 my contribution description'
- Push to the branch, e.g.
git push origin feat/my-new-feature
- Create a new Pull Request following the corresponding template guidelines
The develop branch serves as the main integration branch for features, enhancements, and fixes. It is always in a deployable state and represents the latest development version of the application.
Feature branches are created from the develop branch and are used to develop new features or enhancements. They should be named according to the type of work being done and the scope of the feature, in accordance with the conventional commits described here.
We prefer to commit our work following the Conventional Commits convention. Conventional Commits are a simple way to write commit messages that both people and computers can understand. It helps us keep track of changes in a consistent manner, making it easier to see what was added, changed, or fixed in each commit or update.
Branch names are formatted as [type]/[scope]. The type is a short descriptor indicating the nature of the work (e.g., feat, fix, docs, style, refactor, test, chore). This follows the conventional commit types.
The scope is a more detailed description of the feature or fix. This could be the component or part of the codebase affected by the change.
Here's an example of different conventional commits messages that you should follow:
test: 💍 Adding missing tests
feat: 🎸 A new feature
fix: 🐛 A bug fix
chore: 🤖 Build process or auxiliary tool changes
docs: 📝 Documentation only changes
refactor: 💡 A code change that neither fixes a bug nor adds a feature
style: 💄 Markup, white-space, formatting, missing semi-colons...