Commit b8c4137
Update to v0.6.5 (#373)
1 parent 820e2e5

1 file changed: README.md (36 additions, 30 deletions)
@@ -1,6 +1,10 @@
 [<img src="flagopen.png">](https://flagopen.baai.ac.cn/)
 
 ## Latest News
+- **[2025/02]** Released [v0.6.5](https://github.com/FlagOpen/FlagScale/tree/release/v0.6.5):
+  - Added support for DeepSeek-V3 distributed pre-training (beta) and [DeepSeek-V3/R1 serving](#deepseek-r1-serving) across multiple chips.
+  - Introduced an auto-tuning feature for serving and a new CLI feature for one-click deployment.
+  - Enhanced the CI/CD system to support more chips and integrated the workflow of [FlagRelease](https://huggingface.co/FlagRelease).
 - **[2024/11]** Released [v0.6.0](https://github.com/FlagOpen/FlagScale/tree/release/v0.6.0):
   - Introduced general multi-dimensional heterogeneous parallelism and CPU-based communication between different chips.
   - Added full support for LLaVA-OneVision, achieving SOTA results on the [Infinity-MM](https://arxiv.org/abs/2410.18558) dataset.
@@ -52,36 +56,6 @@ We recommend using the latest release of [NGC's PyTorch container](https://catal
 cp -r megatron-energon/src/megatron/energon megatron/megatron
 ```
 
-### DeepSeek-R1 Serve
-
-We support the model serving of DeepSeek R1 and have implemented the flagscale serve command for one-click deployment.
-Only configure two YAML files, then use the `flagscale serve` command to serve.
-
-1. Configure the yaml files:
-```
-Flagscale/
-├── examples/
-│   └── deepseek_r1/
-│       └── config_deepseek_r1.yaml   # set hostfile
-│       └── serve/
-│           └── deepseek_r1.yaml      # set model parameters and server port
-```
-2. Install FlagScale CLI:
-```
-cd FlagScale
-pip install .
-```
-
-3. One-click serve:
-```
-flagscale serve deepseek_r1
-```
-
-4. When custom service parameters, users can run:
-```
-flagscale serve <model_name> <MODEL_CONFIG_YAML>
-```
-
 ### Run a Task
 
 FlagScale provides a unified runner for various tasks, including training, inference, and serving. Simply specify the configuration file to run the task with a single command. The runner will automatically load the configurations and execute the task. The following example demonstrates how to run a distributed training task.
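The `flagscale serve <model_name> <MODEL_CONFIG_YAML>` pattern used in this README boils down to resolving a model name to a YAML config and building a launch command from its settings. The sketch below illustrates only that config-driven dispatch; it is not FlagScale's implementation. The `serve-engine` binary and the `model_path`/`port` keys are invented for the example, and a deliberately minimal flat `key: value` reader stands in for a real YAML parser.

```python
def load_flat_config(text: str) -> dict:
    """Parse a minimal flat 'key: value' config (illustrative stand-in for YAML)."""
    config = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop inline comments and whitespace
        if ":" in line:
            key, value = line.split(":", 1)
            config[key.strip()] = value.strip()
    return config

def build_serve_command(model_name: str, config: dict) -> list:
    """Turn a parsed config into a launch command for a hypothetical backend."""
    return [
        "serve-engine",  # hypothetical serving binary, not a real FlagScale tool
        "--model", config.get("model_path", model_name),
        "--port", config.get("port", "8000"),
    ]

# Example resembling the "set model parameters and server port" comment above.
example = "model_path: /models/deepseek-r1  # local weights\nport: 9010\n"
cmd = build_serve_command("deepseek_r1", load_flat_config(example))
print(cmd)  # -> ['serve-engine', '--model', '/models/deepseek-r1', '--port', '9010']
```

The point of the pattern is that the CLI stays a thin dispatcher: everything deployment-specific lives in the config file, so switching models or ports never requires touching code.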
@@ -111,6 +85,38 @@ FlagScale provides a unified runner for various tasks, including training,infe
 ```
 For more details, please refer to [Quick Start](./flagscale/serve/README.md).
 
+### DeepSeek-R1 Serving <a name="deepseek-r1-serving"></a>
+
+We support model serving for DeepSeek-R1 and provide the `flagscale serve` command for one-click deployment. By configuring just two YAML files, you can easily serve the model.
+
+1. **Configure the YAML files:**
+```
+Flagscale/
+├── examples/
+│   └── deepseek_r1/
+│       └── config_deepseek_r1.yaml   # Set hostfile
+│       └── serve/
+│           └── deepseek_r1.yaml      # Set model parameters and server port
+```
+
+2. **Install the FlagScale CLI:**
+```sh
+cd FlagScale
+pip install .
+```
+
+3. **One-click serve:**
+```sh
+flagscale serve deepseek_r1
+```
+
+4. **To customize service parameters, run:**
+```sh
+flagscale serve <MODEL_NAME> <MODEL_CONFIG_YAML>
+```
+
+The configuration files let you specify the parameters and settings your deployment needs, ensuring a smooth and efficient serving process.
+
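The diff does not show the contents of the two YAML files themselves. As a rough sketch of the kind of settings the inline comments point at (model parameters and a server port), a serve config might look something like the following; every key name below is an assumption for illustration, not FlagScale's actual schema.

```yaml
# Hypothetical serve config sketch; key names are illustrative only.
model:
  name: deepseek_r1
  path: /models/DeepSeek-R1     # assumed local checkpoint location
  tensor_parallel_size: 8       # assumed example parallelism setting
serve:
  host: 0.0.0.0
  port: 9010                    # the "server port" the README comment mentions
```

The actual fields accepted by a given release live in `examples/deepseek_r1/serve/deepseek_r1.yaml` in the repository, which is the authoritative template to copy from.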
 ## License
 
 This project is licensed under the [Apache License (Version 2.0)](https://github.com/FlagOpen/FlagScale/blob/main/LICENSE). This project also contains other third-party components under other open-source licenses. See the [LICENSE](https://github.com/FlagOpen/FlagScale/blob/main/LICENSE) file for more information.
