docs/content/docs/reference/compatibility-table.md
+10 -14
@@ -6,7 +6,7 @@ weight = 24
url = "/model-compatibility/"
+++

-Besides llama based models, LocalAI is compatible also with other architectures. The table below lists all the compatible models families and the associated binding repository.
+Besides llama-based models, LocalAI is also compatible with other architectures. The table below lists all the backends, the compatible model families, and the associated repository.

{{% alert note %}}
@@ -16,19 +16,8 @@ LocalAI will attempt to automatically load models which are not explicitly configured

| Backend and Bindings | Compatible models | Completion/Chat endpoint | Capability | Embeddings support | Token stream support | Acceleration |
|[bert](https://github.com/skeskinen/bert.cpp) ([binding](https://github.com/go-skynet/go-bert.cpp)) | bert | no | Embeddings only | yes | no | N/A |
+|[llama.cpp]({{%relref "docs/features/text-generation#llama.cpp" %}}) | LLaMA, Mamba, RWKV, Falcon, Starcoder, GPT-2, [and many others](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#description)| yes | GPT and Functions | yes**| yes | CUDA, openCL, cuBLAS, Metal |
+|[llama.cpp's ggml model (backward compatibility with old format, before GGUF)](https://github.com/ggerganov/llama.cpp) ([binding](https://github.com/go-skynet/go-llama.cpp)) | LLaMA, GPT-2, [and many others](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#description)| yes | GPT and Functions | yes**| yes | CUDA, openCL, cuBLAS, Metal |
|[whisper](https://github.com/ggerganov/whisper.cpp)| whisper | no | Audio | no | no | N/A |
|[stablediffusion](https://github.com/EdVince/Stable-Diffusion-NCNN) ([binding](https://github.com/mudler/go-stable-diffusion)) | stablediffusion | no | Image | no | no | N/A |
|[langchain-huggingface](https://github.com/tmc/langchaingo)| Any text generators available on HuggingFace through API | yes | GPT | no | no | N/A |
@@ -40,11 +29,18 @@ LocalAI will attempt to automatically load models which are not explicitly configured
|`diffusers`| SD,... | no | Image generation | no | no | N/A |
|`vall-e-x`| Vall-E | no | Audio generation and Voice cloning | no | no | CPU/CUDA |
|`vllm`| Various GPTs and quantization formats | yes | GPT | no | no | CPU/CUDA |
+|`mamba`| Mamba models architecture | yes | GPT | no | no | CPU/CUDA |
|`exllama2`| GPTQ | yes | GPT only | no | no | N/A |
|`transformers-musicgen`|| no | Audio generation | no | no | N/A |
|[tinydream](https://github.com/symisc/tiny-dream#tiny-dreaman-embedded-header-only-stable-diffusion-inference-c-librarypixlabiotiny-dream)| stablediffusion | no | Image | no | no | N/A |
|`coqui`| Coqui | no | Audio generation and Voice cloning | no | no | CPU/CUDA |
+|`openvoice`| Open voice | no | Audio generation and Voice cloning | no | no | CPU/CUDA |
+|`parler-tts`| Parler TTS | no | Audio generation and Voice cloning | no | no | CPU/CUDA |
+|[rerankers](https://github.com/AnswerDotAI/rerankers)| Reranking API | no | Reranking | no | no | CPU/CUDA |
|`transformers`| Various GPTs and quantization formats | yes | GPT, embeddings | yes | yes****| CPU/CUDA/XPU |
+|[bark-cpp](https://github.com/PABannier/bark.cpp)| bark | no | Audio-Only | no | no | yes |
+|[stablediffusion-cpp](https://github.com/leejet/stable-diffusion.cpp)| stablediffusion-1, stablediffusion-2, stablediffusion-3, flux, PhotoMaker | no | Image | no | no | N/A |
+|[silero-vad](https://github.com/snakers4/silero-vad) with [Golang bindings](https://github.com/streamer45/silero-vad-go)| Silero VAD | no | Voice Activity Detection | no | no | CPU |

Note: any backend name listed above can be used in the `backend` field of the model configuration file (See [the advanced section]({{%relref "docs/advanced" %}})).
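
As a rough illustration of that `backend` field, here is a minimal model configuration sketch; the backend string, model name, and file name below are hypothetical examples chosen for illustration, not values taken from this page:

```yaml
# hypothetical example: models/my-model.yaml
name: my-model                  # name the model is exposed under by the API
backend: llama-cpp              # any backend name from the table above
parameters:
  model: my-model.Q4_K_M.gguf   # model file placed in the models directory
```

With a file like this in place, requests that reference `my-model` would be routed to the pinned backend instead of relying on automatic backend detection.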