
Kokoro-82M ONNX Runtime Inference | Gradio Demo


This repository contains minimal code and resources for running inference with the Kokoro-82M text-to-speech model using ONNX Runtime, including optimized (mixed-precision) ONNX weights.

Demo videos: edu_note.mp4, fun_fact.mp4

Features

  • ONNX Runtime Inference: minimal ONNX Runtime inference code for Kokoro-82M (v0_19); supports the en-us and en-gb voices.
  • Optimized ONNX Inference: mixed-precision ONNX weights for faster inference at roughly half the file size.
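The "twice smaller" claim can be checked with back-of-envelope arithmetic (assuming, roughly, one fp32 value per parameter in the original model and one fp16 value in the mixed-precision one):

```python
# Rough size estimate for an 82M-parameter model (illustrative only;
# real ONNX files also contain graph metadata and some fp32 layers).
PARAMS = 82_000_000          # Kokoro-82M parameter count
fp32_mb = PARAMS * 4 / 1e6   # 4 bytes per float32 weight
fp16_mb = PARAMS * 2 / 1e6   # 2 bytes per float16 weight
print(f"fp32 ~ {fp32_mb:.0f} MB, fp16 ~ {fp16_mb:.0f} MB")
```

These estimates (~328 MB and ~164 MB) line up well with the actual file sizes listed under Usage below.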

Installation

  1. Clone the repository:

    git clone https://github.com/yakhyo/kokoro-82m.git
    cd kokoro-82m
  2. Install dependencies:

    pip install -r requirements.txt
  3. Install espeak for text-to-speech support (Linux):

    apt-get install espeak -y
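After installation, a quick sanity check (not part of the repo; names and messages here are illustrative) can confirm that both the onnxruntime Python package and the espeak binary are visible from your environment:

```python
import importlib.util
import shutil

def check_deps() -> list[str]:
    """Return a list of missing dependencies (empty if everything is found)."""
    missing = []
    if importlib.util.find_spec("onnxruntime") is None:
        missing.append("onnxruntime (install via pip)")
    if shutil.which("espeak") is None:
        missing.append("espeak (install via apt-get)")
    return missing

print(check_deps() or "all dependencies found")
```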

Docker

docker build -t kokoro-docker . && docker run --rm -p 7860:7860 kokoro-docker

What this does:

  1. Builds the Docker image and tags it as kokoro-docker.
  2. Runs the container and maps port 7860 (container) to port 7860 (host).
  3. Automatically removes the container when it stops (--rm).

Access your app at http://localhost:7860 once it's running.
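To verify the published port from the host, a small helper (an illustrative sketch, not part of the repo) can probe whether anything is listening on 7860 yet:

```python
import socket

def port_open(host: str = "localhost", port: int = 7860, timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print("app reachable" if port_open() else "app not running yet")
```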


Usage

Download ONNX Model

Filename           Description                      Size
kokoro-quant.onnx  Mixed-precision model (faster)   169 MB
kokoro-v0_19.onnx  Original model                   330 MB
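Choosing between the two files is a speed/size vs. fidelity trade-off. A minimal loading sketch, assuming onnxruntime is installed and the chosen weights have been downloaded next to the script (the helper name `pick_model` is illustrative, not part of the repo):

```python
from pathlib import Path

# Model files from the table above.
MODELS = {
    "fast": "kokoro-quant.onnx",   # mixed precision, ~169 MB
    "full": "kokoro-v0_19.onnx",   # original, ~330 MB
}

def pick_model(prefer_fast: bool = True) -> str:
    """Return the filename of the preferred model variant."""
    return MODELS["fast" if prefer_fast else "full"]

model_path = pick_model()
if Path(model_path).exists():  # only load if the weights were downloaded
    import onnxruntime as ort
    session = ort.InferenceSession(model_path, providers=["CPUExecutionProvider"])
    print([i.name for i in session.get_inputs()])
```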

Jupyter Notebook Inference Example

Run inference using the Jupyter notebook:

example.ipynb

CLI Inference

Specify the input text and model weights in inference.py, then run:

python inference.py
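Since inference.py expects its inputs to be edited in the file itself, one common extension is a small command-line front end. The flags below are hypothetical (the repo's script may differ); this is only a sketch of how such an interface could look:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical CLI for a Kokoro-82M ONNX inference script.
    p = argparse.ArgumentParser(description="Kokoro-82M ONNX TTS")
    p.add_argument("--text", default="Hello world", help="text to synthesize")
    p.add_argument("--model", default="kokoro-quant.onnx", help="ONNX weights file")
    p.add_argument("--lang", choices=["en-us", "en-gb"], default="en-us")
    p.add_argument("--output", default="output.wav", help="where to write audio")
    return p

args = build_parser().parse_args(["--text", "Hi", "--lang", "en-gb"])
print(args.text, args.lang, args.output)
```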

Gradio App

Run the command below to start the Gradio app:

python app.py

License

This project is licensed under the MIT License.

The model weights are licensed under the Apache 2.0 License.

Acknowledgments