
ExUI

This is a simple, lightweight browser-based UI for running local inference using ExLlamaV2.

Overview of features

  • Friendly, responsive and minimalistic UI
  • Persistent sessions
  • Multiple instruct formats
  • Speculative decoding
  • Supports EXL2, GPTQ and FP16 models

Screenshots

(Screenshots of the chat interface)

Running locally

First, clone this repository and install requirements:

git clone https://github.com/turboderp/exui
cd exui
pip install -r requirements.txt
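If you'd rather keep ExUI's dependencies isolated from your system Python packages, you can install into a virtual environment first. This is a general Python practice rather than a step from the ExUI docs, and the `.venv` directory name below is an arbitrary choice:

```shell
# Create and activate an isolated environment (".venv" is an arbitrary name)
python -m venv .venv
source .venv/bin/activate        # on Windows: .venv\Scripts\activate
pip install -r requirements.txt  # install ExUI's dependencies into it
```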

Then run the web server with the included server.py:

python server.py

Running locally with pre-made scripts

On Windows:

run.cmd

On Linux:

start.sh

Running on Google Colab

A Colab notebook for ExUI is available here: https://colab.research.google.com/github/turboderp/exui/blob/master/colab.ipynb

Your browser should automatically open on the default IP/port. Config and sessions are stored in ~/exui by default.

Prebuilt wheels for ExLlamaV2 are available here. Installing the latest version of Flash Attention is recommended.
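If you want to install Flash Attention from PyPI, note that the package is named `flash-attn` and compiling it from source requires a CUDA toolchain. The command below is a general installation sketch based on the flash-attn project's own instructions, not an ExUI-specific step:

```shell
# Sketch: install Flash Attention from PyPI. Building requires CUDA and a
# working compiler; the flash-attn project recommends --no-build-isolation.
pip install flash-attn --no-build-isolation
```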

More to come

Stay tuned.


About

Web UI for ExLlamaV2
