Actions: fairydreaming/llama.cpp

Workflow: Python check requirements.txt

73 workflow runs

llama : inference support for FLAN-T5 model family
Run #23: Commit dae5b79 pushed by fairydreaming
Branch: t5 · June 23, 2024 09:02 · 3m 24s

gguf-py, convert-hf : add model conversion support for T5ForCondition…
Run #22: Commit da4f661 pushed by fairydreaming
Branch: t5-clean · June 21, 2024 13:26 · 3m 34s

Merge remote-tracking branch 'origin/master' into t5
Run #21: Commit 8bd8993 pushed by fairydreaming
Branch: t5 · June 21, 2024 11:40 · 16m 11s

vulkan: detect multiple devices by deviceUUID instead of deviceID (#8…
Run #20: Commit 557b653 pushed by fairydreaming
Branch: master · June 21, 2024 11:35 · 4m 38s

Merge remote-tracking branch 'origin/master' into t5
Run #19: Commit b3e4332 pushed by fairydreaming
Branch: t5 · June 20, 2024 18:16 · 3m 23s

common: fix warning (#8036)
Run #18: Commit abd894a pushed by fairydreaming
Branch: master · June 20, 2024 17:45 · 3m 32s

llama : add llama_model_decoder_start_token() API call that returns d…
Run #17: Commit e7bd870 pushed by fairydreaming
Branch: t5 · June 19, 2024 19:09 · 3m 22s

convert-hf : correct wrong variable name
Run #16: Commit b885765 pushed by fairydreaming
Branch: t5 · June 14, 2024 11:52 · 3m 20s

llama, convert-hf : correct tensor dimensions to support T5WithLMHead…
Run #15: Commit 2bd023d pushed by fairydreaming
Branch: t5 · June 14, 2024 07:57 · 3m 18s

Add initial support for T5ForConditionalGeneration.
Run #14: Commit 3d96a66 pushed by fairydreaming
Branch: t5 · June 13, 2024 20:08 · 3m 18s

rpc : fix ggml_backend_rpc_supports_buft() (#7918)
Run #13: Commit 172c825 pushed by fairydreaming
Branch: master · June 13, 2024 19:21 · 3m 47s

server : do not get prompt in infill mode (#7286)
Run #12: Commit a5cabd7 pushed by fairydreaming
Branch: master · June 7, 2024 08:19 · 3m 22s

llama : replace ggml_new_tensor_3d + ggml_set_inplace + ggml_set_inpl…
Run #11: Commit 841cd47 pushed by fairydreaming
Branch: deepseek-v2 · May 28, 2024 09:21 · 3m 26s

convert-hf : fix flake8 Lint errors
Run #10: Commit bde971a pushed by fairydreaming
Branch: deepseek-v2 · May 27, 2024 16:29 · 3m 17s

Merge remote-tracking branch 'upstream/master' into deepseek-v2
Run #9: Commit a54685b pushed by fairydreaming
Branch: deepseek-v2 · May 24, 2024 14:14 · 3m 25s

llama : fix whitespace formatting
Run #8: Commit 602c80d pushed by fairydreaming
Branch: snowflake-arctic-clean · May 24, 2024 10:52 · 4m 9s

Merge remote-tracking branch 'upstream/master' into snowflake-arctic
Run #7: Commit 759dd3e pushed by fairydreaming
Branch: snowflake-arctic · May 24, 2024 08:30 · 4m 19s

convert-hf : remove extra space
Run #6: Commit d5f4c76 pushed by fairydreaming
Branch: rinna-gpt-neox · May 23, 2024 14:45 · 3m 25s

convert-hf : add support for SentencePiece vocab in GPTNeoXForCausalL…
Run #5: Commit a3c9d52 pushed by fairydreaming
Branch: rinna-gpt-neox · May 23, 2024 13:29 · 3m 35s

llama : Replaced obsolete ggml_rope_custom() calls with ggml_rope_ext().
Run #4: Commit a1a5508 pushed by fairydreaming
Branch: snowflake-arctic-clean · May 22, 2024 14:11 · 3m 30s

llama : Replaced obsolete ggml_rope_custom() calls with ggml_rope_ext().
Run #3: Commit bf8cd28 pushed by fairydreaming
Branch: snowflake-arctic · May 22, 2024 13:53 · 3m 33s

Added missing support for GPTNeoXForCausalLM (Pythia and GPT-NeoX bas…
Run #2: Commit 720e886 pushed by fairydreaming
Branch: gpt-neox · May 22, 2024 10:10 · 20m 36s

CUDA: remove incorrect precision check (#7454)
Run #1: Commit 95fb0ae pushed by fairydreaming
Branch: master · May 22, 2024 09:10 · 1h 6m 6s