
Actions: fairydreaming/llama.cpp

flake8 Lint

44 workflow run results


| Run | Commit message | Commit | Branch | Pushed by | Date | Duration |
|---|---|---|---|---|---|---|
| flake8 Lint #44 | llama : add model types for various T5 variants | 2245d19 | t5 | fairydreaming | June 20, 2024 19:05 | 22s |
| flake8 Lint #43 | Merge remote-tracking branch 'origin/master' into t5 | b3e4332 | t5 | fairydreaming | June 20, 2024 18:16 | 1m 25s |
| flake8 Lint #42 | common: fix warning (#8036) | abd894a | master | fairydreaming | June 20, 2024 17:45 | 23s |
| | | | t5 | | June 20, 2024 17:31 | 21s |
| flake8 Lint #40 | llama : whitespace formatting fixes | fe6c942 | t5 | fairydreaming | June 20, 2024 16:12 | 22s |
| | | | t5 | | June 20, 2024 16:09 | 16s |
| | | | t5 | | June 20, 2024 11:11 | 17s |
| | | | t5 | | June 20, 2024 07:10 | 32s |
| flake8 Lint #36 | common : include llama_encode() in model warm-up sequence | 9267a13 | t5 | fairydreaming | June 20, 2024 07:03 | 23s |
| | | | t5 | | June 19, 2024 19:22 | 18s |
| | | | t5 | | June 19, 2024 19:09 | 22s |
| flake8 Lint #33 | llama : proper handling of batches, support for multiple sequences | 205fee3 | t5 | fairydreaming | June 17, 2024 09:41 | 19s |
| flake8 Lint #32 | llama : add method for getting padding token id | 442e276 | t5 | fairydreaming | June 14, 2024 12:07 | 19s |
| flake8 Lint #31 | convert-hf : correct wrong variable name | b885765 | t5 | fairydreaming | June 14, 2024 11:52 | 17s |
| | | | t5 | | June 14, 2024 10:53 | 17s |
| | | | t5 | | June 14, 2024 07:57 | 19s |
| flake8 Lint #28 | Add initial support for T5ForConditionalGeneration. | 3d96a66 | t5 | fairydreaming | June 13, 2024 20:08 | 19s |
| flake8 Lint #27 | rpc : fix ggml_backend_rpc_supports_buft() (#7918) | 172c825 | t5 | fairydreaming | June 13, 2024 19:53 | 17s |
| flake8 Lint #26 | rpc : fix ggml_backend_rpc_supports_buft() (#7918) | 172c825 | master | fairydreaming | June 13, 2024 19:21 | 31s |
| flake8 Lint #25 | server : do not get prompt in infill mode (#7286) | a5cabd7 | master | fairydreaming | June 7, 2024 08:19 | 3m 22s |
| flake8 Lint #24 | gguf-py, llama : whitespace formatting fixes | 3efb659 | deepseek-v2 | fairydreaming | May 28, 2024 11:30 | 19s |
| flake8 Lint #22 | convert-hf : fix flake8 Lint errors | bde971a | deepseek-v2 | fairydreaming | May 27, 2024 16:29 | 18s |
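For context, a workflow that produces push-triggered lint runs like the ones above might look roughly as follows. This is a hypothetical sketch of a GitHub Actions flake8 workflow, not the actual file from the llama.cpp repository; the job name, action versions, and Python version are assumptions.

```yaml
# Hypothetical sketch of a "flake8 Lint" workflow; the real
# .github/workflows file in the repository may differ.
name: flake8 Lint

on: [push, pull_request]   # runs above were all triggered by pushes

jobs:
  flake8-lint:
    runs-on: ubuntu-latest
    name: Lint
    steps:
      # Check out the commit being linted (e.g. 2245d19 on branch t5)
      - name: Check out source repository
        uses: actions/checkout@v4
      # Install a Python toolchain for flake8
      - name: Set up Python environment
        uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      # Run flake8 over the repository's Python sources
      - name: flake8 Lint
        uses: py-actions/flake8@v2
```

The short run durations in the table (mostly under 30 seconds) are consistent with a lightweight job like this: checkout, Python setup, and a flake8 pass over the Python scripts.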