Commit b14dd98
chore: Bump version
1 parent 29b6e9a

2 files changed: 7 additions, 1 deletion

CHANGELOG.md

Lines changed: 6 additions & 0 deletions
@@ -7,6 +7,12 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## [Unreleased]
 
+## [0.2.68]
+
+- feat: Update llama.cpp to ggerganov/llama.cpp@
+- feat: Add option to enable flash_attn to Lllama params and ModelSettings by @abetlen in 22d77eefd2edaf0148f53374d0cac74d0e25d06e
+- fix(ci): Fix build-and-release.yaml by @Smartappli in #1413
+
 ## [0.2.67]
 
 - fix: Ensure image renders before text in chat formats regardless of message content order by @abetlen in 3489ef09d3775f4a87fb7114f619e8ba9cb6b656

llama_cpp/__init__.py

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
 from .llama_cpp import *
 from .llama import *
 
-__version__ = "0.2.67"
+__version__ = "0.2.68"
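The `__version__` string changed above can be read back at runtime. A minimal sketch, assuming only the standard library; the `get_version` helper is hypothetical and returns None when the package is not installed, so it degrades gracefully on machines without llama-cpp-python:

```python
# Hypothetical helper: read a package's __version__ attribute at runtime.
# After this commit, llama_cpp.__version__ would report "0.2.68".
import importlib


def get_version(module_name):
    """Return module.__version__ if the module is importable, else None."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        return None
    return getattr(module, "__version__", None)


# Usage: only prints when llama-cpp-python is actually installed.
version = get_version("llama_cpp")
if version is not None:
    print("llama_cpp version:", version)
```

Tools such as pip read the same string via package metadata, which is why a release commit like this one touches only the changelog and `__init__.py`.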
