Describe the feature you'd like
Are there any plans to integrate an LLM into BookStack? It might sound insane, but it could be useful for locating information in one's BookStack library, or for refreshing one's memory on what's there. Think of using it, for instance, to create recaps of books or pages.
Describe the benefits this would bring to existing BookStack users
I'm collecting so much information and have so many projects going on that I find it increasingly difficult to keep track of my own records.
I assume many others have similar problems.
Can the goal of this request already be achieved via other means?
Only by manually browsing or searching for the right keywords.
Have you searched for an existing open/closed issue?
I have searched for existing issues and none cover my fundamental request
How long have you been using BookStack?
Over 5 years
Additional context
No response
I started on a proof of concept of a somewhat native integration of LLM based search back in March.
My draft PR branch with research can be found in #5552.
I just added a video preview of my proof of concept in a comment there to provide some visuals: #5552 (comment)
There are quite a few open questions and considerations around this though.
It's on pause right now, while I crack on with the next feature release, but my plan was to come back to it to develop it out a little further after this current release cycle is done.
From what I've heard, the bigger LLMs rely on GPU computation rather than the CPU. I actually don't currently have a GPU in my homelab server.
It would be great if LLM features were configurable, so that they're not in use at all unless explicitly enabled.
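For context, BookStack's existing feature toggles are driven by `.env` options, so an opt-in could plausibly follow the same pattern. The variable names below are purely hypothetical illustrations, not actual BookStack options:

```
# Hypothetical example only — not real BookStack options.
# LLM features stay fully disabled unless an endpoint is configured.
LLM_ENABLED=false
LLM_API_ENDPOINT=http://localhost:11434/v1
```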
Hi @Kristoffeh,
We probably wouldn't ship models as part of BookStack at all; this would be something optional on top of the default system, due to the external requirements.
My current implementation uses OpenAI-like APIs, which other providers seem to support as a somewhat unofficial standard.
The idea is that you'd be able to integrate with an external system of your choice that supports this API, including a self-hosted instance of something like Ollama using self-hosted models (potentially on another system), or existing LLM services.
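To make the "OpenAI-like API" idea concrete, here is a minimal, hypothetical sketch of querying such an endpoint (Ollama's OpenAI-compatible `/v1/chat/completions` route is used as the example URL; the model name and prompt wording are assumptions, and this is not BookStack's actual implementation):

```python
import json
import urllib.request

# Assumed endpoint: Ollama exposes an OpenAI-compatible API at /v1.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"


def build_chat_request(question: str, context: str, model: str = "llama3") -> dict:
    """Build an OpenAI-style chat completion payload that asks the model
    to answer using only the supplied wiki page content."""
    return {
        "model": model,
        "messages": [
            {
                "role": "system",
                "content": "Answer using only the provided wiki content:\n" + context,
            },
            {"role": "user", "content": question},
        ],
    }


def ask(question: str, context: str, url: str = OLLAMA_URL) -> str:
    """POST the payload and return the assistant's reply text."""
    payload = json.dumps(build_chat_request(question, context)).encode("utf-8")
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

Because the request/response shape is the common OpenAI one, the same sketch would work against any service that follows that de-facto standard by swapping the URL and model name.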
Hello again @ssddanbrown and thanks for taking the time.
Okay, that sounds great!