# Using KoboldCpp

Learn how to set up and configure KoboldCpp as a local AI backend for Arcania.
You can find the full KoboldCpp documentation here.
## Installation Steps
- Clone the repo:

  ```shell
  git clone https://github.com/LostRuins/koboldcpp
  cd koboldcpp
  ```
- Download the model: as an example, we will use the OpenChat 3.5 model, which is what the demo instance uses. There are many models to choose from. Navigate to [TheBloke/openchat_3.5-GGUF](https://huggingface.co/TheBloke/openchat_3.5-GGUF) and download one of the models, such as `openchat_3.5.Q5_K_M.gguf`. Place this file inside the `./models` directory.
- Build KoboldCpp:

  ```shell
  make
  ```
- Run the server:

  ```shell
  ./koboldcpp.py ./models/openchat_3.5.Q5_K_M.gguf
  ```
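Once the server is running, KoboldCpp exposes a KoboldAI-compatible HTTP API that Arcania (or any client) can call. As a quick sanity check, you can send a generation request yourself. The sketch below is a minimal example, assuming the default port `5001`, the `/api/v1/generate` endpoint, and a `{"results": [{"text": ...}]}` response shape; check your server's startup output for the actual address:

```python
import json
from urllib import request


def build_payload(prompt, max_length=80, temperature=0.7):
    """Build a request body for KoboldCpp's /api/v1/generate endpoint."""
    return {
        "prompt": prompt,
        "max_length": max_length,
        "temperature": temperature,
    }


def generate(prompt, host="http://localhost:5001"):
    """Send a prompt to a running KoboldCpp server and return the generated text.

    The endpoint path and response shape here follow the KoboldAI API;
    adjust them if your KoboldCpp version differs.
    """
    body = json.dumps(build_payload(prompt)).encode("utf-8")
    req = request.Request(
        f"{host}/api/v1/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        data = json.load(resp)
    return data["results"][0]["text"]
```

For example, `generate("Hello, how are you?")` should return a completion string once the server is up. If the request fails, confirm the model loaded successfully and that nothing else is bound to the port.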
## Configuration
- Select KoboldCpp as the backend: `settings -> ChatBot -> ChatBot Backend -> KoboldCpp`
- Configure KoboldCpp: `settings -> ChatBot -> KoboldCpp`
- Enable extra features: inside "Use KoboldCpp", ensure that "Use Extra" is enabled. This enables KoboldCpp's extra features, such as streaming.
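With "Use Extra" enabled, a client can consume KoboldCpp's streaming endpoint, which emits tokens incrementally as server-sent events instead of one final response. The helper below is a sketch of the client-side parsing, assuming the `/api/extra/generate/stream` endpoint and an SSE format where each `data:` line carries a JSON object with a `token` field; verify the exact event shape against your KoboldCpp version:

```python
import json


def parse_sse_tokens(lines):
    """Reassemble streamed text from SSE lines.

    Assumed format (KoboldCpp /api/extra/generate/stream): each event is a
    'data:' line whose payload is a JSON object like {"token": "..."}.
    Blank lines and other SSE fields are ignored.
    """
    tokens = []
    for line in lines:
        line = line.strip()
        if line.startswith("data:"):
            event = json.loads(line[len("data:"):].strip())
            tokens.append(event.get("token", ""))
    return "".join(tokens)
```

Fed the raw lines of a streaming response body, this yields the full generated text while letting you display each token as it arrives.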