Using Ollama
Learn how to set up and configure Ollama as a local AI backend for Arcania
You can find the full Ollama documentation here.
Installation Steps
Linux and WSL2
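Ollama publishes an install script for Linux and WSL2. A minimal sketch of the documented one-liner as of this writing; review the script at the URL before piping it into a shell:

```sh
# Download and run the official Ollama install script
curl -fsSL https://ollama.com/install.sh | sh
```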
macOS
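On macOS, Ollama ships as a desktop app downloadable from the Ollama website; if you prefer the command line, the Homebrew formula below is an alternative route (assuming Homebrew is already installed):

```sh
# Install the Ollama CLI and server via Homebrew
brew install ollama
```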
Windows
Not yet supported
Setup Process
- Start the server:
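Once installed, the server starts with a single command. By default it listens on http://localhost:11434; the OLLAMA_HOST environment variable overrides this if you need a different address:

```sh
# Start the Ollama server (listens on http://localhost:11434 by default)
ollama serve
```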
- Download a model: for this example we will use Mistral 7B; there are many other models to choose from in the Ollama library.
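In the Ollama library, Mistral 7B is published under the tag mistral, so pulling it looks like this:

```sh
# Download the Mistral 7B weights from the Ollama library
ollama pull mistral
```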
- Enable the server in the client:
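Arcania's settings dialog is not reproduced here; the value the client needs is the address of the running Ollama server, which defaults to http://localhost:11434. Before enabling it in the client, you can confirm the server is reachable with a quick request against Ollama's REST API:

```sh
# List the locally available models; a JSON response means the server is up
curl http://localhost:11434/api/tags
```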