From f8e68545698f09b6327656f026dd671c2b33879f Mon Sep 17 00:00:00 2001
From: SocksOnHead
Date: Sun, 2 Jun 2024 00:04:14 +0200
Subject: [PATCH] Create readme.md

---
 Bot/Handler/ChatCommand/readme.md | 8 ++++++++
 1 file changed, 8 insertions(+)
 create mode 100644 Bot/Handler/ChatCommand/readme.md

diff --git a/Bot/Handler/ChatCommand/readme.md b/Bot/Handler/ChatCommand/readme.md
new file mode 100644
index 0000000..5c26fa6
--- /dev/null
+++ b/Bot/Handler/ChatCommand/readme.md
@@ -0,0 +1,8 @@
+## Ollama - Large Language Model Chat - Handler
+
+This handler owns the logic for accessing the Ollama API, which runs the transformer model.
+
+> To get started with a local chat bot, see: [Run LLMs Locally using Ollama](https://marccodess.medium.com/run-llms-locally-using-ollama-8f04dd9b14f9)
+
+If the Ollama server runs on a different machine than the bot, configure it to accept connections from other machines on the network; this is not needed when the bot reaches Ollama on localhost.
+See: [How do I configure Ollama server?](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server)
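The handler described in this README talks to Ollama's chat endpoint (`POST /api/chat`). A minimal sketch of such a call is shown below; the model name `llama3` and the default host `localhost:11434` are assumptions, not values taken from this repository.

```python
import json
import urllib.request

# Default Ollama address; adjust if the server is configured differently.
OLLAMA_URL = "http://localhost:11434/api/chat"


def build_payload(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON body expected by Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete reply instead of a token stream
    }


def chat(prompt: str, model: str = "llama3") -> str:
    """Send a single-turn chat request and return the model's reply text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, model)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

Per the Ollama FAQ linked in the README, exposing the server to other machines on the network is done by setting the `OLLAMA_HOST` environment variable (e.g. `OLLAMA_HOST=0.0.0.0`) before starting the server.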