Compare commits


3 Commits

| Author | SHA1 | Message | Date |
|--|--|--|--|
| Myx | ddf5bfdd3d | Working LLM chatbot | 2024-06-01 23:19:53 +02:00 |
| Myx | a703f7f9a2 | Clean | 2024-06-01 12:59:45 +02:00 |
| Myx | 87b35a2203 | Working chatbot | 2024-06-01 12:34:03 +02:00 |
5 changed files with 21 additions and 61 deletions


```diff
@@ -7,13 +7,10 @@ on:
 jobs:
   build:
-    runs-on: ubuntu-latest
+    runs-on: windows-latest
     steps:
-      - name: Checkout
-        uses: actions/checkout@v2
-        with:
-          fetch-depth: 0 # required for github-action-get-previous-tag
+      - uses: actions/checkout@v2
       - name: Setup .NET
         uses: actions/setup-dotnet@v1
@@ -30,19 +27,15 @@ jobs:
         run: dotnet publish ./Bot/Lunaris2.csproj --configuration Release --output ./out
       - name: Zip the build
-        run: 7z a -tzip ./out/Lunaris.zip ./out/*
+        run: 7z a -tzip ./out/Bot.zip ./out/*
-      - name: Get previous tag
-        id: previoustag
-        uses: 'WyriHaximus/github-action-get-previous-tag@v1'
-        env:
-          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-      - name: Get next minor version
-        id: semver
-        uses: 'WyriHaximus/github-action-next-semvers@v1'
-        with:
-          version: ${{ steps.previoustag.outputs.tag }}
+      - name: Get the tag name
+        id: get_tag
+        run: echo "::set-output name=tag::${GITHUB_REF#refs/tags/}"
+      - name: Get the version
+        id: get_version
+        run: echo "::set-output name=version::$(date +%s).${{ github.run_id }}"
       - name: Create Release
         id: create_release
@@ -50,8 +43,8 @@ jobs:
       env:
         GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # This token is provided by Actions, you do not need to create your own token
       with:
-        tag_name: ${{ steps.semver.outputs.patch }}
+        tag_name: ${{ steps.get_version.outputs.version }}
-        release_name: Release ${{ steps.semver.outputs.patch }}
+        release_name: Release v${{ steps.get_version.outputs.version }}
       draft: false
       prerelease: false
@@ -62,6 +55,6 @@ jobs:
         GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
       with:
         upload_url: ${{ steps.create_release.outputs.upload_url }}
-        asset_path: ./out/Lunaris.zip
+        asset_path: ./out/Bot.zip
-        asset_name: Lunaris.zip
+        asset_name: Bot.zip
         asset_content_type: application/zip
```
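The replacement steps in the diff above rely on the `::set-output` workflow command, which GitHub Actions has since deprecated in favor of appending to the `$GITHUB_OUTPUT` file. A sketch of the equivalent tag extraction, simulated locally (in a real run, GitHub Actions sets `GITHUB_REF` and `GITHUB_OUTPUT` itself; the step id `get_tag` is taken from the diff):

```shell
# Simulate the "Get the tag name" step with $GITHUB_OUTPUT instead of
# the deprecated ::set-output command.
GITHUB_REF="refs/tags/v1.0.0"          # set by Actions on tag pushes
GITHUB_OUTPUT="$(mktemp)"              # set by Actions to a per-step file
echo "tag=${GITHUB_REF#refs/tags/}" >> "$GITHUB_OUTPUT"
cat "$GITHUB_OUTPUT"                   # prints: tag=v1.0.0
```

Later steps would then read the value as `${{ steps.get_tag.outputs.tag }}`, exactly as with `::set-output`.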


@@ -1,8 +0,0 @@
## Ollama - Large Language Model Chat - Handler
This handler "owns" the logic for accessing the ollama api, which runs the transformer model.
> How to get started with a local chat bot see: [Run LLMs Locally using Ollama](https://marccodess.medium.com/run-llms-locally-using-ollama-8f04dd9b14f9)
Assuming you are on the same network as the Ollama server you should configure it to be accessible to other machines on the network, however this is only required if you aren't running it from localhost relative to the bot.
See: [How do I configure Ollama server?](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server)
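The removed notes defer to the Ollama FAQ for the network setup; per that FAQ, the server's bind address is controlled by the `OLLAMA_HOST` environment variable. A minimal sketch (the `ollama serve` line is commented out since it requires Ollama installed):

```shell
# Expose the Ollama server to other machines on the LAN by binding to
# all interfaces instead of the default 127.0.0.1 (see the FAQ above).
export OLLAMA_HOST=0.0.0.0:11434
echo "$OLLAMA_HOST"
# ollama serve   # would now listen on 0.0.0.0:11434
```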


````diff
@@ -2,14 +2,9 @@
 ```mermaid
 flowchart TD
     Program[Program] -->|Register| EventListener
-    EventListener[DiscordEventListener] --> A[MessageReceivedHandler]
-    EventListener[DiscordEventListener] --> A2[SlashCommandReceivedHandler]
-    A --> |Message| f{If bot is mentioned}
-    f --> |ChatCommand| v[ChatHandler]
-    A2[SlashCommandReceivedHandler] -->|Message| C{Send to correct command by
+    EventListener[DiscordEventListener] --> A
+    A[MessageReceivedHandler] -->|Message| C{Send to correct command by
     looking at commandName}
     C -->|JoinCommand| D[JoinHandler]
@@ -17,33 +12,9 @@ flowchart TD
     C -->|HelloCommand| F[HelloHandler]
     C -->|GoodbyeCommand| G[GoodbyeHandler]
 ```
 Program registers an event listener ```DiscordEventListener``` which publishes a message:
 ```c#
 await Mediator.Publish(new MessageReceivedNotification(arg), _cancellationToken);
 ```
-| Name | Description |
-|--|--|
-| SlashCommandReceivedHandler | Handles commands using ``/`` from any Discord Guild/Server. |
-| MessageReceivedHandler | Listens to **all** messages. |
-## Handler integrations
-```mermaid
-flowchart TD
-    D[JoinHandler] --> Disc[Discord Api]
-    E[PlayHandler] --> Disc[Discord Api]
-    F[HelloHandler] --> Disc[Discord Api]
-    G[GoodbyeHandler] --> Disc[Discord Api]
-    v[ChatHandler] --> Disc[Discord Api]
-    v --> o[Ollama Server]
-    o --> v
-    E --> Lava[Lavalink]
-```
-| Name | Description |
-|--|--|
-| JoinHandler | Handles the logic for **just** joining a voice channel. |
-| PlayHandler | Handles the logic for joining and playing music in a voice channel. |
-| HelloHandler | Responds with Hello. (Dummy handler, will be removed) |
-| GoodbyeHandler | Responds with Goodbye. (Dummy handler, will be removed) |
-| ChatHandler | Handles the logic for LLM chat with the user. |
````


```diff
@@ -9,7 +9,7 @@
   "Token": "discordToken",
   "LavaLinkPassword": "youshallnotpass",
   "LavaLinkHostname": "127.0.0.1",
-  "LavaLinkPort": 2333,
+  "LavaLinkPort": 2333
   "LLM": {
     "Url": "http://192.168.50.54:11434",
     "Model": "gemma"
```


```diff
@@ -25,3 +25,7 @@ Lunaris2 is a Discord bot designed to play music in your server's voice channels
 ## Contributing
 Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.
+
+## License
+[MIT](https://choosealicense.com/licenses/mit/)
```