Nio LLM

Your own little LLM in your Matrix chatroom.

Installation

pip install git+https://github.com/Laurent2916/nio-llm.git

Usage

This project uses jsonargparse to handle command-line arguments.

To see the available options, run:

nio_llm --help

To run the bot, you can use command-line arguments, environment variables, a config file, or a mix of all three.
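Under the hood, a jsonargparse parser can expose all three mechanisms at once. The sketch below is only a rough illustration of such a setup, using the option names and defaults documented in the sections that follow; it mirrors jsonargparse's documented features, not this project's actual entry point.

# Illustrative sketch only: a jsonargparse parser accepting CLI flags,
# NIO_LLM_* environment variables, and an optional YAML config file.
from jsonargparse import ActionConfigFile, ArgumentParser

parser = ArgumentParser(prog="nio_llm", default_env=True, env_prefix="NIO_LLM")
parser.add_argument("--config", action=ActionConfigFile, help="optional YAML config file")

# required
parser.add_argument("--room", required=True, help="Matrix room to join")
parser.add_argument("--password", required=True, help="Matrix account password")
parser.add_argument("--username", required=True, help="Matrix account username")
parser.add_argument("--preprompt", required=True, help="system prompt given to the LLM")

# optional, with the defaults shown below
parser.add_argument("--device-id", default="nio-llm")
parser.add_argument("--homeserver", default="https://matrix.org")
parser.add_argument("--ggml-repoid", default="TheBloke/stable-vicuna-13B-GGML")
parser.add_argument("--ggml-filename", default="stable-vicuna-13B.ggmlv3.q5_1.bin")
parser.add_argument("--sync-timeout", type=int, default=30000)

args = parser.parse_args()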

Command line arguments

The first four options are required; the remaining options are optional and shown here with their default values.

nio_llm \
  --room <YOUR ROOM> \
  --password <YOUR PASSWORD> \
  --username <YOUR USERNAME> \
  --preprompt <YOUR PREPROMPT> \
  --device-id nio-llm \
  --homeserver https://matrix.org \
  --ggml-repoid TheBloke/stable-vicuna-13B-GGML \
  --ggml-filename stable-vicuna-13B.ggmlv3.q5_1.bin \
  --sync-timeout 30000
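The two GGML options point at a quantized model file on the Hugging Face Hub that is downloaded and run locally. The sketch below shows what that presumably looks like with huggingface_hub and llama-cpp-python; it is an assumption about the stack, not code taken from this repository.

# Assumed flow for --ggml-repoid / --ggml-filename: fetch the GGML file from
# the Hugging Face Hub, then load it with llama-cpp-python.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="TheBloke/stable-vicuna-13B-GGML",
    filename="stable-vicuna-13B.ggmlv3.q5_1.bin",
)
llm = Llama(model_path=model_path)

# Illustrative completion call; the prompt format follows stable-vicuna's
# "### Human / ### Assistant" convention.
reply = llm("### Human: Hello!\n### Assistant:", max_tokens=256, stop=["### Human:"])
print(reply["choices"][0]["text"])

Note that GGML files of this vintage predate the newer GGUF format, so an older llama-cpp-python release may be needed to load them.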

Environment variables

# required
export NIO_LLM_ROOM=<YOUR ROOM>
export NIO_LLM_PASSWORD=<YOUR PASSWORD>
export NIO_LLM_USERNAME=<YOUR USERNAME>
export NIO_LLM_PREPROMPT=<YOUR PREPROMPT>

# optional
export NIO_LLM_DEVICE_ID=nio-llm
export NIO_LLM_HOMESERVER=https://matrix.org
export NIO_LLM_GGML_REPOID=TheBloke/stable-vicuna-13B-GGML
export NIO_LLM_GGML_FILENAME=stable-vicuna-13B.ggmlv3.q5_1.bin
export NIO_LLM_SYNC_TIMEOUT=30000

Then run:

nio_llm
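For reference, the Matrix-related options correspond to concepts in matrix-nio (the "nio" in nio-llm). The snippet below only illustrates what each flag maps to in that library's documented API; it is not the bot's implementation.

# What the Matrix-related options roughly correspond to in matrix-nio.
import asyncio

from nio import AsyncClient, RoomMessageText


async def main() -> None:
    # --homeserver and --username
    client = AsyncClient("https://matrix.org", "<YOUR USERNAME>")
    # --password and --device-id
    await client.login("<YOUR PASSWORD>", device_name="nio-llm")
    # --room
    await client.join("<YOUR ROOM>")

    async def on_message(room, event: RoomMessageText) -> None:
        print(f"{room.display_name}: {event.body}")

    client.add_event_callback(on_message, RoomMessageText)
    # --sync-timeout, in milliseconds
    await client.sync_forever(timeout=30000)


asyncio.run(main())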

Config file

Create a config file with the following content:

# config_file.yaml

# required
room: <YOUR ROOM>
password: <YOUR PASSWORD>
username: <YOUR USERNAME>
preprompt: <YOUR PREPROMPT>

# optional
device_id: nio-llm
homeserver: https://matrix.org
ggml_repoid: TheBloke/stable-vicuna-13B-GGML
ggml_filename: stable-vicuna-13B.ggmlv3.q5_1.bin
sync_timeout: 30000

Then run:

nio_llm --config config_file.yaml

Special thanks