mirror of
https://github.com/Laurent2916/nio-llm.git
synced 2024-11-23 22:58:48 +00:00
📝 add Installation and Usage instructions
This commit is contained in:
parent
0479c69fbc
commit
1f368bf3d2
README.md | 82
@@ -6,6 +6,88 @@

Your own little LLM in your matrix chatroom.
## Installation

```bash
pip install git+https://github.com/Laurent2916/nio-llm.git
```

## Usage

This project uses [jsonargparse](https://github.com/omni-us/jsonargparse/) to handle its command line arguments.

To see the available options, run:

```bash
nio_llm --help
```

To run the bot, you can use command line arguments, environment variables, a config file, or any mix of the three.

### Command line arguments

Note that comment lines cannot appear between backslash continuations, so the required and optional options are labeled above the command instead:

```bash
# --room, --password, --username and --preprompt are required;
# the remaining options are optional (example values shown).
nio_llm \
  --room <YOUR ROOM> \
  --password <YOUR PASSWORD> \
  --username <YOUR USERNAME> \
  --preprompt <YOUR PREPROMPT> \
  --device-id nio-llm \
  --homeserver https://matrix.org \
  --ggml-repoid TheBloke/stable-vicuna-13B-GGML \
  --ggml-filename stable-vicuna-13B.ggmlv3.q5_1.bin \
  --sync-timeout 30000
```

### Environment variables

```bash
# required
export NIO_LLM_ROOM=<YOUR ROOM>
export NIO_LLM_PASSWORD=<YOUR PASSWORD>
export NIO_LLM_USERNAME=<YOUR USERNAME>
export NIO_LLM_PREPROMPT=<YOUR PREPROMPT>

# optional
export NIO_LLM_DEVICE_ID=nio-llm
export NIO_LLM_HOMESERVER=https://matrix.org
export NIO_LLM_GGML_REPOID=TheBloke/stable-vicuna-13B-GGML
export NIO_LLM_GGML_FILENAME=stable-vicuna-13B.ggmlv3.q5_1.bin
export NIO_LLM_SYNC_TIMEOUT=30000

nio_llm
```

### Config file

Create a config file with the following content:

```yaml
# config_file.yaml

# required
room: <YOUR ROOM>
password: <YOUR PASSWORD>
username: <YOUR USERNAME>
preprompt: <YOUR PREPROMPT>

# optional
device_id: nio-llm
homeserver: https://matrix.org
ggml_repoid: TheBloke/stable-vicuna-13B-GGML
ggml_filename: stable-vicuna-13B.ggmlv3.q5_1.bin
sync_timeout: 30000
```

Then run:

```bash
nio_llm --config config_file.yaml
```

## Special thanks
- https://github.com/abetlen/llama-cpp-python