# Nio LLM

[![GitHub](https://img.shields.io/github/license/Laurent2916/nio-llm)](https://github.com/Laurent2916/nio-llm/blob/master/LICENSE)
[![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/psf/black)
[![Ruff](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/charliermarsh/ruff/main/assets/badge/v1.json)](https://github.com/charliermarsh/ruff)

Your own little LLM in your Matrix chatroom.

## Usage

This project is split into two parts: the client and the server.

The server simply downloads an LLM and starts a llama-cpp-python server, which exposes an OpenAI-compatible API.
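For illustration only, a rough equivalent of that server half (not this project's actual code, and assuming `llama-cpp-python` is installed with its server extras, e.g. `pip install "llama-cpp-python[server]"`) could look like the sketch below; the model repository and filename are placeholders for whatever GGUF model you want to serve:

```python
# Sketch: download a GGUF model, then start the llama-cpp-python server,
# which serves an OpenAI-compatible API (by default on http://localhost:8000/v1).
import subprocess
import sys

from huggingface_hub import hf_hub_download

# Placeholder model -- swap in any GGUF repo/file you want to serve.
model_path = hf_hub_download(
    repo_id="TheBloke/Llama-2-7B-Chat-GGUF",
    filename="llama-2-7b-chat.Q4_K_M.gguf",
)

# Launch the OpenAI-compatible server on port 8000.
subprocess.run(
    [sys.executable, "-m", "llama_cpp.server", "--model", model_path, "--port", "8000"],
    check=True,
)
```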
The client connects to the Matrix homeserver and queries the llama-cpp-python server to generate replies as Matrix messages.
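As an illustrative sketch only (placeholder homeserver, user, and credentials, not this project's actual code), the client half boils down to a matrix-nio callback that forwards each incoming room message to the OpenAI-compatible endpoint and posts the completion back:

```python
# Sketch: a matrix-nio bot that answers room messages via the local
# llama-cpp-python server's OpenAI-compatible API.
import asyncio

from nio import AsyncClient, MatrixRoom, RoomMessageText
from openai import OpenAI

llm = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
matrix = AsyncClient("https://matrix.example.org", "@nio-llm:example.org")  # placeholders


async def on_message(room: MatrixRoom, event: RoomMessageText) -> None:
    # Skip our own messages so the bot doesn't reply to itself.
    if event.sender == matrix.user_id:
        return
    completion = llm.chat.completions.create(
        model="local",  # llama-cpp-python serves a single local model
        messages=[{"role": "user", "content": event.body}],
    )
    await matrix.room_send(
        room_id=room.room_id,
        message_type="m.room.message",
        content={"msgtype": "m.text", "body": completion.choices[0].message.content},
    )


async def main() -> None:
    await matrix.login("password")  # placeholder credentials
    matrix.add_event_callback(on_message, RoomMessageText)
    await matrix.sync_forever(timeout=30_000)


asyncio.run(main())
```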
## Special thanks
- https://github.com/abetlen/llama-cpp-python
- https://github.com/ggerganov/llama.cpp/