author     Oxbian <oxbian@mailbox.org>  2025-05-17 23:30:07 -0400
committer  Oxbian <oxbian@mailbox.org>  2025-05-17 23:30:07 -0400
commit     fecb211130ec487c7c617d28419c6d0097f19783 (patch)
tree       ade91c5eefb7d9af6f68357f897d4b670f325f81 /README.md
parent     dd9808b10c98c28a493eac78742fc403efc70e32 (diff)
feat: wikipedia module
Diffstat (limited to 'README.md')
-rw-r--r--  README.md  13
1 file changed, 8 insertions, 5 deletions
diff --git a/README.md b/README.md
index 96795fd..31f2181 100644
--- a/README.md
+++ b/README.md
@@ -5,6 +5,11 @@ Néo AI, a personnal assistant using LLM.
A TUI interface for a local llama.cpp LLM; in the future, more functionality will
be added to this AI.
+> [!CAUTION]
+> This project is designed to help me understand modern techniques around AI and
+> LLMs. Other projects like [txtchat](https://github.com/neuml/txtchat) and
+> [LangRoid](https://langroid.github.io/langroid/) are more advanced and better suited for real use.
+
## Usage
### Dependencies
@@ -36,13 +41,12 @@ and there you go !
- Conversations are saved as JSON files in the `conv/` folder and can be reused with other LLMs.
- In normal mode, the conversation can be summarized by the LLM into a bullet-point list.
- The LLM can be configured through the configuration files in `config/`.
-- Requests are routed thanks to the LLM to other expert LLM. Code questions are send to a code expert, wikipedia questions are send to a system which provide better factual response.
+- Requests are routed by the LLM to other expert LLMs. Code questions are sent to a code expert; Wikipedia questions are sent to a module that uses a Kiwix API to provide data from Wikipedia.
## TODO
-- Color change if it's an user or the LLM
-- Async request to the LLM API
-- Start the real fun
+- Change the message color depending on whether it comes from the user or the LLM (not sure how to do it in this code base)
+- Connect and try the LLM / tools
## Inspiration and reason
@@ -50,5 +54,4 @@ Why in Rust ?
Because I wanted to learn Rust, and the language is fast and powerful.
-
- [WilmerAI](https://github.com/SomeOddCodeGuy/WilmerAI/), a system where all inferences are routed to other expert LLMs.
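
The updated README says Wikipedia questions are routed to a module that uses a Kiwix API. As illustration only, here is a minimal sketch of how such a module might query a local kiwix-serve instance over HTTP. The base URL, port, book name, and the `/search` endpoint with its `books.name`/`pattern` parameters are assumptions about a typical kiwix-serve setup, not the actual NAI code.

```rust
// Illustrative sketch only: querying a local kiwix-serve instance for
// Wikipedia search results. Endpoint, parameters, book name, and port are
// assumptions about a typical kiwix-serve setup, not taken from NAI.
//
// Cargo.toml (assumed): reqwest = { version = "0.12", features = ["blocking"] }

use std::error::Error;

/// Send a full-text search request to kiwix-serve and return the raw body.
fn search_kiwix(base_url: &str, book: &str, query: &str) -> Result<String, Box<dyn Error>> {
    let client = reqwest::blocking::Client::new();
    let response = client
        .get(format!("{}/search", base_url))
        .query(&[("books.name", book), ("pattern", query), ("pageLength", "5")])
        .send()?
        .error_for_status()?;
    Ok(response.text()?)
}

fn main() -> Result<(), Box<dyn Error>> {
    // Hypothetical local kiwix-serve serving a Wikipedia ZIM file.
    let body = search_kiwix(
        "http://localhost:8080",
        "wikipedia_en_all_nopic",
        "Rust (programming language)",
    )?;
    println!("{body}");
    Ok(())
}
```

On current kiwix-serve versions the search endpoint returns an HTML results page, so a real module would still need to parse it (or fetch individual articles) before handing text to the LLM.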