- Agentic workflow built with LangGraph (see the graph sketch after this list)
- RAG over personal data
- Short-term context retention
- Multimodal input (image and text only for now)
- Uses OpenAI models
- Per-user permissions
- Per-user custom prompts
- Fetches YouTube transcripts through a Tor proxy (see the sketch after this list)
- Ask the bot in a DM or by mentioning it in a channel
- As the LLM generates each token, the answer message is updated in real time (see the streaming sketch after this list)
- Asking a follow-up in the same Slack message thread keeps the short-term context
- Attach an image to ask questions about it
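A minimal sketch of how a LangGraph chatbot can keep per-thread short-term context, assuming the `langgraph` and `langchain-openai` packages; the node names, model choice, and the idea of keying the checkpointer on the Slack `thread_ts` are illustrative assumptions, not the repo's actual wiring:

```python
# Illustrative sketch only, not the repo's actual graph.
from langgraph.graph import StateGraph, MessagesState, START, END
from langgraph.checkpoint.memory import MemorySaver
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(model="gpt-4o")  # model name is an assumption

def chatbot_node(state: MessagesState) -> dict:
    # Call the model on the accumulated thread messages and append its reply.
    return {"messages": [llm.invoke(state["messages"])]}

builder = StateGraph(MessagesState)
builder.add_node("chatbot", chatbot_node)
builder.add_edge(START, "chatbot")
builder.add_edge("chatbot", END)

# An in-memory checkpointer keeps per-thread state; using the Slack thread_ts
# as the thread_id gives the short-term context retention described above.
app = builder.compile(checkpointer=MemorySaver())

def ask(question: str, slack_thread_ts: str) -> str:
    result = app.invoke(
        {"messages": [("user", question)]},
        config={"configurable": {"thread_id": slack_thread_ts}},
    )
    return result["messages"][-1].content
```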
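A hedged sketch of fetching a YouTube transcript through a local Tor SOCKS proxy, assuming `youtube-transcript-api` (the pre-1.0 API that accepts a `proxies` dict), `requests[socks]`, and Tor listening on its default port 9050; the helper name and video ID are placeholders:

```python
# Illustrative sketch: route the transcript request through a local Tor proxy.
from youtube_transcript_api import YouTubeTranscriptApi

TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",   # Tor's default SOCKS port (assumption)
    "https": "socks5h://127.0.0.1:9050",
}

def fetch_transcript(video_id: str, languages=("en",)) -> str:
    segments = YouTubeTranscriptApi.get_transcript(
        video_id, languages=languages, proxies=TOR_PROXIES
    )
    return " ".join(seg["text"] for seg in segments)

if __name__ == "__main__":
    print(fetch_transcript("dQw4w9WgXcQ")[:200])  # placeholder video ID
```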
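A rough sketch of the real-time answer updates, assuming the official `openai` and `slack_sdk` clients: post a placeholder message, then edit it with `chat_update` as streamed tokens arrive. The channel, model, and throttle interval are assumptions, not the project's actual values:

```python
# Illustrative sketch of token-by-token message updates in Slack.
import time
from openai import OpenAI
from slack_sdk import WebClient

openai_client = OpenAI()              # reads OPENAI_API_KEY from the environment
slack = WebClient(token="xoxb-...")   # bot token from the .env file

def answer_in_realtime(channel: str, thread_ts: str, question: str) -> None:
    # Post a placeholder reply first, then keep editing it as tokens arrive.
    first = slack.chat_postMessage(channel=channel, thread_ts=thread_ts, text="...")
    ts, text, last_edit = first["ts"], "", 0.0

    stream = openai_client.chat.completions.create(
        model="gpt-4o",  # model name is an assumption
        messages=[{"role": "user", "content": question}],
        stream=True,
    )
    for chunk in stream:
        text += chunk.choices[0].delta.content or ""
        # Throttle edits to stay under Slack's rate limits.
        if time.time() - last_edit > 1.0:
            slack.chat_update(channel=channel, ts=ts, text=text)
            last_edit = time.time()
    slack.chat_update(channel=channel, ts=ts, text=text)  # final full answer
```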
- Python 3.11
- Install the dependencies from `requirements.txt` (e.g. `pip install -r requirements.txt`)
- Create a Slack app and add the proper permissions
- Set up the `.env` file by referring to `.env_sample` (illustrative example below)
- Run `python chatbot.py`
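The authoritative list of variables is in `.env_sample`; purely as an illustration of the file format, a Slack + OpenAI bot's `.env` usually holds credentials along these lines (the key names here are assumptions, not necessarily the project's):

```
# Illustrative only — copy .env_sample and fill in the keys it actually lists.
SLACK_BOT_TOKEN=xoxb-...
SLACK_APP_TOKEN=xapp-...
OPENAI_API_KEY=sk-...
```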
Configure `deploy/config.sh` and run the commands below. This requires Docker and docker-compose on the server; tested on an EC2 server.

```sh
cd deploy
sh deploy_server.sh
```
- Local chatbot for device automation
- Personal AI assistant
- Company chatbot (not updated here)
- RAG over source code, letting the user choose the scope, similar to the Cursor IDE