Try this before submitting an issue #19
Comments
Hi, great bot, thank you! How would I run multiple instances on the same VPS? |
Thanks! You can run as many instances of llmcord as you want. Simply clone the repo multiple times and run each one independently. |
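The "clone it multiple times" approach can be sketched roughly as below. Directory names are made up, and `mkdir` stands in for `git clone` so the sketch runs anywhere; in practice each directory would be a full clone with its own config.yaml and its own process:

```shell
# One directory per bot; each would normally come from `git clone <repo-url> <dir>`.
for bot in bot-alpha bot-beta; do
  mkdir -p "$bot"
  echo "start: (cd $bot && python llmcord.py)"
done
```

Each instance then runs independently, so one bot crashing or restarting doesn't affect the others.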
I’ve been trying to launch them using Docker commands with separate config files and they all fail, lol (6 hours trying…). It took me a while to get the 2nd config to work, but it’s good now, thank you. I want to make a few different types of bots, all for the same server, running 24/7 on a cheap $5-a-month VPS. I’m just waiting for my main ComfyUI cloud service to get API calls; then the next stage will be a ComfyUI bot, calling the GPU only when it is generating. |
So you mean multiple instances of llmcord in 1 docker container? You're not the first person to ask about this. I definitely see the value in making this simpler to accomplish. I'll keep thinking... |
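One way to simplify the multi-bot Docker setup is a single compose file with one service per bot, each mounting its own config. A sketch only; the service names, config filenames, and the `/app/config.yaml` mount path are assumptions, not taken from the repo:

```yaml
# Hypothetical docker-compose.yaml: one llmcord service per bot,
# each with its own config file mounted over the default one.
services:
  bot-alpha:
    build: .
    volumes:
      - ./config-alpha.yaml:/app/config.yaml
  bot-beta:
    build: .
    volumes:
      - ./config-beta.yaml:/app/config.yaml
```

With something like this, `docker compose up -d` would start all bots at once instead of managing each container by hand.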
It's easy. I'm on a trip. When I'm back in the office next week I'll look at what I did there. |
I’m up to 3 now, and of course 3 containers lol |
Awesome 😎 @mkagit you back yet? 👀 |
I'll try it! |
Installed this today, really easy and straightforward. Looking for a way to add some backstory for each server user through Discord. Possibly a pinned message with this info would help if this was always included in the context window. Thanks. |
One idea is to use threads. You can simply send a message introducing yourself and then create a thread from it (screenshot omitted). Now every time you start a conversation within that thread, it will remember the original message (screenshot omitted). This functionality has existed in llmcord for a while now. Here's the description from the README:
...Maybe I should reword that description to better highlight this use case. |
Thanks for the reply. That solves the problem for me for channel messages. Just need something for DMs now. |
I'm thinking of adding user/channel specific system prompts to the config soon. This would accomplish what you want. Stay tuned :) |
Questions that are probably dumb but I'll ask anyway.

Can this bot read the channel history? I saw another one made in JavaScript that could read a channel's history back to about 100 messages for context, but I wasn't sure if that was implemented here. I'm not particularly good at programming so I couldn't check the code myself to find it.

Also, big question: will there be support for the OpenAI Assistants API? I have a pre-built assistant that just needs to be referenced as the model to use, but I don't think this bot accepts the Assistants API. Does it and I just didn't see, or will that be a future addition, or is that not on the list?

New to programming, but these were just some questions I had in setting this up for my server. Thank you for making it! It's fantastic regardless of my few issues. |
It only sees messages in the current reply chain. No plans for anything beyond that at the moment.
No - assistants work quite differently than regular LLMs so they wouldn't really make sense in llmcord. It'd be better off as an entirely separate project designed from the ground up for assistants. Thanks for the positive feedback! :) |
Of course! I guess I'll just do my own coding adventure over Christmas to give it some more complete memory. I figured a way around the assistants thing, now I just gotta figure out how that guy got the history in the first place :D |
In discord.py you can do channel.history(...); see more details here: https://discordpy.readthedocs.io/en/stable/api.html?highlight=history#discord.TextChannel.history |
Thank you so much! |
Something like this should work:
Or if you're using Docker:
This will save all the output logs to a text file. Note that only incoming user messages are shown in the log, not the bot's responses. You could add more logging in the code yourself fairly easily. |
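The commands themselves weren't captured above, but the approach described is plain output redirection. A runnable sketch, with `echo` standing in for the real process (the `python llmcord.py` entry point and `docker logs` variant mentioned in the comment are assumptions about your setup):

```shell
# Append both stdout and stderr to a text file. In practice the first command
# would be `python llmcord.py`, or `docker logs -f <container>` under Docker.
echo "incoming user message" >> llmcord.log 2>&1
cat llmcord.log
```

The `2>&1` part matters: error output goes to stderr, so without it your log file would miss tracebacks.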
Awesome! That's perfect. Thank you so much! |
Is there a way to ban specific users? I have a user who is being rather difficult, but enough ppl are using the bot that it'd be difficult to manually add all of them as allowed users. |
No official way to do that at the moment. You could create a special role and add it to your allowed roles config. You could also just hack the code and add a simple user ID check. Maybe this needs to be easier. |
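The "simple check" idea can be sketched like this; the ID below is a placeholder and llmcord has no built-in blocklist, so this is purely illustrative:

```python
# Hypothetical blocklist: drop messages from specific Discord user IDs
# before the bot builds a reply.
BLOCKED_USER_IDS = {123456789012345678}  # placeholder ID, not a real user

def should_ignore(author_id: int) -> bool:
    """True if this author is blocked and the message should be dropped."""
    return author_id in BLOCKED_USER_IDS
```

In the message handler you'd bail out early, e.g. `if should_ignore(message.author.id): return`, before any LLM call happens.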
Sounds good. I used the special role option, but that disables DMs so it's a temporary fix. I'll work on getting a more concrete system in on Saturday. I'll probably just add what you said, but poking through the code trying to learn how it works is always fun XD |
Yeah, with how it works currently, DMs are either allowed for EVERYBODY or blocked for EVERYBODY. I agree that this isn't ideal... I'm thinking I should add a config option for this. |
Done! 8d4d9cc |
1. Force-update to the latest version of llmcord: git fetch && git reset --hard origin/main
2. (No Docker) Update python packages:
3. (With Docker) Rebuild the container:
4. Make sure your config.yaml is up-to-date with config-example.yaml (this project is WIP so the config structure is subject to change)
5. If you're using something like Ollama, make sure it's up-to-date