
[Question] Can I use this docker image if I want to run an on-demand pod (not serverless) ? #80

Open
vesper8 opened this issue Nov 20, 2024 · 3 comments
Labels
bug Something isn't working


vesper8 commented Nov 20, 2024

I just created a new template on Runpod with these settings:

[Screenshot of the template settings, 2024-11-20]

I then deployed a new on-demand 4090 instance from the community cloud

The deployment worked, but I cannot access ports 8188 or 8000, and if I try to SSH in, the connection works for a second and then immediately disconnects.

I am able to deploy other templates on this community cloud GPU, so I think the problem must be with the Docker image or the template.

I don't want to use the serverless offering on Runpod because the on-demand is much cheaper for me. Is there a reason why it isn't working if I don't set it to serverless?

Many thanks

@vesper8 vesper8 added the bug Something isn't working label Nov 20, 2024
TimPietrusky (Member) commented

@vesper8 thanks for reporting this.

This image is intended to be used as a serverless endpoint. However, if you set the environment variable SERVE_API_LOCALLY to true, you can access the ComfyUI frontend on port 8188, and the container will not shut down immediately.
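As a sketch, in a Pod template that would mean setting the env var and exposing both HTTP ports. Run locally with Docker, it could look roughly like this (the image name and tag are assumptions, check the repo's releases for the real ones):

```shell
# Sketch only: image name/tag are assumptions, check the repo for the real ones.
# SERVE_API_LOCALLY=true keeps ComfyUI (port 8188) and the local API (port 8000) running.
docker run --gpus all \
  -e SERVE_API_LOCALLY=true \
  -p 8188:8188 \
  -p 8000:8000 \
  timpietruskyblibla/runpod-worker-comfy:latest
```

On RunPod itself you would set SERVE_API_LOCALLY=true and the exposed HTTP ports 8188/8000 in the template settings instead of running Docker yourself.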

That being said, RunPod also provides ComfyUI templates for Pods, which you can find via Explore, for example "ComfyUI with Flux1-Dev".

Please let me know if this is working out for you!


vesper8 commented Nov 20, 2024

@TimPietrusky Thank you for your reply!

I'm most interested in using the API that your worker exposes. My intent is to use my Pods only via API, and I especially like that your worker allows image uploads in base64, which other Pod templates don't seem to provide.

So all I have to do is set the env variable SERVE_API_LOCALLY to true and I'll be able to use it the same way I've been using it on my local network?

I do like RunPod's serverless offering with the autoscaling and all, but I figure I can achieve the same functionality with regular Pods and manual (auto)scaling for a fraction of the price if I use spot instances and instances from their community cloud. I'm not rich enough to go full serverless right now.


TimPietrusky commented Nov 21, 2024

@vesper8 thanks for the clarification. So what you want is to use the API from this project, but run it on a Pod to save money.

I think this should work: when you activate SERVE_API_LOCALLY, this project exposes an API that looks exactly the same as the one exposed on serverless. But I haven't done this on a Pod yet, so please let me know how it works out for you. In theory https://github.com/blib-la/runpod-worker-comfy?tab=readme-ov-file#access-the-local-worker-api applies, and you should be able to use the API on port 8000 (and of course you need to expose this port on your Pod if you want to access it).
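As a rough sketch, building a request body for that API could look like the snippet below. This assumes the local API accepts the same input shape as the serverless endpoint (a workflow JSON plus base64-encoded input images); `build_payload` is a hypothetical helper, not part of this project.

```python
import base64


def build_payload(workflow: dict, image_path: str, image_name: str) -> dict:
    """Build a request body for the worker API (hypothetical helper).

    Assumes the local API mirrors the serverless contract:
    a ComfyUI workflow JSON plus base64-encoded input images.
    """
    with open(image_path, "rb") as f:
        encoded = base64.b64encode(f.read()).decode("utf-8")
    return {
        "input": {
            "workflow": workflow,
            "images": [{"name": image_name, "image": encoded}],
        }
    }
```

You could then POST this to `http://<pod-ip>:8000/runsync` with something like `requests.post(url, json=payload)`, again assuming the local API mirrors the serverless contract described in the README.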

I think the only thing you will not get is authentication, but it might be possible to enable that somehow as well, so there is some stuff to explore :D
