Running multi-GPU setup with shared or distributed VRAM usage #6230
BrechtCorbeel started this conversation in General
Does anybody have solutions for making better use of multiple GPUs with ComfyUI?
NVIDIA's pricing is getting out of hand for very little VRAM and very little increase in CUDA cores. The 5090 is a disaster: there should have been a 32GB 4090 Ti, and the 5090 should have been 48GB. They purposely undercut the consumer GPUs because the same specs with higher VRAM cost 3-10 times as much in their business GPU range.
A solution would be to buy 5x 4060 Ti (16GB each): roughly the same specs as a 5090, but 80GB of VRAM in total. I think this can already be done for training, but for actually producing images, text, and videos, inference still runs on a single GPU.
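To make the idea concrete, here is a back-of-the-envelope sketch of how a model's layers could be spread over five 16GB cards, similar in spirit to Hugging Face Accelerate's `device_map="balanced"`. This is an illustration, not ComfyUI's actual loader; the layer names, sizes, and the `build_device_map` helper are all hypothetical.

```python
# Sketch: greedy "device map" that spreads model layers across several GPUs.
# Layer sizes and the 16 GB-per-card figure are illustrative assumptions.

def build_device_map(layer_sizes_gb, num_gpus, vram_per_gpu_gb):
    """Assign each layer to whichever GPU currently has the most free VRAM."""
    free = [vram_per_gpu_gb] * num_gpus
    device_map = {}
    for name, size in layer_sizes_gb.items():
        gpu = max(range(num_gpus), key=lambda i: free[i])
        if free[gpu] < size:
            raise MemoryError(f"{name} ({size} GB) does not fit on any GPU")
        free[gpu] -= size
        device_map[name] = f"cuda:{gpu}"
    return device_map

# Hypothetical 40 GB model split over 5x 4060 Ti (16 GB each, 80 GB total):
layers = {f"block.{i}": 2.5 for i in range(16)}  # 16 blocks x 2.5 GB = 40 GB
dmap = build_device_map(layers, num_gpus=5, vram_per_gpu_gb=16)
```

Note that a real pipeline split for inference would assign contiguous blocks to each card to keep activation transfers cheap; the point here is only that the weights themselves fit once the memory is pooled.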