RE: Adding external GPU to my AI box
thecrazygm (71) in #ai • 17 hours ago
I think it's impressive. Using 2 GPUs and booting from flash is a bit overkill, but it would make me giggle.
It would be another 24 GB of VRAM, letting me keep GPT-OSS-120B almost fully in VRAM and probably speeding it up 200%.
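For reference, a minimal sketch of how that two-GPU split might look with llama.cpp via llama-cpp-python (the runtime choice is an assumption on my part, and the file name is hypothetical):

```python
from llama_cpp import Llama

# Hypothetical GGUF path; point this at the actual quantized file.
llm = Llama(
    model_path="gpt-oss-120b-mxfp4.gguf",
    n_gpu_layers=-1,          # offload every layer to the GPUs
    tensor_split=[0.5, 0.5],  # split the weights across the two cards (even split assumed)
    n_ctx=8192,
)

# Quick smoke test once the model is loaded.
print(llm("Hello", max_tokens=16)["choices"][0]["text"])
```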
How long would it take to load the GGUF from flash, though? I guess it only has to do it once.
Probably 10-20 minutes instead of 10-20 seconds, but I only do it once every 3 months or so. I do want to try a 5090 on this setup; that would be a massive boost.
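A back-of-the-envelope check on those numbers (the ~60 GB model size and the throughput figures below are assumptions, not measurements):

```python
# Rough load-time estimate: GGUF size divided by sequential read throughput.
model_gb = 60  # assumed size of a quantized GPT-OSS-120B GGUF

for name, mb_per_s in [("USB flash drive", 80), ("NVMe SSD", 3500)]:
    seconds = model_gb * 1024 / mb_per_s
    if seconds > 120:
        print(f"{name}: ~{seconds / 60:.0f} min")
    else:
        print(f"{name}: ~{seconds:.0f} s")
```

With those assumed speeds it comes out to roughly 13 minutes from a flash drive versus under 20 seconds from NVMe, which is the same ballpark as the 10-20 minutes vs. 10-20 seconds above.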