Multi-GPU hardware setups: My method (feel free to share your own!)

Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1958884 - Posted: 6 Oct 2018, 19:31:57 UTC
Last modified: 6 Oct 2018, 19:43:48 UTC

So I wanted to create a thread about multi-GPU machines. I'll show you how I go about my builds, to give people a better visual and maybe introduce someone to something new.

I know most people here have historically run up to 4 GPUs on some rather expensive hardware (which pretty much requires watercooling due to density and card proximity). But if you only want to run SETI, or other tasks that don't heavily rely on PCIe bandwidth, you can get more GPUs in a single machine by using risers and significantly less expensive motherboards. From what others have commented in other threads here, projects like Einstein@Home do see a performance hit on low-bandwidth PCIe connections, so keep your own goals in mind when setting up your machine. For this thread I'll focus on SETI, which sees little or no performance impact all the way down to a PCIe x1 interface *as long as you have at least PCIe gen2, and ideally gen3*; you WILL see a performance hit if you try this method on very old PCIe gen1 hardware.
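
If you want to double-check what link your risers actually negotiated, something like this works on Linux with the NVIDIA driver installed (just a quick sketch using nvidia-smi's query fields; adjust to taste):

import subprocess

# Ask the driver what PCIe generation and link width each GPU is currently running at.
fields = "index,name,pcie.link.gen.current,pcie.link.width.current"
out = subprocess.run(
    ["nvidia-smi", f"--query-gpu={fields}", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout
print(out)  # a gen2 x1 riser link shows up as "..., 2, 1"

Keep in mind that idle cards can drop to a lower link state, so check while a task is actually running.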

There's more than one way to do this, but my goal on this system was to get everything inside a 4U server chassis that could be mounted in a rack. It's not currently in a rack, but I have that option should I want it. I also didn't want to spend too much money on the parts that aren't doing the work.

So my build:
Case: Rosewill RSV-L4500, front HDD cages and all front bracketry completely removed.
GPU bracket: Spotswood drop-in bracket assembly: http://spotswoodcomputercases.com/wp/?page_id=9120
PSU1: HP common slot 1200W (900W on 120V) - powering the motherboard and 4x GPUs
PSU2: HP common slot 750W - powering 3x GPUs
2x PSU breakout boards : https://www.amazon.com/Supply-Breakout-Adapter-Support-Ethereum/dp/B078VMMV6D/ref=sr_1_8?s=electronics&ie=UTF8&qid=1538852429&sr=1-8&keywords=breakout+board
Custom PCIe 6-pin -> MB 8-pin and CPU 4-pin (to power the PicoPSU); I made this adapter myself, no source to buy it that I know of; not needed if you use a more normal PSU setup
PSU3: 120W PicoPSU for the motherboard
Motherboard: ASUS Prime Z270-P
CPU: i7-7700k w/ HT enabled
RAM: 8GB (2x4GB) DDR4-2133
Risers: EXPLOMOS v008S: https://www.amazon.com/EXPLOMOS-Graphics-Extension-Ethereum-Capacitors/dp/B074Z754LT/ref=sr_1_fkmr0_1?s=electronics&ie=UTF8&qid=1538851990&sr=1-1-fkmr0&keywords=explomos+v8+riser
M.2 PCIe adapter: https://www.amazon.com/EXPLOMOS-NGFF-Adapter-Power-Cable/dp/B074Z5YKXJ/ref=sr_1_1_sspa?s=electronics&ie=UTF8&qid=1538852725&sr=1-1-spons&keywords=m.2+to+pcie&psc=1
GPUs: 6x 1080ti + 1x 1060
Fans: 6x Noctua iPPC-2000
SSD: cheapo 120 GB drive. Use whatever you want.

Pics:

[build photos]

A few things to note.
1. You can use a normal PSU here, but there are a few reasons I did not. First, price: when I first put this together, it was cheaper to go this route than buying a quality 1600 W PSU. Second, space: a high-power PSU would not fit in this case while retaining the center fan wall, and I didn't want to give that up. Third, I needed a lot of PCIe power connections. Even with some 8-pin -> 2x 8-pin splitters, I'm using 13x full-run PCIe power connections; not many PSUs have that many, and none that I know of have them on individual runs. I think the EVGA 1600 W has 9x PCIe that are doubled on the ends. (See the rough power budget sketch after these notes.)

2. Don't forget about your M.2 interface! If you have a newer board with an M.2 slot that runs on PCIe and not just SATA, it's electrically no different from a normal PCIe slot, and you can adapt a GPU to it!

3. You could also use an open-air setup with the same basic components I have used, but mount them on a frame that is not enclosed. You'll need to get some airflow moving around the cards with box fans or something. It will take up more space, but may be easier to deal with for wiring and maintenance.

Like this:

[photo of an open-air frame build]
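
Circling back to note 1, here's a rough back-of-envelope version of the power and connector math (stock TDPs and standard connector ratings assumed, not measured numbers):

# Rough power/connector budget for a build like this one; the numbers are assumptions.
gpus = {"GTX 1080 Ti": (6, 250), "GTX 1060": (1, 120)}  # (count, stock power limit in W)
platform_w = 150  # CPU + motherboard + SSD + fans, a rough guess

gpu_w = sum(n * w for n, w in gpus.values())
print(f"GPU draw at stock limits: {gpu_w} W")            # 1620 W
print(f"Whole-system estimate:    {gpu_w + platform_w} W")

# Each 1080 Ti wants two 8-pin plugs (up to 150 W each) plus 75 W from the riser,
# and the 1060 takes a single 6-pin: 6*2 + 1 = 13 full-run PCIe power connections.
print(f"PCIe power runs needed:   {6 * 2 + 1}")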

Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

Freewill Project Donor
Joined: 19 May 99
Posts: 766
Credit: 354,398,348
RAC: 11,693
United States
Message 1958889 - Posted: 6 Oct 2018, 20:04:04 UTC - in response to Message 1958884.  

That is some seriously impressive hardware! Thanks for sharing. What OS and SETI optimizations are you running to go with the hardware?

Roger
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1958896 - Posted: 6 Oct 2018, 20:35:57 UTC - in response to Message 1958889.  

That is some seriously impressive hardware! Thanks for sharing. What OS and SETI optimizations are you running to go with the hardware?

Roger


Ubuntu 18.04 and the CUDA special app
Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13161
Credit: 1,160,866,277
RAC: 1,873
United States
Message 1958898 - Posted: 6 Oct 2018, 20:36:42 UTC

Thanks for the post Ian. I wasn't aware of the M.2 riser solution at all. Very interesting rig for those 1080Ti's.
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
Tom M
Volunteer tester
Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 1959346 - Posted: 9 Oct 2018, 0:50:58 UTC

Thank you for the pictures. All of us running 2 to 4 GPUs now could have a "case" of GPU envy ;)


Tom
A proud member of the OFA (Old Farts Association).
Brent Norman Crowdfunding Project Donor * Special Project $75 donor * Special Project $250 donor
Volunteer tester
Joined: 1 Dec 99
Posts: 2786
Credit: 685,657,289
RAC: 835
Canada
Message 1959969 - Posted: 12 Oct 2018, 23:01:41 UTC

Browsing around, I ran into a USB 3.1 to SSD/M.2 board, which got me thinking about Steve's M.2 to PCIe x16 adapter.

There are hundreds of different USB (3.1 or 3.0) to M.2 converters for external M.2 enclosures etc., so why couldn't you adapt that to PCIe? Hmmm.

I did some searching and didn't find anything that does that directly.
Tom M
Volunteer tester
Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 1960059 - Posted: 13 Oct 2018, 13:30:56 UTC - in response to Message 1959969.  

I have run across riser adapters that will allow you to run 4 GPUs on risers from one PCIe slot.

You could take an average MB with 4 PCIe x16 slots and a couple of short slots and put... maybe 24 GPUs on it. Without a modded BIOS it might choke when you try to boot, though...

Tom
A proud member of the OFA (Old Farts Association).
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1960100 - Posted: 13 Oct 2018, 19:49:05 UTC - in response to Message 1960059.  

I have run across riser adapters that will allow you to run 4 GPUs on risers from one PCIe slot.

You could take an average MB with 4 PCIe x16 slots and a couple of short slots and put... maybe 24 GPUs on it. Without a modded BIOS it might choke when you try to boot, though...

Tom


You can use them, but you'll see performance decreases when splitting one slot across more than 2 GPUs.
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

Tom M
Volunteer tester
Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 1960968 - Posted: 19 Oct 2018, 16:35:12 UTC

@Ian&SteveC,
Congratulations on your latest RAC and position on the leaderboard with this rig!
A proud member of the OFA (Old Farts Association).
Tom M
Volunteer tester
Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 1961493 - Posted: 22 Oct 2018, 15:55:10 UTC

Have you maxed out the total number of GPUs that you can have?

If your MB had more slots available, could you get (safely, with enough air cooling) any more GPUs into the case?

Tom
A proud member of the OFA (Old Farts Association).
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1961499 - Posted: 22 Oct 2018, 16:14:09 UTC
Last modified: 22 Oct 2018, 16:37:16 UTC

I can fit one more GPU via the second M.2 slot.

I would have to squeeze the front GPUs together to fit 7 instead of 6. It can be done, but I need to move some wiring around and power is a limiting factor in my specific system. 6x 1080tis use a good bit of power already.

I’ve considered doing it and then watercooling all 7 cards. It would free up some space and get most of the heat out of the box to an external radiator, but again, power is a limiting factor right now.
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

Brett Koski
Joined: 27 Aug 16
Posts: 9
Credit: 182,944,505
RAC: 93
United States
Message 1962136 - Posted: 27 Oct 2018, 14:42:26 UTC

This thread needs more love!!!

Sorry for the HUGE pictures, I'm not sure how to re-size for forum use?
Also, if you think cable management is important, consider this your "trigger" warning!

ASUS X99-E-10G WS Motherboard
*Motherboard has 7 PCIe slots & a PCIe M.2 slot, hence the ability to run 8 GPUs simultaneously.
*When the M.2 slot is used with a GPU, this is the card that drives the monitor. Unsure why.
Intel i7-5960x - 8/16 - @ ~4.0 GHz (water cooled)
2x EVGA Titan X Hybrid GPU (Maxwell)
6x EVGA 1070SC Gaming GPU (Pascal)
EVGA 1600 T2 PSU
I forget the H2O system, RAM, and other component stats, as this was a couple of years ago.
Windows 10 and the stock SETI app.
Various Configurations (it continuously evolved):




Same rig with seven Zotac GT1030 single slot cards. This short-lived test was quite important for what, at the time, was my eventual goal for my dedicated SETI rig.


Rig as it sits now (finally back up and running!).
ASUS X99-E-10G WS Motherboard
Intel i7-5960x - 8/16 - @ ~3.8 GHz (air cooled)
4x NVIDIA Titan Xp GPU (Pascal) with EVGA 1080 Ti Hybrid water cooler kits
Linux Lubuntu 16.04 & CUDA90 app


A few notes:
The 8-GPU rig very nearly set my house on fire. If you are going to try a setup like this, make sure not only that your PSU (or two) is capable of the stress, but that your home wiring is up to the task as well. No joke, there were flames and a melted outlet; I was lucky to be home to witness the event and stop it immediately.

Air-cooled CPU and GPUs are LOUD, very loud. Not much you can do about the thermal output, but switching to a water-cooled CPU and GPUs drops the noise pollution significantly.

This brings me to the seven-GPU test. I bought this motherboard specifically for an incredibly insane idea: mount 7 GPUs on the mobo, all in a water loop, to have a ridiculous SETI rig while maintaining a small physical and audible footprint. My goal was to use SEVEN NVIDIA Titan Xp GPUs, like the four in the last picture above. These cards, when equipped with a water plate, physically become single-slot cards. To make a long story short, this is a good example of what I was after, before I went broke... HAHA:

https://rawandrendered.com/octane-render-hepta-gpu-build

That was my eventual goal, but I ran out of money, so 4x Titan Xp will have to do haha! Even if I had the money to complete my build, I would still have (as Ian&Steve C. mentioned) power delivery issues. I'll have to wait until I own a house and can make some specific modifications to the electrical system before I feel comfortable adding much to this unit. A custom water loop with an external (read: outdoor) radiator is also on the list, but that's another insane idea that is just going to have to wait.

Let's see some more crazy rigs!!!!
Brent Norman Crowdfunding Project Donor * Special Project $75 donor * Special Project $250 donor
Volunteer tester
Joined: 1 Dec 99
Posts: 2786
Credit: 685,657,289
RAC: 835
Canada
Message 1962152 - Posted: 27 Oct 2018, 16:55:50 UTC

Brett you made me look at my XP card with a comment you made.
I never thought about it before but it doesn't have the DVI port like the 1080s - which has to be cut off to turn them into single slot cards.
I'm adding a waterblock to my todo list now.

P.S. Did you get that computer from scocam or change IDs? I just seem to remember this ID showing up when his became inactive ...
Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13161
Credit: 1,160,866,277
RAC: 1,873
United States
Message 1962154 - Posted: 27 Oct 2018, 17:07:30 UTC - in response to Message 1961499.  

I can fit one more GPU via the second M.2 slot.

I would have to squeeze the front GPUs together to fit 7 instead of 6. It can be done, but I need to move some wiring around and power is a limiting factor in my specific system. 6x 1080tis use a good bit of power already.

I’ve considered doing it and then watercooling all 7 cards. It would free up some space and get most of the heat out of the box to an external radiator, but again, power is a limiting factor right now.

Two separate 20 A circuits would do it, each circuit on a different leg of the incoming 240 V house supply. Plug each power supply into its own circuit.
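
For rough numbers: a dedicated 120 V, 20 A circuit is 2,400 W, or about 1,920 W under the usual 80% continuous-load rule, so two of them give roughly 3,800 W of continuous headroom, comfortably above seven power-limited 1080 Tis plus the rest of the system.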
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
Brett Koski
Joined: 27 Aug 16
Posts: 9
Credit: 182,944,505
RAC: 93
United States
Message 1962156 - Posted: 27 Oct 2018, 17:34:57 UTC - in response to Message 1962152.  

Brett you made me look at my XP card with a comment you made.
I never thought about it before but it doesn't have the DVI port like the 1080s - which has to be cut off to turn them into single slot cards.
I'm adding a waterblock to my todo list now.

P.S. Did you get that computer from scocam or change IDs? I just seem to remember this ID showing up when his became inactive ...



Being able to go single-slot without cutting, and the ability to revert back to factory condition, are the primary reasons I went with the Titan Xp over the 1080 series. Plus, seven Titan Xps all in one case... it would have been glorious haha! (maybe some day)
I changed user names a while back. Used to be "uswg01". This main SETI rig has also gone through so many different hardware changes (mobo, HDD, SSD, GPU, CPU... etc) over the last two years it has been hard to keep track. Most of the computers in my history are in fact this one machine. I kind of wish there was a way to manually merge machines in the stats, but I also understand why we can't. The computer is all mine though, it's been one heck of a learning process getting everything to work together!
Bernie Vine
Volunteer moderator
Volunteer tester
Joined: 26 May 99
Posts: 9954
Credit: 103,452,613
RAC: 328
United Kingdom
Message 1962157 - Posted: 27 Oct 2018, 17:48:25 UTC

Sorry for the HUGE pictures, I'm not sure how to re-size for forum use?


On Imgur, open the picture; in the bottom right-hand corner you will see an "Edit image" option. Click on that and the picture will open in edit mode.

In the top right-hand corner will be the image size.

Change the first number to around 1100 and save. (Yours are currently 5024x2824, much too big.)

Due to recent forum changes large images cause real problems with some browsers.
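
If you'd rather shrink the photos locally before uploading instead of editing them on Imgur, a minimal Python/Pillow sketch does the same thing (the filenames here are only examples):

from PIL import Image

TARGET_WIDTH = 1100                           # the width suggested above
img = Image.open("rig_photo.jpg")             # example input filename
scale = TARGET_WIDTH / img.width
img = img.resize((TARGET_WIDTH, round(img.height * scale)))
img.save("rig_photo_forum.jpg", quality=90)   # resized copy for the forum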
zoom3+1=4
Volunteer tester
Joined: 30 Nov 03
Posts: 65709
Credit: 55,293,173
RAC: 49
United States
Message 1962171 - Posted: 27 Oct 2018, 21:33:21 UTC - in response to Message 1962136.  

Nice setup you have there, and you are spot on about electrical wiring. I have a bit left to do here before I can run more than 2 PCs: replace 20 outlets, add 4 new circuits with 4 new outlets, and add an A/C unit. I was told once that 18,000 BTU is what I'd need to cool this home of mine; so far I only have 10,000 BTU. Plus I need to replace 2 windows and get a 12,000 BTU A/C unit; I live in a desert. Now I just need the money; at least I have the parts. So far this home has had 5 outlets, 3 light switches, and all but the main breaker replaced. I had one outlet burn itself and 2 wires up, causing a whole circuit to go dead; got that fixed. Beware the outlets with no screws on the sides: the 20 I have to replace are like that, and they are exactly like the one that burned up.
The T1 Trust, PRR T1 Class 4-4-4-4 #5550, 1 of America's First HST's
Tom M
Volunteer tester
Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 1967016 - Posted: 25 Nov 2018, 17:28:16 UTC - in response to Message 1958884.  


So my build:
Case: Rosewill RSV-L4500, front HDD cages and all front bracketry completely removed.
GPU bracket: Spotswood drop-in bracket assembly: http://spotswoodcomputercases.com/wp/?page_id=9120
PSU1: HP common slot 1200W (900W on 120V) - powering the motherboard and 4x GPUs
PSU2: HP common slot 750W - powering 3x GPUs
2x PSU breakout boards : https://www.amazon.com/Supply-Breakout-Adapter-Support-Ethereum/dp/B078VMMV6D/ref=sr_1_8?s=electronics&ie=UTF8&qid=1538852429&sr=1-8&keywords=breakout+board
Custom PCIe 6-pin -> MB 8-pin and CPU 4-pin (to power the PicoPSU); I made this adapter myself, no source to buy it that I know of; not needed if you use a more normal PSU setup
PSU3: 120W PicoPSU for the motherboard
Motherboard: ASUS Prime Z270-P
CPU: i7-7700k w/ HT enabled
RAM: 8GB (2x4GB) DDR4-2133
Risers: EXPLOMOS v008S: https://www.amazon.com/EXPLOMOS-Graphics-Extension-Ethereum-Capacitors/dp/B074Z754LT/ref=sr_1_fkmr0_1?s=electronics&ie=UTF8&qid=1538851990&sr=1-1-fkmr0&keywords=explomos+v8+riser
M.2 PCIe adapter: https://www.amazon.com/EXPLOMOS-NGFF-Adapter-Power-Cable/dp/B074Z5YKXJ/ref=sr_1_1_sspa?s=electronics&ie=UTF8&qid=1538852725&sr=1-1-spons&keywords=m.2+to+pcie&psc=1
GPUs: 6x 1080ti + 1x 1060
Fans: 6x Noctua iPPC-2000
SSD: cheapo 120 GB drive. Use whatever you want.


Ian&SteveC,
I have a couple of questions about the components of your parts list.

If I am understanding it right, your riser cards power their GPU slots directly from a PSU rather than drawing off the motherboard PCIe slots?
I think you are using a PSU adapter to provide more plugs for the GPU modular cables?
And you bought extra modular PSU cables to run power from the modular PSU to the riser boards?

As long as you are running 8 or fewer GPUs, pretty much any motherboard/CPU should work? I am thinking of an E5-2670 v1 and a cheap MB that I have.

Do you have a recommended PCIe splitter card? The MB I have only has 4 slots or so, and it isn't modern enough to have any of those newer M.2 storage slots.

Conversations in another thread got me wondering about re-deploying a system that I have, since I don't want to be handicapped by a 1050/1051(?) miner MB CPU socket. I like my cheap high core counts :)

Thank you,
Tom
A proud member of the OFA (Old Farts Association).
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1967031 - Posted: 25 Nov 2018, 19:07:46 UTC - in response to Message 1967016.  
Last modified: 25 Nov 2018, 19:11:42 UTC

There's more to it than just having enough PCIe lanes available. Some motherboards just can't handle a high number of GPUs and will fail to POST; the BIOS doesn't have the proper resources or settings to address all of the GPUs' VRAM. From what reading I've done, the "Above 4G Decoding" option addresses this, and many older motherboards and server motherboards don't have this option and likely won't work with more than about 4 GPUs or so. You just have to try and see where the limit is for your setup. My server board housing my 2690 v1 chips won't do more than 4 GPUs; some other server boards I have won't do more than 3.
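
If you want to check whether the board is actually mapping the cards above the 4 GB boundary once that option is on, a rough Linux check is to look at the GPU BAR addresses; a small sketch (assumes lspci is available, 10de is the NVIDIA vendor ID):

import re
import subprocess

# List NVIDIA devices verbosely and flag any memory BAR mapped above 4 GiB,
# which only happens when "Above 4G Decoding" is enabled and working.
out = subprocess.run(["lspci", "-v", "-d", "10de:"],
                     capture_output=True, text=True, check=True).stdout
for line in out.splitlines():
    m = re.search(r"Memory at ([0-9a-f]+) ", line)
    if m:
        addr = int(m.group(1), 16)
        print(f"{line.strip()}  ->  {'above' if addr >= 1 << 32 else 'below'} 4G")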

And yes, my risers are powered from the PSU directly and not from the motherboard; the only connection to the motherboard is the PCIe data signal wires.

On this system, I am using HP common slot server power supplies: one 900 W (rated for 1200 W, but only on 200+ V; 110 V puts it down to 900 W) and one 750 W.
The 900 W PSU is powering the motherboard, 3x 1080 Tis, and a 1060.
The 750 W PSU is powering the other 3x 1080 Tis.

(The 1080 Tis are also power limited to 200 W from their 250 W default. I do this to reduce strain on the circuit as well as to improve power efficiency. There wasn't much of a performance hit from doing this, but there was some.)
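
For reference, the cap can be applied per card on Linux with nvidia-smi; a small sketch that limits every detected GPU (needs root, and the limit resets at reboot unless you reapply it):

import subprocess

TARGET_W = 200  # the limit mentioned above; the 1080 Ti default is 250 W

# Count the GPUs, then apply the power limit to each one via nvidia-smi -pl.
count = int(subprocess.run(
    ["nvidia-smi", "--query-gpu=count", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.splitlines()[0])
for i in range(count):
    subprocess.run(["nvidia-smi", "-i", str(i), "-pl", str(TARGET_W)], check=True)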

Because HP common slot PSUs are cheap and plentiful, many enterprising Chinese companies developed PCB adapter boards that adapt the HP server PSUs to standard 6-pin 12 V GPU outputs for the purposes of GPU crypto mining. These will NOT power a computer outright, as they only output 12 V, and a normal motherboard needs a whole host of different voltages (+5 V, +3.3 V, +/-12 V, etc.). What I have done to get around this is use a PicoPSU, which takes the 12 V output from the HP PSU and converts it to all the other voltages needed by the motherboard. I had to custom wire this myself; no adapters are available for this kind of thing. I also custom wired the 8-pin CPU power connection from the GPU power plugs on the HP breakout board. I wouldn't recommend you try this yourself, though. Just stick to a normal PSU and make it easier on yourself. With the dual PSUs, adapter boards, cables, and PicoPSU, it came out to be not much cheaper than just using a high-wattage PSU. I would recommend you get something like the EVGA 1000+ W T2 or P2 model power supplies.

The only real advantage to my setup is having many more GPU PCIe power cables than a normal PSU. Something like the EVGA 1600 W PSU has 9x VGA cables, but my setup can provide 18 connections (2 of which are being used to power the CPU/motherboard).
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

Tom M
Volunteer tester
Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 1967113 - Posted: 25 Nov 2018, 21:53:00 UTC - in response to Message 1967031.  

Thank you for clearing that up.

If/when things fall out in a way that makes it possible, I may start tinkering with that box to see if it can even boot that many GPUs. If it can, maybe I will get serious and set up a couple of sawhorses and a sheet of wood so I can see what I am doing :) And probably buy some more GTX 1060 3GBs; their used price keeps going down :)

Tom
A proud member of the OFA (Old Farts Association).