Multi-GPU hardware setups: My method (feel free to share your own!)

Message boards : Number crunching : Multi-GPU hardware setups: My method (feel free to share your own!)

Profile IntenseGuy

Joined: 25 Sep 00
Posts: 190
Credit: 23,498,825
RAC: 9
United States
Message 1986786 - Posted: 23 Mar 2019, 17:50:18 UTC - in response to Message 1986753.  

The space between GPUs 9 and 10 IS a toaster!!!
ID: 1986786
Profile Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13164
Credit: 1,160,866,277
RAC: 1,873
United States
Message 1986799 - Posted: 23 Mar 2019, 19:28:18 UTC

That is a sweet piece of engineering Ian. Congratulations.
Seti@Home classic workunits: 20,676 CPU time: 74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 1986799
Profile bloodrain
Volunteer tester
Joined: 8 Dec 08
Posts: 231
Credit: 28,112,547
RAC: 1
Antarctica
Message 1986857 - Posted: 24 Mar 2019, 4:35:33 UTC

What was the heat overall that the cards were running at when you did the test?
Also, I will finally do a proper multi-card BOINC machine with a second GPU: I got an RX 460 today for 50 bucks; the seller listed it wrong, so they kept lowering the price.
This will go with the current RX 580 I have.
ID: 1986857
Profile Wiggo
Joined: 24 Jan 00
Posts: 36814
Credit: 261,360,520
RAC: 489
Australia
Message 1986861 - Posted: 24 Mar 2019, 5:35:34 UTC - in response to Message 1986857.  

What was the heat overall that the cards were running at when you did the test?
Also, I will finally do a proper multi-card BOINC machine with a second GPU: I got an RX 460 today for 50 bucks; the seller listed it wrong, so they kept lowering the price.
This will go with the current RX 580 I have.
Sorry Bloodrain, but that is a Linux/CUDA combo, and AMD/ATi cards don't support Nvidia's CUDA to start with, so you're out of luck there until you have the proper OS/hardware combo to use that special Linux CUDA app. ;-)

Cheers.
ID: 1986861
Profile bloodrain
Volunteer tester
Joined: 8 Dec 08
Posts: 231
Credit: 28,112,547
RAC: 1
Antarctica
Message 1986862 - Posted: 24 Mar 2019, 5:44:33 UTC - in response to Message 1986861.  

I know that. I was talking about the heat from your test bench setup in general.
ID: 1986862
marmot
Joined: 15 May 99
Posts: 144
Credit: 1,220,664
RAC: 0
United States
Message 1987039 - Posted: 25 Mar 2019, 15:10:38 UTC - in response to Message 1986737.  

now with 10 GPUs :)



This system has now passed 1,000,000 RAC :))))

I believe you are qualified to win the SETI Toaster. Nice work.

I imagine this baby pushes around 2 kW from the AC outlet when running at full throttle!



For real, how much power is it drawing?
ID: 1987039
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1987058 - Posted: 25 Mar 2019, 16:04:06 UTC - in response to Message 1987039.  

I wish I could give you a real number, but I don't have a watt meter capable of measuring 240 V through a C14 connector.

The best I can do is estimate:
all 1080 Tis are power limited to 200 W max: x6 = 1200 W
all 2070s use about 175 W max: x3 = 525 W
the CPUs are 115 W TDP, running at about 80% load: x2 ≈ 200 W
let's say another 50 W, to be conservative, for MB/RAM/SSDs/fans/etc.

So that's looking at 1975 W peak power. In reality it's less, because the rig isn't always running at peak: jobs start and stop, and there are a few seconds of downtime on each card between WU jobs.

I'd say probably 1700-1800 W-ish?
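For anyone who wants to play with the numbers, the back-of-envelope arithmetic above can be sketched in a few lines. Note the wattage figures are just the estimates from this post, not measurements:

```python
# Rough peak-power estimate for the rig, using the figures quoted above.
# All wattages are ballpark estimates from the post, not measured values.
components = {
    "GTX 1080 Ti x6, power limited to 200 W": 6 * 200,  # 1200 W
    "RTX 2070 x3, ~175 W max": 3 * 175,                 # 525 W
    "2x CPU, 115 W TDP at ~80% load": 200,              # ~200 W combined
    "MB/RAM/SSDs/fans/etc. (conservative)": 50,         # 50 W
}

peak_watts = sum(components.values())
print(f"Theoretical peak draw: {peak_watts} W")  # 1975 W
```

Swap in your own card counts and power limits to size a PSU for a similar build.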
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 1987058
Profile bloodrain
Volunteer tester
Joined: 8 Dec 08
Posts: 231
Credit: 28,112,547
RAC: 1
Antarctica
Message 1987133 - Posted: 25 Mar 2019, 22:59:55 UTC - in response to Message 1987058.  

Give or take on that, seeing there are small power differences between cards.
I'm with Ian on that.
ID: 1987133
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1988162 - Posted: 31 Mar 2019, 16:13:08 UTC - in response to Message 1987058.  
Last modified: 31 Mar 2019, 16:36:54 UTC

I got some more accurate numbers. I logged the GPU power draw via nvidia-smi, then took the average for each GPU. I sampled each GPU every 5 seconds for a period of 20 minutes; that's enough to get data for about 20 WUs on each GPU, with about 12 samples over the course of each WU.

all GPUs are power limited to 200W

Here's the command I used for logging, in case anyone else wants to try it (just edit the location of the log file and whatever parameters you want to log):
timeout 1200s nvidia-smi --query-gpu=timestamp,index,power.draw,name --format=csv -l 5 -f /home/rootx/powerlog.csv


You can see what options are available to query here: https://briot-jerome.developpez.com/fichiers/blog/nvidia-smi/list.txt
or run in the terminal:
nvidia-smi --help-query-gpu


Averages
GPU0 - 161.2 W - RTX 2070
GPU1 - 175.7 W - RTX 2070
GPU2 - 157.4 W - RTX 2070
GPU3 - 170.6 W - RTX 2070
GPU4 - 178.7 W - GTX 1080ti
GPU5 - 161.9 W - RTX 2070
GPU6 - 179.3 W - GTX 1080ti
GPU7 - 178.4 W - GTX 1080ti
GPU8 - 178.7 W - GTX 1080ti
GPU9 - 180.4 W - GTX 1080ti

Total GPU power average = 1722W


RTX 2070 overall average = 165W
GTX 1080ti overall average = 179W

Just for fun, the peak values. These are good for making sure you have enough theoretical PSU headroom.

Max Power
GPU0 - 186.9 W - RTX 2070
GPU1 - 203.9 W - RTX 2070
GPU2 - 181.0 W - RTX 2070
GPU3 - 196.3 W - RTX 2070
GPU4 - 219.6 W - GTX 1080ti
GPU5 - 184.5 W - RTX 2070
GPU6 - 222.4 W - GTX 1080ti
GPU7 - 219.7 W - GTX 1080ti
GPU8 - 213.3 W - GTX 1080ti
GPU9 - 223.2 W - GTX 1080ti

Total GPU power peak = 2050W
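If anyone wants to reproduce these averages and peaks from their own log, here's one way to crunch the CSV in Python. It assumes the log was produced by the nvidia-smi command above (columns: timestamp, index, power.draw [W], name); the small demo log below is made-up data, just to show the shape of the output:

```python
import csv
from collections import defaultdict

def summarize_power_log(lines):
    """Per-GPU average and peak power from an nvidia-smi CSV power log.

    Assumes columns: timestamp, index, power.draw [W], name, i.e. a log
    produced with --query-gpu=timestamp,index,power.draw,name --format=csv.
    """
    samples = defaultdict(list)   # GPU index -> list of wattage readings
    names = {}                    # GPU index -> card name
    reader = csv.reader(lines, skipinitialspace=True)
    next(reader)                  # skip the header row
    for row in reader:
        if len(row) < 4:
            continue              # skip blank or malformed lines
        idx = int(row[1])
        watts = float(row[2].replace(" W", ""))  # "161.20 W" -> 161.2
        samples[idx].append(watts)
        names[idx] = row[3]
    return {
        idx: {"name": names[idx],
              "avg": sum(vals) / len(vals),
              "peak": max(vals)}
        for idx, vals in samples.items()
    }

# Tiny made-up demo log (not real measurements), just to show the shape:
demo_log = [
    "timestamp, index, power.draw [W], name",
    "2019/03/31 16:13:08, 0, 161.20 W, GeForce RTX 2070",
    "2019/03/31 16:13:13, 0, 186.90 W, GeForce RTX 2070",
    "2019/03/31 16:13:08, 4, 178.70 W, GeForce GTX 1080 Ti",
    "2019/03/31 16:13:13, 4, 219.60 W, GeForce GTX 1080 Ti",
]
stats = summarize_power_log(demo_log)
total_avg = sum(s["avg"] for s in stats.values())
for idx, s in sorted(stats.items()):
    print(f"GPU{idx} - {s['avg']:.1f} W avg, {s['peak']:.1f} W peak - {s['name']}")
```

To use it on a real log, pass the file instead of the demo list, e.g. `summarize_power_log(open("powerlog.csv"))`, then sum the per-GPU averages to get the total GPU power figure.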


So the RTX 2070s are saving me almost 10% in power, maybe a little more if I back them off their 200 W setting down to their stock 175-185 W.
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 1988162
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1988174 - Posted: 31 Mar 2019, 17:23:13 UTC - in response to Message 1988162.  

I checked the 1080 Tis in another system with the same logging; they are still at the 250 W power limit (and watercooled):

Average = 190W
Peak = 235W
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 1988174
Profile Tom M
Volunteer tester

Joined: 28 Nov 02
Posts: 5126
Credit: 276,046,078
RAC: 462
Message 1988240 - Posted: 1 Apr 2019, 2:24:16 UTC

10% of a large value is a smaller large value. So it will make a difference on your operations bill.

Tom
A proud member of the OFA (Old Farts Association).
ID: 1988240
Profile Tom M
Volunteer tester

Joined: 28 Nov 02
Posts: 5126
Credit: 276,046,078
RAC: 462
Message 1991478 - Posted: 25 Apr 2019, 21:30:45 UTC

This looks like a good thread to kick off a discussion of good cases for 6-GPU BOINC boxes as well as 7-GPU BOINC boxes.

My current cases have one or both sides off. On one, to take up less space, I have 4 GPUs perched on top of the case (the MB is about AT-ish). The other has 4 sprawled out on the table beside it (the MB is E-ATX).

I have been looking at open-air mining case reviews, and one has a nice list, but all of the cases have "hard to assemble" in the Con column of the pros/cons.

So I am looking for advice.

The last open-air mining case I got had lots of pieces, and I never did figure out how you "mount" the GPUs; there was no plate to put them on or anything.

Thank you,

Tom
A proud member of the OFA (Old Farts Association).
ID: 1991478
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1991498 - Posted: 25 Apr 2019, 23:37:14 UTC - in response to Message 1991478.  

Tom,

This is the frame I have on my Beast.

6 GPU Mining Case Rig Aluminum Stackable Preassembled Open Air Frame

-$30 including shipping
-it's easy to assemble (some of it comes pre-assembled)
-easily holds 6 GPUs (more if you double up like I have; there's enough space between them if you're using GPUs with 2-slot coolers)
-comes with standoffs to mount a normal ATX motherboard (I only needed the acrylic mount plate because the motherboard I'm using is proprietary)
-can mount 2x full-sized ATX PSUs

My current setup with 10 GPUs


Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 1991498
Profile Tom M
Volunteer tester

Joined: 28 Nov 02
Posts: 5126
Credit: 276,046,078
RAC: 462
Message 1991554 - Posted: 26 Apr 2019, 12:07:42 UTC - in response to Message 1991498.  

Tom,

This is the frame I have on my Beast.

6 GPU Mining Case Rig Aluminum Stackable Preassembled Open Air Frame

-$30 including shipping
-it's easy to assemble (some of it comes pre-assembled)
-easily holds 6 GPUs (more if you double up like I have; there's enough space between them if you're using GPUs with 2-slot coolers)
-comes with standoffs to mount a normal ATX motherboard (I only needed the acrylic mount plate because the motherboard I'm using is proprietary)
-can mount 2x full-sized ATX PSUs

My current setup with 10 GPUs



Thank you. It says it has no fans; I have 4 fans left over from my last open-air purchase. What fans did you order?

I have ordered the frame. I may go back and order another once I get done with the first one.

Tom
A proud member of the OFA (Old Farts Association).
ID: 1991554
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1991561 - Posted: 26 Apr 2019, 13:36:53 UTC - in response to Message 1991554.  

I used extra fans I had from a previous build.

Noctua iPPC-2000

But you can just as easily use cheaper fans if you’re not using as many GPUs as I am.
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 1991561
Profile Tom M
Volunteer tester

Joined: 28 Nov 02
Posts: 5126
Credit: 276,046,078
RAC: 462
Message 1992445 - Posted: 3 May 2019, 16:23:07 UTC
Last modified: 3 May 2019, 16:23:30 UTC

Finally got the transplant done. It is taking up more square footage on the table, but now it looks like a well-organized breadboard (which is probably what it was supposed to look like :)

This was phase 1. I still need to buy another fan and see if I can get the fans hung, but I am not going to stress out, because it was running without external GPU fans previously.

I ended up with a board in place because I have a bunch of short(er) GPUs rather than the full-sized ones it is designed for. The instructions talk about a way to shift the support bar, but that requires significant disassembly.

I am almost certainly going to switch the Intel box too, even if I don't upgrade the MB. I like the appearance!

Tom

PS: I think I may try the 7-GPU setup again. (Thud, the sound of the system failing to boot :)
A proud member of the OFA (Old Farts Association).
ID: 1992445
Profile Tom M
Volunteer tester

Joined: 28 Nov 02
Posts: 5126
Credit: 276,046,078
RAC: 462
Message 1993633 - Posted: 13 May 2019, 14:37:39 UTC

I am pleased to note that my AMD box with 9 GPUs has been running without crashing since Saturday afternoon. My previous experience running a 9-GPU setup on my Intel box suggests that if it runs more than 3 days non-stop, it MIGHT be stable.

I think I will shoot for 5 days. If it is still running then, I will start transplanting my high-end GPUs from my Intel box into the AMD system. At a minimum I'll get the AMD box to the point where it has 3 GTX 1070s, which turn out to run about 10 seconds slower than my 3 GTX 1070 Tis but at a significant cost reduction.

Tom
A proud member of the OFA (Old Farts Association).
ID: 1993633
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1993635 - Posted: 13 May 2019, 15:20:37 UTC - in response to Message 1993633.  

I am pleased to note that my AMD box with 9 GPUs has been running without crashing since Saturday afternoon. My previous experience running a 9-GPU setup on my Intel box suggests that if it runs more than 3 days non-stop, it MIGHT be stable.

I think I will shoot for 5 days. If it is still running then, I will start transplanting my high-end GPUs from my Intel box into the AMD system. At a minimum I'll get the AMD box to the point where it has 3 GTX 1070s, which turn out to run about 10 seconds slower than my 3 GTX 1070 Tis but at a significant cost reduction.

Tom


Great to hear that those new cables are holding up well so far.
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 1993635
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 1993637 - Posted: 13 May 2019, 15:30:13 UTC

My Beast has also been running smooth as butter.

8 cards at x8 PCIe Gen3 is unheard of from anyone else. There are no problems with power delivery through the PCIe slots if you set things up properly like I have, and do the research on your parts selection to get an idea of how much power each card is likely to pull from the slot. I've proven the setup to be safe and stable; it's been running like this for several months now, but I'll measure the current to the 2 cards on risers just for additional validation/verification of the power draw from the slot.

The system runs 10 cards total, but 2 cards are on risers at the moment, only because of clearance issues with the rear CPU heatsink (the last two cards sit over it, and I don't have any right-angle adapters at the moment).

I'm on the fence about whether I should leave things as is, get some right-angle adapters to put those last 2 cards on full lanes, or toss in a couple of PCIe 4-in-1 splitters to see if the system will run 16 GPUs, lol. There's no rush, being about 2x as fast as the next contender :)
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 1993637
Profile Tom M
Volunteer tester

Joined: 28 Nov 02
Posts: 5126
Credit: 276,046,078
RAC: 462
Message 1993639 - Posted: 13 May 2019, 15:52:34 UTC - in response to Message 1993637.  

Ah, but wait till I get my "unlimited budget" untracked. Then I can "dog paddle" a little faster... may give you a run for your "money"...

Nah... I haven't got the budget, being semi-retired and unemployed (because I don't want to work at a call center again right now). I think the best I could do would be to redeploy all my 1070 assets into one or the other machine and fill out any leftover slots with 1060s.

That will leave the "other" box with an "all GTX 1060 3GB" cast, but based on past experience it should stay on at least the 2nd page of the leaderboard no matter which machine it is.

Unfortunately, the AMD (TB350-BTC) motherboard only has 2 Gen3 slots; everything else is Gen2.

I really would like to see if you can exceed your PCIe "resource limit" or other headroom limit (excepting not enough PSU watts), and where it tops out. Based on my current experience I would use the high-end UGreens with 1 1/2-foot lengths; I suspect the shorter cable length makes the resources more usable.

Tom
A proud member of the OFA (Old Farts Association).
ID: 1993639
©2024 University of California
 
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.