Building a 32 thread Xeon system doesn't need to cost a lot

HAL9000
Volunteer tester
Message 1825304 - Posted: 19 Oct 2016, 3:05:37 UTC

Has anyone seen any good deals on dual LGA2011 motherboards recently for the Xeon E5-2670/E5-2660s? The CPUs are still going for very little, but the MBs seem to be really hard to come by.
With my recent change in job and increase in funds, I was hoping to put something along these lines together for my home VMware box.

Cruncher-American
Message 1825321 - Posted: 19 Oct 2016, 4:26:12 UTC - in response to Message 1825304.  
Last modified: 19 Oct 2016, 4:27:41 UTC

Has anyone seen any good deals on dual LGA2011 motherboards recently for the Xeon E5-2670/E5-2660s? The CPUs are still going for very little, but the MBs seem to be really hard to come by.
With my recent change in job and increase in funds, I was hoping to put something along these lines together for my home VMware box.


Here you go:

Intel MB, dual E5-2670s and 128GB RAM, under $500 (note that the 2670s have gone up about $20 each in the last 2-3 months on eBay):

http://www.natex.us/category-s/1885.htm?searching=Y&sort=5&cat=1885&show=4&page=1

HAL9000
Volunteer tester
Message 1827333 - Posted: 29 Oct 2016, 15:10:19 UTC - in response to Message 1825321.  

Has anyone seen any good deals on dual LGA2011 motherboards recently for the Xeon E5-2670/E5-2660s? The CPUs are still going for very little, but the MBs seem to be really hard to come by.
With my recent change in job and increase in funds, I was hoping to put something along these lines together for my home VMware box.


Here you go:

Intel MB, dual E5-2670s and 128GB RAM, under $500 (note that the 2670s have gone up about $20 each in the last 2-3 months on eBay):

http://www.natex.us/category-s/1885.htm?searching=Y&sort=5&cat=1885&show=4&page=1

The $175 for the bare board is pretty much what I was hoping to spend on a MB, but their package is a pretty good deal with the CPUs & 128GB. They have a 64GB package as well, but it is only $23 less.
It also looks like it will work with the E5-2670 V2 chips I managed to get really cheaply at a local place.

I was already looking at a pair of Noctua NH-D9DX i4 3U coolers. I'll just have to make sure there is clearance for them once the MB arrives.

Cruncher-American
Message 1827633 - Posted: 31 Oct 2016, 2:10:38 UTC - in response to Message 1827333.  

On my E5-2670 dualie, using a low-end Asrock board (EP2C602 - only $300 new), I use a pair of Corsair H50s for cooling, because I had them lying around. They keep the CPUs around 60 C 24/7.

HAL9000
Volunteer tester
Message 1827638 - Posted: 31 Oct 2016, 3:03:17 UTC - in response to Message 1827633.  

On my E5-2670 dualie, using a low-end Asrock board (EP2C602 - only $300 new), I use a pair of Corsair H50s for cooling, because I had them lying around. They keep the CPUs around 60 C 24/7.

I have also considered using all-in-one water coolers. I'm going to be putting the systems in a Thermaltake Core X9 chassis, so I shouldn't have to worry about having enough room in the case for proper CPU cooling. The clearances I have around the CPUs and memory will really be the deciding factor on what kind of cooling I end up using.

Cruncher-American
Message 1827792 - Posted: 31 Oct 2016, 23:48:49 UTC - in response to Message 1827638.  

I didn't take any chances with size; I have a Xigmatek Elysium *monster* case. I also have a black Corsair Graphite 780T for my next dualie (if I can get the $$$ for the parts, that is...).

Grant (SSSF)
Volunteer tester
Message 1827830 - Posted: 1 Nov 2016, 5:22:33 UTC - in response to Message 1827792.  

I like Al's method - leave them naked.

HAL9000
Volunteer tester
Message 1827914 - Posted: 2 Nov 2016, 2:47:02 UTC - in response to Message 1827830.  
Last modified: 2 Nov 2016, 2:48:21 UTC

I like Al's method - leave them naked.

Yeah, when you have a load of systems, having them all open is kind of neat. In the classic days I had metal wire racks that I used to hold systems.
At that point I was using something like S370 Celeron 400MHz systems zip-tied back to back, with one PSU for each of the two MBs, then lined up each pair vertically between upper and lower shelves. I had a few pictures of the setup at one point, but I think they may have been lost to time. :(
Currently only my dual Xeon X5470 is open, but it really needs to be mounted to at least a MB tray for the coolers to make correct contact with the CPUs, due to the nature of the cooling design Intel implemented in that generation. Otherwise it looks alright to me.

Grant (SSSF)
Volunteer tester
Message 1827934 - Posted: 2 Nov 2016, 4:44:42 UTC - in response to Message 1827914.  

Otherwise it looks alright to me.

I'm thinking one big 400mm fan on a frame sitting over the top of it all would be just about right.


With the cooling issues I have here, there have been many occasions I've considered getting a fan from a car radiator & running it on 9V or so to keep the noise down to a dull roar & putting it in a frame above the system (both of my systems are in tower cases, but lying on their sides).

Al
Message 1828294 - Posted: 4 Nov 2016, 13:43:18 UTC - in response to Message 1827914.  

I like Al's method - leave them naked.

Yeah, when you have a load of systems, having them all open is kind of neat. In the classic days I had metal wire racks that I used to hold systems.
At that point I was using something like S370 Celeron 400MHz systems zip-tied back to back, with one PSU for each of the two MBs, then lined up each pair vertically between upper and lower shelves. I had a few pictures of the setup at one point, but I think they may have been lost to time. :(
Currently only my dual Xeon X5470 is open, but it really needs to be mounted to at least a MB tray for the coolers to make correct contact with the CPUs, due to the nature of the cooling design Intel implemented in that generation. Otherwise it looks alright to me.

Yep, I really like the open, free and breezy deal, and funny you should mention that: I am going to be setting up racks similar to what you described over the winter, and it is going to be an interesting project. Right now I'm trying to figure out how many boards I can power off of one PSU; two is easy, more is challenging, but I have plenty of time to figure it all out. :-) Like you, my challenge right now is the motherboard trays for them, but I am possibly getting a machine which might be able to solve that problem for me, and if it works out, possibly for others as well down the road. As usual, too many projects and too little time.
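
Back-of-the-envelope, the PSU question works out something like the sketch below - every wattage in it is a guess rather than something I've measured, so treat it as arithmetic only:

# Rough PSU budget sketch (Python). All wattages are assumptions, not
# measurements; substitute real figures for your own boards and supply.

CPU_TDP_W = 115          # Intel's rated TDP for one E5-2670
BOARD_OVERHEAD_W = 60    # guessed allowance for board, RAM, fans and drives
PSU_RATED_W = 850        # hypothetical PSU rating
PSU_HEADROOM = 0.80      # stay at or below ~80% of the rated output

board_draw = 2 * CPU_TDP_W + BOARD_OVERHEAD_W
usable_watts = PSU_RATED_W * PSU_HEADROOM
boards_per_psu = int(usable_watts // board_draw)

print(f"Estimated draw per dual-CPU board: {board_draw} W")
print(f"Usable PSU capacity:               {usable_watts:.0f} W")
print(f"Boards per PSU (rough):            {boards_per_psu}")

With those guessed numbers it comes out to about two boards per supply, which fits the "two is easy, more is challenging" experience.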

Cruncher-American
Message 1828325 - Posted: 4 Nov 2016, 16:05:31 UTC

I saw this on eBay and bought it:

Brand New Asus Z9PE-D16 Dual LGA 2011/Socket R Server Motherboard

<$160 including shipping (bare board, not even an I/O shield). The seller appears to be relatively new (feedback of only 6), selling server boards cheap; I looked at his feedback and decided to take a chance. Hope it's the dualie board I have been lusting after...

Al
Message 1828352 - Posted: 4 Nov 2016, 21:49:05 UTC - in response to Message 1828325.  

Nice! Let us know how it goes when you get it. Any thoughts on the Procs and memory you are going to be putting into it?

Cruncher-American
Message 1828362 - Posted: 4 Nov 2016, 23:27:52 UTC - in response to Message 1828352.  

Nice! Let us know how it goes when you get it. Any thoughts on the Procs and memory you are going to be putting into it?


I have a pair of E5-2670s (v1) and a bunch of RAM to play with. It won't go online for a while, but I am aiming to upgrade i7-3870-pc to a dualie. I had one CPU lying around and one is currently in i7, along with 4x4GB sticks of registered ECC 1333 RAM left over from a dual X5675 Z800 MB machine. It will have the 2 x 980s from i7, plus 1 or 2 750 Tis (if they will fit). I may have to buy some more RAM, but maybe not, as my current dualie (big32) is using only 4GB according to Task Manager.

We shall see!!!!

HAL9000
Volunteer tester
Message 1828363 - Posted: 4 Nov 2016, 23:29:18 UTC - in response to Message 1828294.  

I like Al's method - leave them naked.

Yeah, when you have a load of systems, having them all open is kind of neat. In the classic days I had metal wire racks that I used to hold systems.
At that point I was using something like S370 Celeron 400MHz systems zip-tied back to back, with one PSU for each of the two MBs, then lined up each pair vertically between upper and lower shelves. I had a few pictures of the setup at one point, but I think they may have been lost to time. :(
Currently only my dual Xeon X5470 is open, but it really needs to be mounted to at least a MB tray for the coolers to make correct contact with the CPUs, due to the nature of the cooling design Intel implemented in that generation. Otherwise it looks alright to me.

Yep, I really like the open, free and breezy deal, and funny you should mention that: I am going to be setting up racks similar to what you described over the winter, and it is going to be an interesting project. Right now I'm trying to figure out how many boards I can power off of one PSU; two is easy, more is challenging, but I have plenty of time to figure it all out. :-) Like you, my challenge right now is the motherboard trays for them, but I am possibly getting a machine which might be able to solve that problem for me, and if it works out, possibly for others as well down the road. As usual, too many projects and too little time.

Yeah, those trays can be a bugger to find. I imagine the easiest thing for me to do at this point would be to use Masonite, MDF, or an old case side panel and then attach all of the standoffs I need. I've been thinking over designs for either a few more dual LGA2011 systems or maybe a small group of mini-ITX builds next.

In my back-to-back MB setup I had placed a sheet of nonconductive bubble wrap between the boards before applying the zip ties. It left enough room that each MB would sit on either side of the wires in the rack. With MB trays you will probably have a bit more of a solid foundation to work with, so that may not be an issue for you. My original plan for the setup was to have a box built so I could slide the MBs in like cards, but I was the one that would have had to build it at work, so I went a slightly... easier route.

It looks like the E5-2665 is still quite a bargain on eBay right now, despite the uptick in E5-2670 pricing.

HAL9000
Volunteer tester
Message 1828634 - Posted: 5 Nov 2016, 19:26:38 UTC

I got the last bits to fully spool up the S2600CP2J with E5-2670s today. After a quick BIOS update to add the turbo option, it's cooking 32 AVX CPU tasks at 3.0GHz. The Noctua coolers seem to be keeping the CPU temps below 50°C at the moment as well. A few hundred tasks should tell how it is going to settle in long term.

Cruncher-American
Message 1828648 - Posted: 5 Nov 2016, 20:06:36 UTC - in response to Message 1828634.  

I got the last bits to fully spool up the S2600CP2J with E5-2670s today. After a quick BIOS update to add the turbo option, it's cooking 32 AVX CPU tasks at 3.0GHz. The Noctua coolers seem to be keeping the CPU temps below 50°C at the moment as well. A few hundred tasks should tell how it is going to settle in long term.



If you have the time, please try it w/o HT, too. In this case, my dual E5-2670s seemed to do about the same amount of work with or w/o HT. My hypothesis is that all threads are running the same software, so whatever thread blocking occurs because of HT is at the max (?). Or maybe I am delusional. In any event, it would be nice to have a second data point. Thanks!

HAL9000
Volunteer tester
Message 1828656 - Posted: 5 Nov 2016, 20:32:23 UTC - in response to Message 1828648.  

I got the last bits to fully spool up the S2600CP2J with E5-2670s today. After a quick BIOS update to add the turbo option, it's cooking 32 AVX CPU tasks at 3.0GHz. The Noctua coolers seem to be keeping the CPU temps below 50°C at the moment as well. A few hundred tasks should tell how it is going to settle in long term.



If you have the time, please try it w/o HT, too. In this case, my dual E5-2670s seemed to do about the same amount of work with or w/o HT. My hypothesis is that all threads are running the same software, so whatever thread blocking occurs because of HT is at the max (?). Or maybe I am delusional. In any event, it would be nice to have a second data point. Thanks!

I am planning to run several different configurations to see how it responds. I may even see how NUMA enable/disable plays out again. Last time I tested NUMA it made no noticeable difference in SETI@home processing.

One thing that did help the dual E5645 I was running at one point was reducing the number of threads from 24 to 20 when it was loaded with VLARs. There was a significant difference in performance and run times. It could have been memory saturation or something similar.
Related is what I found running a bunch of i7-860 machines.
These two configurations would produce about the same amount of daily work:
i7-860, DDR3-1333, HT on, 8 SETI@home threads
i7-860, DDR3-1600, HT on, 4 SETI@home threads
I also tried this config, but it tended to produce slightly less:
i7-860, DDR3-1600, HT off, 4 SETI@home threads
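
For anyone wanting to repeat that kind of comparison, the arithmetic is just daily throughput from thread count and average run time. A quick Python sketch - the per-task times below are made-up placeholders, not my measured numbers:

# Daily-throughput arithmetic for comparing thread-count configurations.
# The run times are placeholders, NOT measured results; plug in your own
# averages from the BOINC task list.

SECONDS_PER_DAY = 24 * 60 * 60

configs = {
    # label: (tasks running at once, assumed average seconds per task)
    "HT on, 8 threads":  (8, 14400),
    "HT on, 4 threads":  (4,  7200),
    "HT off, 4 threads": (4,  7400),
}

for label, (threads, secs_per_task) in configs.items():
    tasks_per_day = threads * SECONDS_PER_DAY / secs_per_task
    print(f"{label:>18}: ~{tasks_per_day:.0f} tasks/day")

Plug in the actual averages from each configuration and the better setup falls out of the comparison.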

Depending on how much work it can do I may spring for 2 or 3 more, but I might need to try an E5-2670 v2 setup before I make that call.

kittyman
Volunteer tester
Message 1828658 - Posted: 5 Nov 2016, 20:34:09 UTC

Many years ago now, but my tests with HT at the time indicated a few percent better performance with HT disabled.

YMMV.
"Freedom is just Chaos, with better lighting." Alan Dean Foster


Al
Message 1828878 - Posted: 6 Nov 2016, 15:02:06 UTC - in response to Message 1828634.  

Hey, gratz on getting it up and running, and I like my Noctua coolers too, they make pretty good stuff. That should make a significant difference in your RAC... ;-) Are you going to hook up any GPUs to it, and if so, which ones?

HAL9000
Volunteer tester
Message 1828889 - Posted: 6 Nov 2016, 16:44:43 UTC - in response to Message 1828878.  

Hey, gratz on getting it up and running, and I like my Noctua coolers too, they make pretty good stuff. That should make a significant difference in your RAC... ;-) Are you going to hook up any GPUs to it, and if so, which ones?

I ended up going with the slightly larger Noctua NH-U12DX i4 coolers once I realized how much room I really had in the case.
http://i.imgur.com/ZMqgcrB.jpg

I could easily pop two GPUs into this system, but I don't currently have any plans to do anything crazy in regards to GPUs. Maybe a pair of RX 470s or 1050 Tis that won't make a bunch of noise.
From how 32 tasks are running, the math seems to indicate that it should be able to churn through more MB work a day than my R9 390X, while using the same or less power.
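
For what it's worth, that comparison is just work per day and work per watt. A rough Python sketch - every figure in it is a placeholder for illustration, not a measured run time or wall-power number:

# Sketch of the CPU-host vs GPU comparison: tasks per day and tasks per watt.
# All figures below are assumptions for illustration, not measurements.

SECONDS_PER_DAY = 24 * 60 * 60

def tasks_per_day(parallel_tasks, seconds_per_task):
    # Daily MB task throughput for a host crunching tasks in parallel.
    return parallel_tasks * SECONDS_PER_DAY / seconds_per_task

cpu_tasks = tasks_per_day(32, 14000)   # 32 CPU threads, assumed ~3.9 h per task
gpu_tasks = tasks_per_day(1, 600)      # one GPU task at a time, assumed ~10 min each
cpu_watts = 300                        # assumed draw of the dual-Xeon box
gpu_watts = 300                        # assumed draw attributable to the R9 390X

print(f"CPU host: {cpu_tasks:.0f} tasks/day, {cpu_tasks / cpu_watts:.2f} tasks/day per watt")
print(f"GPU host: {gpu_tasks:.0f} tasks/day, {gpu_tasks / gpu_watts:.2f} tasks/day per watt")

Swap in real averages from the task list and a measured wall reading from a power meter and the comparison is straightforward.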