CPUs are irrelevant to CUDA crunchers?



Message boards : Number crunching : CPUs are irrelevant to CUDA crunchers?

Author Message
nemesis
Joined: 12 Oct 99
Posts: 1408
Credit: 35,074,350
RAC: 0
Message 1111516 - Posted: 30 May 2011, 20:54:08 UTC

Okay, you're building the ultimate CUDA cruncher.
Does it even matter what CPU you put in the box?

____________

Profile SciManStev (Project donor)
Volunteer tester
Joined: 20 Jun 99
Posts: 4908
Credit: 84,430,607
RAC: 30,013
United States
Message 1111523 - Posted: 30 May 2011, 21:18:42 UTC
Last modified: 30 May 2011, 21:19:26 UTC

If you mean just CUDA crunching, it matters a little. If you mean the ultimate cruncher, it matters a lot. My i7 980 CPU is capable of 15,000 RAC by itself. It is heavily overclocked and blasts through anything. I also use my rig for much more than crunching, and having the CPU power is a real help. I can't quantify how much it contributes to my CUDA crunching, but there is some contribution. If all I am doing is crunching on the GPUs, the CPU will downclock to below 3 GHz from its normal 4.235 GHz.

Steve
____________
Warning, addicted to SETI crunching!
Crunching as a member of GPU Users Group.
GPUUG Website

Profile ML1
Volunteer tester
Joined: 25 Nov 01
Posts: 8601
Credit: 4,262,867
RAC: 1,428
United Kingdom
Message 1111528 - Posted: 30 May 2011, 21:30:12 UTC - in response to Message 1111516.

Okay, you're building the ultimate CUDA cruncher.
Does it even matter what CPU you put in the box?

More important is the speed and amount of VRAM on the GPU, and secondly the speed of the CPU's system RAM for transferring intermediate results.

If you are going to use multiple GPUs, then you'll want one physical CPU core per GPU and as many PCIe lanes as possible. Go for a motherboard that offers a minimum of PCIe x8 per GPU.

It all depends on how you balance cost against performance. When I last looked for what I wanted, systems using an AMD CPU and an nVidia GPU gave the best performance balance.
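For scale, the x8-per-GPU advice above can be put in rough numbers. A minimal sketch, assuming PCIe 2.0 at roughly 500 MB/s of usable bandwidth per lane per direction (after 8b/10b encoding overhead; real-world throughput is somewhat lower):

```python
# Rough PCIe 2.0 bandwidth per slot width. Assumption: ~500 MB/s
# usable per lane per direction after 8b/10b encoding overhead.
MB_PER_LANE = 500

def slot_bandwidth_mb(lanes):
    """Approximate one-way bandwidth in MB/s for a PCIe 2.0 slot."""
    return lanes * MB_PER_LANE

for lanes in (16, 8, 4):
    print(f"x{lanes}: ~{slot_bandwidth_mb(lanes) / 1000:.1f} GB/s")
```

On that model an x8 slot still moves around 4 GB/s each way, which is why x8 per GPU is usually enough to keep each card fed with work units.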


Happy fast crunchin',
Martin

____________
See new freedom: Mageia4
Linux Voice See & try out your OS Freedom!
The Future is what We make IT (GPLv3)

Profile Helli (Project donor)
Volunteer tester
Joined: 15 Dec 99
Posts: 705
Credit: 93,198,637
RAC: 65,043
Germany
Message 1111612 - Posted: 31 May 2011, 3:14:09 UTC - in response to Message 1111516.

Okay, you're building the ultimate CUDA cruncher.
Does it even matter what CPU you put in the box?


For me? Yes! As long as the mainboard has enough PCIe slots. :D

Helli

Profile -= Vyper =- (Project donor)
Volunteer tester
Joined: 5 Sep 99
Posts: 1098
Credit: 329,119,995
RAC: 157,142
Sweden
Message 1111658 - Posted: 31 May 2011, 6:11:03 UTC - in response to Message 1111617.


You wanna argue with me? Go ahead......and see who comes in first.
I have been at this far longer than most.

Stats don't lie.

Neither do kitties....LOL.


Hey!

Why so upset, and why try to goad people into arguing?!
You're totally right! I've noticed it too.
Better to have the equipment running stable nonstop 24/7 at a 25% overclock than running unstable at a 30% overclock.
The power usage is so much higher that it's not worth it on the power bill.
If I'd thought ten years ago that extreme overclocking wouldn't be worth it in the future because power costs so much more, I would have considered myself crazy! But here we are, and running totally full bore is not worth it, wattage-wise or stability-wise!
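The tradeoff described above can be sketched with the standard CMOS rule of thumb that dynamic power scales roughly as V² x f. The clock and voltage multipliers below are illustrative guesses, not measurements from any real card:

```python
# Toy model: CMOS dynamic power scales roughly as V^2 * f.
# The clock and voltage multipliers below are hypothetical,
# chosen only to illustrate the shape of the tradeoff.
def relative_power(freq_mult, volt_mult):
    """Power relative to stock, from the V^2 * f rule of thumb."""
    return volt_mult ** 2 * freq_mult

# Assume +25% clock needs +5% voltage, while +30% needs +15%.
for freq, volt in ((1.25, 1.05), (1.30, 1.15)):
    power = relative_power(freq, volt)
    perf_per_watt = freq / power  # throughput assumed to track clock
    print(f"+{(freq - 1) * 100:.0f}% OC: "
          f"{power:.2f}x power, {perf_per_watt:.2f}x perf/watt")
```

Because the last few percent of clock usually demand a disproportionate voltage bump, work-per-watt falls as you push toward the edge, which is the power-bill argument in numbers.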

Kind regards Vyper
____________

Addicted to SETI crunching!
Founder of GPU Users Group

Profile ML1
Volunteer tester
Joined: 25 Nov 01
Posts: 8601
Credit: 4,262,867
RAC: 1,428
United Kingdom
Message 1111681 - Posted: 31 May 2011, 9:08:44 UTC - in response to Message 1111658.
Last modified: 31 May 2011, 9:10:37 UTC

You wanna argue with me? Go ahead......and see who comes in first.
I have been at this far longer than most.

Stats don't lie.

Neither do kitties....LOL.


Hey!

Why so upset, and why try to goad people into arguing?!
You're totally right! I've noticed it too.
Better to have the equipment running stable nonstop 24/7 at a 25% overclock than running unstable at a 30% overclock.
The power usage is so much higher that it's not worth it on the power bill.
If I'd thought ten years ago that extreme overclocking wouldn't be worth it in the future because power costs so much more, I would have considered myself crazy! But here we are, and running totally full bore is not worth it, wattage-wise or stability-wise!

An 'enthusiastic' reply, born of the hard experience of becoming wise?

I agree, from my own experience.

However... the overclocking I did in the past was all on hardware far more scarce and expensive than what we have available now. Back then, overclocking had the potential for much greater impact than what we enjoy today.

Perhaps we're now at the point where present day machines are powerful enough. Greater and more reliable gains can be made by improving the software and the algorithms to run better on what we already have.

Perhaps the Lunatics are making greater gains for everyone, far above anything a few overclockers could ever achieve individually.


(Kittie contributions of food, power, fur, and donations excepted! :-) )

Happy efficient fast crunchin',
Martin
____________
See new freedom: Mageia4
Linux Voice See & try out your OS Freedom!
The Future is what We make IT (GPLv3)

nemesis
Joined: 12 Oct 99
Posts: 1408
Credit: 35,074,350
RAC: 0
Message 1111766 - Posted: 31 May 2011, 15:31:37 UTC

I notice a lot of the GPUs are factory-overclocked beyond the original specs.

The standard core clock on a GTX 460 is 675 MHz;
Gigabyte has one OC'd to 815 MHz.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125345
____________

Profile Lint trap
Joined: 30 May 03
Posts: 871
Credit: 28,080,789
RAC: 9,873
United States
Message 1111878 - Posted: 1 Jun 2011, 1:59:24 UTC - in response to Message 1111766.
Last modified: 1 Jun 2011, 2:00:16 UTC

The standard core clock on a GTX 460 is 675 MHz;
Gigabyte has one OC'd to 815 MHz.


My Gigabyte 460 was only slightly OC'd out of the box at 715/1430/1800 (core/shader/memory MHz) and has been running fine at 822/1645/2004, 24/7, for a few months.
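For scale, taking those clock figures at face value, the gains on each domain work out to roughly:

```python
# Percentage overclock per clock domain (core/shader/memory),
# using the stock and overclocked figures quoted in the post above.
stock = {"core": 715, "shader": 1430, "memory": 1800}
oc    = {"core": 822, "shader": 1645, "memory": 2004}

for domain in stock:
    gain = (oc[domain] / stock[domain] - 1) * 100
    print(f"{domain}: +{gain:.1f}%")
```

That is about a 15% bump on core and shader, and about 11% on memory, over the already factory-overclocked baseline.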

Martin

P.S. My 460th post... :)

Profile arkayn (Project donor)
Volunteer tester
Joined: 14 May 99
Posts: 3748
Credit: 48,777,915
RAC: 1,076
United States
Message 1111888 - Posted: 1 Jun 2011, 2:42:26 UTC

I have my 460 set at 800/1600/1800.
____________

Ianab
Volunteer tester
Joined: 11 Jun 08
Posts: 678
Credit: 12,776,477
RAC: 2,085
New Zealand
Message 1111937 - Posted: 1 Jun 2011, 6:19:38 UTC

I guess it has to make "some" difference.

If money is no object, then you buy the best CPU AND GPUs that you can, and get the most performance. That gets expensive though.

If you're on a budget, it's probably better to skimp on the CPU, maybe just a fast dual core, and use that to feed a couple of good GPUs.

But going too wimpy on the CPU will hurt the non-GPU part of the work unit, so matching an old Celeron with a brand-new GPU isn't ideal. Sure, it will be the best-performing Celeron out there, but it won't feed the GPU at 100%. A half-decent 3 GHz Pentium dual core or AMD equivalent will keep the GPU fed almost as well as a top-of-the-line i7 or X6. Using the money you save on the cheap CPU to buy a better GPU gives you more bang for your buck.

That's my theory anyway.
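That feeding argument can be sketched as a simple bottleneck model. All the rates below are hypothetical, chosen only to illustrate the shape of the tradeoff:

```python
# Toy bottleneck model: a work unit has a CPU-only portion and a GPU
# portion; the GPU stalls whenever the CPU can't prepare work fast
# enough. All rates are hypothetical, for illustration only.
def units_per_hour(cpu_feed_rate, gpu_rate):
    """Effective throughput is capped by the slower stage."""
    return min(cpu_feed_rate, gpu_rate)

# A mid-range dual core can already feed a fast GPU...
print(units_per_hour(cpu_feed_rate=40, gpu_rate=30))  # GPU-bound: 30
# ...but a very weak CPU leaves GPU capacity on the table.
print(units_per_hour(cpu_feed_rate=12, gpu_rate=30))  # CPU-bound: 12
```

Once the CPU's feed rate exceeds what the GPU can consume, a faster CPU buys nothing for GPU crunching, which is why the budget is better spent on the GPU side.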

Ian


Copyright © 2014 University of California