CPUs are irrelevant to CUDA crunchers?


Message boards : Number crunching : CPUs are irrelevant to CUDA crunchers?

Author Message
nemesis
Avatar
Send message
Joined: 12 Oct 99
Posts: 1408
Credit: 35,074,350
RAC: 0
Message 1111516 - Posted: 30 May 2011, 20:54:08 UTC

Okay, you're building the ultimate CUDA cruncher.
Does it even matter what CPU you put in the box?

____________

Profile SciManStev
Volunteer tester
Avatar
Send message
Joined: 20 Jun 99
Posts: 4792
Credit: 79,767,378
RAC: 35,949
United States
Message 1111523 - Posted: 30 May 2011, 21:18:42 UTC
Last modified: 30 May 2011, 21:19:26 UTC

If you mean just CUDA crunching, it matters a little. If you mean the ultimate cruncher, it matters a lot. My i7-980 CPU is capable of 15,000 RAC by itself. It is heavily overclocked, and blasts through anything. I also use my rig for much more than crunching, and having the CPU power is a real help. I can't quantify exactly how much it contributes to my CUDA crunching, but there is some contribution. If all I am doing is crunching on the GPUs, the CPU will downclock to below 3 GHz from its normal 4.235 GHz.

Steve
____________
Warning, addicted to SETI crunching!
Crunching as a member of GPU Users Group.
GPUUG Website

Profile ML1
Volunteer tester
Send message
Joined: 25 Nov 01
Posts: 8270
Credit: 4,071,566
RAC: 333
United Kingdom
Message 1111528 - Posted: 30 May 2011, 21:30:12 UTC - in response to Message 1111516.

Okay, you're building the ultimate CUDA cruncher.
Does it even matter what CPU you put in the box?

More important are the speed and amount of VRAM on the GPU, and secondly the speed of the CPU's system RAM for transferring intermediate results.

If you are going to use multiple GPUs, then you'll want one physical CPU core per GPU and as many PCIe lanes as possible. Go for a motherboard that offers at least PCIe x8 per GPU.

It all depends on how you balance cost against performance. When I last looked for what I wanted, systems using an AMD CPU and an nVidia GPU gave the best performance balance.
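The x8-per-GPU advice can be put into rough numbers. A back-of-envelope sketch, assuming PCIe 2.0's nominal ~500 MB/s per lane (the generation current at the time); the function name and loop are purely illustrative:

```python
# Rough per-GPU host-transfer bandwidth for common PCIe slot widths.
# Assumes PCIe 2.0 at ~500 MB/s per lane; real-world throughput is
# lower once protocol overhead is counted.
PCIE2_MB_PER_LANE = 500

def slot_bandwidth_mb(lanes):
    """Approximate one-direction bandwidth for a slot `lanes` wide."""
    return lanes * PCIE2_MB_PER_LANE

# An x8/x8 split still gives each of two GPUs 4 GB/s, while an
# x16/x4 split starves the second card.
for lanes in (16, 8, 4):
    print(f"x{lanes}: {slot_bandwidth_mb(lanes)} MB/s")
```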


Happy fast crunchin',
Martin

____________
See new freedom: Mageia4
Linux Voice See & try out your OS Freedom!
The Future is what We make IT (GPLv3)

Profile Helli
Volunteer tester
Avatar
Send message
Joined: 15 Dec 99
Posts: 698
Credit: 84,316,679
RAC: 73,088
Germany
Message 1111612 - Posted: 31 May 2011, 3:14:09 UTC - in response to Message 1111516.

Okay, you're building the ultimate CUDA cruncher.
Does it even matter what CPU you put in the box?


For me? Yes! As long as the mainboard has enough PCI-E slots. :D

Helli

msattler
Volunteer tester
Avatar
Send message
Joined: 9 Jul 00
Posts: 38324
Credit: 560,576,983
RAC: 656,192
United States
Message 1111617 - Posted: 31 May 2011, 3:38:39 UTC

Irrelevant?
No.

Consequential?
Yes.

The power of the CPU has a direct impact on the preprocessing of CUDA work before the GPU takes over.
It might be 15 seconds instead of 20.....for every WU the GPU is gonna undertake.

It adds up.

So, although the CPU is no longer the major player it once was, it DOES make a difference.
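That "it adds up" point is easy to put numbers on. A toy calculation, where the 15 s vs. 20 s figures come from the post above and the daily work-unit count is an invented assumption:

```python
# Daily time saved by faster per-WU CPU preprocessing.
# 15 s vs 20 s are the figures from the post; 500 WUs/day is an
# invented number purely for illustration.
slow_cpu_s = 20    # preprocessing per WU on a slower CPU
fast_cpu_s = 15    # preprocessing per WU on a faster CPU
wus_per_day = 500

saved_s = (slow_cpu_s - fast_cpu_s) * wus_per_day
print(f"saved per day: {saved_s} s ({saved_s / 60:.0f} min)")
```

At those rates the faster CPU buys back 2,500 seconds, about 42 minutes of GPU feed time, every day.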

I don't even overclock CPUs the way I used to.......it does not pay off.
It is better to have a rig run 24/7 without burping than to have it cough in the middle of the night, puking.

There's just no payback in running at the edge anymore.
OCing the GPUs is about the same.......running the F out of them until they downclock when you are not looking is a waste of resources.
I've backed that down to the point where I don't have to babysit their sorry asses all the time.
You wanna argue with me? Go ahead......and see who comes in first.
I have been at this far longer than most.

Stats don't lie.

Neither do kitties....LOL.

____________
*********************************************
Embrace your inner kitty...ya know ya wanna!

I have met a few friends in my life.
Most were cats.

Profile -= Vyper =-
Volunteer tester
Avatar
Send message
Joined: 5 Sep 99
Posts: 1039
Credit: 302,041,301
RAC: 166,997
Sweden
Message 1111658 - Posted: 31 May 2011, 6:11:03 UTC - in response to Message 1111617.


You wanna argue with me? Go ahead......and see who comes in first.
I have been at this far longer than most.

Stats don't lie.

Neither do kitties....LOL.


Hey!

Why so upset, trying to persuade people to argue?!
You're totally right! I've noticed it too.
Better to have the equipment running at a 25% overclock, stable, nonstop 24/7 than running it unstable at a 30% overclock.
The power usage is so much higher that it's not worth it, power-bill wise.
Man, if I'd even thought 10 years ago that extreme overclocking wouldn't be worth it in the future because power costs so much more, I would've considered myself crazy! But here we are, and running totally full bore is not worth it, wattage-wise or stability-wise!

Kind regards Vyper
____________

_________________________________________________________________________
Addicted to SETI crunching!
Founder of GPU Users Group

Profile ML1
Volunteer tester
Send message
Joined: 25 Nov 01
Posts: 8270
Credit: 4,071,566
RAC: 333
United Kingdom
Message 1111681 - Posted: 31 May 2011, 9:08:44 UTC - in response to Message 1111658.
Last modified: 31 May 2011, 9:10:37 UTC

You wanna argue with me? Go ahead......and see who comes in first.
I have been at this far longer than most.

Stats don't lie.

Neither do kitties....LOL.


Hey!

Why so upset, trying to persuade people to argue?!
You're totally right! I've noticed it too.
Better to have the equipment running at a 25% overclock, stable, nonstop 24/7 than running it unstable at a 30% overclock.
The power usage is so much higher that it's not worth it, power-bill wise.
Man, if I'd even thought 10 years ago that extreme overclocking wouldn't be worth it in the future because power costs so much more, I would've considered myself crazy! But here we are, and running totally full bore is not worth it, wattage-wise or stability-wise!

An 'enthusiastic' reply, all from the hard experience of becoming wise?

I too agree, from my own experience.

However... The overclocking I've done in the past was all on hardware that was far scarcer and more expensive than what we have available now. Overclocking had the potential for much greater impact then than it does now.

Perhaps we're now at the point where present-day machines are powerful enough, and greater, more reliable gains can be made by improving the software and the algorithms to run better on what we already have.

Perhaps the Lunatics are making greater gains for everyone, far above anything a few overclockers could ever achieve individually.


(Kittie contributions of food, power, fur, and donations excepted! :-) )

Happy efficient fast crunchin',
Martin
____________
See new freedom: Mageia4
Linux Voice See & try out your OS Freedom!
The Future is what We make IT (GPLv3)

nemesis
Avatar
Send message
Joined: 12 Oct 99
Posts: 1408
Credit: 35,074,350
RAC: 0
Message 1111766 - Posted: 31 May 2011, 15:31:37 UTC

I notice a lot of the GPUs are factory OC'd beyond the original specs.

The standard core clock on a GTX 460 is 675 MHz...
Gigabyte has one OC'd to 815 MHz.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125345
____________

Profile Lint trap
Send message
Joined: 30 May 03
Posts: 859
Credit: 25,826,970
RAC: 13,083
United States
Message 1111878 - Posted: 1 Jun 2011, 1:59:24 UTC - in response to Message 1111766.
Last modified: 1 Jun 2011, 2:00:16 UTC

The standard core clock on a GTX 460 is 675 MHz...
Gigabyte has one OC'd to 815 MHz.


My Gigabyte 460 was just slightly OC'd out of the box at 715/1430/1800 (core/shader/memory) and has been running fine at 822/1645/2004, 24/7, for a few months.

Martin

P.S. My 460th post. :)

Profile arkayn
Volunteer tester
Avatar
Send message
Joined: 14 May 99
Posts: 3594
Credit: 47,349,115
RAC: 1,276
United States
Message 1111888 - Posted: 1 Jun 2011, 2:42:26 UTC

I have my 460 set at 800/1600/1800.
____________

Ianab
Volunteer tester
Send message
Joined: 11 Jun 08
Posts: 651
Credit: 11,902,151
RAC: 11,342
New Zealand
Message 1111937 - Posted: 1 Jun 2011, 6:19:38 UTC

I guess it has to make "some" difference.

If money is no object, then you buy the best CPU AND GPUs that you can, and get the most performance. That gets expensive though.

If you have a budget, then it's probably better to skimp on the CPU, maybe just a fast dual core, and use that to feed a couple of good GPUs.

But going too wimpy on the CPU will affect the non-GPU part of the work unit. So matching an old Celeron CPU with a brand-new GPU isn't ideal: OK, it will be the best-performing Celeron out there, but it won't feed the GPU at 100%. A half-decent 3 GHz Pentium dual core or AMD equivalent will keep the GPU fed pretty much as well as a top-of-the-line i7 or X6. Using the $$ you save on the cheap CPU to buy a better GPU will give you more bang for your buck.
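That bang-for-buck reasoning can be sketched as a toy bottleneck model: overall throughput is capped by whichever of the CPU's feed rate and the GPU's crunch rate is lower. All the rates below are invented purely for illustration:

```python
# Toy model: system throughput (WUs/hour) is the minimum of how fast
# the CPU can feed work and how fast the GPU can crunch it.
# All numbers are invented for illustration.
def throughput(cpu_feed_rate, gpu_rate):
    return min(cpu_feed_rate, gpu_rate)

# Same hypothetical budget, split two ways:
top_cpu_mid_gpu = throughput(cpu_feed_rate=20, gpu_rate=10)
mid_cpu_good_gpu = throughput(cpu_feed_rate=15, gpu_rate=14)

print(top_cpu_mid_gpu, mid_cpu_good_gpu)  # 10 vs 14 WUs/hour
```

Once the CPU can keep the GPU saturated, extra CPU speed adds nothing to the GPU side, which is the Celeron-vs-Pentium point above.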

That's my theory anyway.

Ian


Copyright © 2014 University of California