Special App and Kepler Architecture

Message boards : Number crunching : Special App and Kepler Architecture
J3P-0
Joined: 1 Dec 11
Posts: 42
Credit: 19,106,666
RAC: 15,462
United States
Message 1978965 - Posted: 6 Feb 2019, 20:13:02 UTC

I picked up a GTX 690 for cheap. It's a dual-GPU card that I'm using in an old machine, and when I tried to use the special app, all the workunits failed with errors. Does the special app support the older Kepler architecture?

JP
ID: 1978965 · Report as offensive
Keith Myers
Volunteer tester
Joined: 29 Apr 01
Posts: 9877
Credit: 931,434,587
RAC: 1,517,523
United States
Message 1978967 - Posted: 6 Feb 2019, 20:17:49 UTC - in response to Message 1978965.  

No, the special app requires compute capability (CC) 5.0 or higher, and the GTX 690 only has CC 3.0.
https://developer.nvidia.com/cuda-gpus
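If you'd rather check from a script than look the card up, a rough sketch is below. It assumes a recent driver whose `nvidia-smi` supports the `compute_cap` query field; the function names are made up for illustration.

```python
import shutil
import subprocess

MIN_CC = 5.0  # compute capability the special app needs, per this thread


def parse_compute_caps(smi_output):
    """Parse `nvidia-smi --query-gpu=compute_cap --format=csv,noheader` output."""
    return [float(line.strip()) for line in smi_output.splitlines() if line.strip()]


def supported_gpus(smi_output):
    """One bool per GPU: does its compute capability meet the special app's minimum?"""
    return [cc >= MIN_CC for cc in parse_compute_caps(smi_output)]


if __name__ == "__main__" and shutil.which("nvidia-smi"):
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=compute_cap", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    for i, ok in enumerate(supported_gpus(out)):
        print(f"GPU {i}: {'OK for the special app' if ok else 'too old (CC < 5.0)'}")
```

A GTX 690 reports CC 3.0 for each of its two GPUs, so both would show up as too old.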
Seti@Home classic workunits: 20,676 CPU time: 74,226 hours
ID: 1978967 · Report as offensive
J3P-0
Joined: 1 Dec 11
Posts: 42
Credit: 19,106,666
RAC: 15,462
United States
Message 1978974 - Posted: 6 Feb 2019, 20:53:54 UTC - in response to Message 1978967.  

Oh darn, I was hoping I'd found some cheap GPUs that would perform well. :/

They sure beat the two 8800 GTs I had in there, though, lol
ID: 1978974 · Report as offensive
rob smith
Volunteer moderator
Volunteer tester
Joined: 7 Mar 03
Posts: 17796
Credit: 407,180,995
RAC: 140,313
United Kingdom
Message 1978976 - Posted: 6 Feb 2019, 20:58:12 UTC

...at a price on the energy bill :-(
Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
ID: 1978976 · Report as offensive
J3P-0
Joined: 1 Dec 11
Posts: 42
Credit: 19,106,666
RAC: 15,462
United States
Message 1978981 - Posted: 6 Feb 2019, 21:26:19 UTC - in response to Message 1978976.  

Power usage for the GTX 690 is around 115 watts, give or take, but add 2, 3, or 4 cards and it'll add up fast. A GTX 1080 is around 175 watts, give or take, but the "performance per watt" definitely appears to be better with the GTX 1080 (loosely based on internet data).

It would be interesting to actually graph and measure the power consumption side by side with the average credits of each system.
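Measuring aside, the comparison itself is simple arithmetic. A sketch of it — every wattage and credit figure below is an illustrative placeholder, not a measurement:

```python
# Credits per day per watt for each system; all numbers here are
# illustrative placeholders, not measured values.
cards = {
    "GTX 690":  {"watts": 300, "avg_credit_per_day": 30000},
    "GTX 1080": {"watts": 175, "avg_credit_per_day": 60000},
}


def credit_per_watt(card):
    """Average credit per day divided by power draw in watts."""
    return card["avg_credit_per_day"] / card["watts"]


# Print the cards from most to least efficient.
for name, card in sorted(cards.items(), key=lambda kv: -credit_per_watt(kv[1])):
    print(f"{name}: {credit_per_watt(card):.0f} credits/day per watt")
```

Feeding in real wall-meter readings and the host's actual average credit would make this an apples-to-apples graph.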
ID: 1978981 · Report as offensive
-= Vyper =-
Volunteer tester
Joined: 5 Sep 99
Posts: 1582
Credit: 862,388,555
RAC: 1,176,506
Sweden
Message 1978982 - Posted: 6 Feb 2019, 21:28:39 UTC - in response to Message 1978981.  

http://setiathome.berkeley.edu/show_host_detail.php?hostid=8600449

2080 Ti = 145,000 RAC/day at the moment.
A system with a 2080 Ti that has had its power limit reduced to 185 W max! No CPU work at all, only GPU.

_________________________________________________________________________
Addicted to SETI crunching!
Founder of GPU Users Group
ID: 1978982 · Report as offensive
Ian&Steve C.
Joined: 28 Sep 99
Posts: 1808
Credit: 772,667,520
RAC: 2,610,681
United States
Message 1978985 - Posted: 6 Feb 2019, 21:38:34 UTC - in response to Message 1978981.  

Power usage for the GTX 690 is around 115 watts, give or take, but add 2, 3, or 4 cards and it'll add up fast. A GTX 1080 is around 175 watts, give or take, but the "performance per watt" definitely appears to be better with the GTX 1080 (loosely based on internet data).

It would be interesting to actually graph and measure the power consumption side by side with the average credits of each system.


The GTX 690 has a TDP of 300 W; not sure where you got the 115 W value from. It will absolutely use more power than a GTX 1080.
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 1978985 · Report as offensive
J3P-0
Joined: 1 Dec 11
Posts: 42
Credit: 19,106,666
RAC: 15,462
United States
Message 1978988 - Posted: 6 Feb 2019, 21:54:47 UTC - in response to Message 1978985.  
Last modified: 6 Feb 2019, 21:59:47 UTC

Tom's Hardware is where the power consumption figure I looked at was measured. I don't know if it was under full load - probably not - and it was measured under Windows.

I would LOVE to buy 1080s and 2080s all day, but the money those cost is not in my personal budget, and this is for fun. It replaced old 8800s and was only $120 off eBay. It would be different if it were for crypto or something else that produced an income.

https://www.tomshardware.com/reviews/geforce-gtx-690-benchmark,3193-14.html

GTX 690 Dual-GPU Engine Specs:
CUDA Cores: 3072
Base Clock (MHz): 915
Boost Clock (MHz): 1019
Texture Fill Rate (billion/sec): 234
Memory Speed (Gbps): 6.0
Standard Memory Config: 4096 MB (2048 MB per GPU) GDDR5
Memory Interface Width: 512-bit (256-bit per GPU)
Memory Bandwidth (GB/sec): 384
ID: 1978988 · Report as offensive
Ian&Steve C.
Joined: 28 Sep 99
Posts: 1808
Credit: 772,667,520
RAC: 2,610,681
United States
Message 1978990 - Posted: 6 Feb 2019, 22:02:29 UTC

For $120 you could have bought a GTX 1060 3GB, which would use ~100 W on SETI and still be faster than the 690, since it can use the latest special app.
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 1978990 · Report as offensive
J3P-0
Joined: 1 Dec 11
Posts: 42
Credit: 19,106,666
RAC: 15,462
United States
Message 1978992 - Posted: 6 Feb 2019, 22:04:04 UTC

Thanks again, Keith, for updating me on the Compute 3.0 of the older cards. I'll look for newer cards that are CC 5.0 compatible. I really like the dual-GPU cards, but the Titan Z is still pricey.

JP
ID: 1978992 · Report as offensive
-= Vyper =-
Volunteer tester
Joined: 5 Sep 99
Posts: 1582
Credit: 862,388,555
RAC: 1,176,506
Sweden
Message 1978993 - Posted: 6 Feb 2019, 22:08:58 UTC - in response to Message 1978992.  

Thanks again, Keith, for updating me on the Compute 3.0 of the older cards. I'll look for newer cards that are CC 5.0 compatible. I really like the dual-GPU cards, but the Titan Z is still pricey.

JP


Here also. https://en.wikipedia.org/wiki/CUDA#GPUs_supported

_________________________________________________________________________
Addicted to SETI crunching!
Founder of GPU Users Group
ID: 1978993 · Report as offensive
J3P-0
Joined: 1 Dec 11
Posts: 42
Credit: 19,106,666
RAC: 15,462
United States
Message 1978994 - Posted: 6 Feb 2019, 22:10:31 UTC - in response to Message 1978990.  

I looked at the 1060s, but they only have 1280 CUDA cores, while the GTX 690 is a dual GPU with 3072 CUDA cores and a 512-bit memory bus. I was unaware of the special app's compute capability requirement for the older card.

CUDA Cores: 1280
Graphics Clock (MHz): 1506
Processor Clock (MHz): 1708
Graphics Performance: high-11048
ID: 1978994 · Report as offensive
J3P-0
Joined: 1 Dec 11
Posts: 42
Credit: 19,106,666
RAC: 15,462
United States
Message 1978995 - Posted: 6 Feb 2019, 22:15:06 UTC - in response to Message 1978993.  

Thanks again, Keith, for updating me on the Compute 3.0 of the older cards. I'll look for newer cards that are CC 5.0 compatible. I really like the dual-GPU cards, but the Titan Z is still pricey.

JP


Here also. https://en.wikipedia.org/wiki/CUDA#GPUs_supported


Thanks, Vyper. That will help with future purchases.
ID: 1978995 · Report as offensive
Ian&Steve C.
Joined: 28 Sep 99
Posts: 1808
Credit: 772,667,520
RAC: 2,610,681
United States
Message 1978996 - Posted: 6 Feb 2019, 22:18:03 UTC - in response to Message 1978994.  

I looked at the 1060s, but they only have 1280 CUDA cores, while the GTX 690 is a dual GPU with 3072 CUDA cores and a 512-bit memory bus. I was unaware of the special app's compute capability requirement for the older card.

CUDA Cores: 1280
Graphics Clock (MHz): 1506
Processor Clock (MHz): 1708
Graphics Performance: high-11048


You can't compare raw CUDA core counts:

1. across different architectures
2. when using different apps

The Pascal architecture of the 10-series cards is two generations newer than Kepler, and is leaps and bounds more power efficient.

The CUDA special app by petri is much more optimized than the OpenCL apps you'd be limited to using on the 690; the special app is about 3x faster. A single 1060 can yield about 40-50k RAC.

CUDA core counts aren't the whole story.
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 1978996 · Report as offensive
Wiggo "Democratic Socialist"
Joined: 24 Jan 00
Posts: 16810
Credit: 230,948,336
RAC: 168,148
Australia
Message 1978997 - Posted: 6 Feb 2019, 22:19:46 UTC
Last modified: 6 Feb 2019, 22:22:59 UTC

I don't know where you got 115 W for a GTX 690, as Nvidia rate it at 300 W max draw.

But then compare this rig with a 690 to this rig with 2x 1060s, with both running Win7.

One thing that I did notice is that there aren't as many GTX 690s around these days as there used to be.

[edit] Nvidia rate the 1060s at 120 W max draw, but mine rarely pull above 80 W while crunching.

Cheers.
ID: 1978997 · Report as offensive
Ian&Steve C.
Joined: 28 Sep 99
Posts: 1808
Credit: 772,667,520
RAC: 2,610,681
United States
Message 1978999 - Posted: 6 Feb 2019, 22:24:34 UTC - in response to Message 1978997.  

From the link he posted, which showed the system power draw (not GPU only) - but that was also at IDLE, i.e., not doing anything but displaying a desktop.

A little further down they have a graph of the system under load, showing ~400 W.
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 1978999 · Report as offensive
Wiggo "Democratic Socialist"
Joined: 24 Jan 00
Posts: 16810
Credit: 230,948,336
RAC: 168,148
Australia
Message 1979000 - Posted: 6 Feb 2019, 22:29:24 UTC

From the link he posted, which showed the system power draw (not GPU only) - but that was also at IDLE, i.e., not doing anything but displaying a desktop.

A little further down they have a graph of the system under load, showing ~400 W.

My slow posting - I hit the reply button after Vyper's post, so I missed a few posts in that time. ;-)

Cheers.
ID: 1979000 · Report as offensive
J3P-0
Joined: 1 Dec 11
Posts: 42
Credit: 19,106,666
RAC: 15,462
United States
Message 1979003 - Posted: 6 Feb 2019, 22:41:57 UTC - in response to Message 1978999.  

I don't know where you got 115 W for a GTX 690, as Nvidia rate it at 300 W max draw.

But then compare this rig with a 690 to this rig with 2x 1060s, with both running Win7.

One thing that I did notice is that there aren't as many GTX 690s around these days as there used to be.

[edit] Nvidia rate the 1060s at 120 W max draw, but mine rarely pull above 80 W while crunching.

Cheers.


I just did a quick search for power consumption for the 690 and literally clicked the first link I saw. I was not concerned with power use, as I was only planning on running a couple of these 690 cards at most.



From the link he posted, which showed the system power draw (not GPU only) - but that was also at IDLE, i.e., not doing anything but displaying a desktop.

A little further down they have a graph of the system under load, showing ~400 W.


Thanks, Ian. I do realize now that at max power draw I could run more 1060s per power supply vs. the 690s I was going to use. I'll certainly go another route now.
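The power-supply math can be sketched like this. The 300 W and 120 W figures are the Nvidia ratings quoted in this thread; the 1000 W PSU size and the 80% headroom factor are assumptions for illustration only:

```python
# How many cards fit in a PSU's power budget, reserving headroom for
# the CPU, drives, and PSU efficiency. The 1000 W PSU size and the 80%
# headroom factor are illustrative assumptions, not recommendations.
PSU_WATTS = 1000
HEADROOM = 0.8  # use at most 80% of the PSU capacity for GPUs


def cards_per_psu(card_watts, psu_watts=PSU_WATTS, headroom=HEADROOM):
    """Whole number of cards that fit inside the GPU power budget."""
    return int(psu_watts * headroom) // card_watts


print(cards_per_psu(300))  # GTX 690 at its 300 W TDP
print(cards_per_psu(120))  # GTX 1060 at its 120 W rating
```

With these assumptions, an 800 W GPU budget holds two 690s but six 1060s, which is the "more 1060s per power supply" point in a nutshell.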
ID: 1979003 · Report as offensive
J3P-0
Joined: 1 Dec 11
Posts: 42
Credit: 19,106,666
RAC: 15,462
United States
Message 1979004 - Posted: 6 Feb 2019, 22:46:36 UTC - in response to Message 1978996.  
Last modified: 6 Feb 2019, 22:46:51 UTC

I looked at the 1060s, but they only have 1280 CUDA cores, while the GTX 690 is a dual GPU with 3072 CUDA cores and a 512-bit memory bus. I was unaware of the special app's compute capability requirement for the older card.

CUDA Cores: 1280
Graphics Clock (MHz): 1506
Processor Clock (MHz): 1708
Graphics Performance: high-11048


You can't compare raw CUDA core counts:

1. across different architectures
2. when using different apps

The Pascal architecture of the 10-series cards is two generations newer than Kepler, and is leaps and bounds more power efficient.

The CUDA special app by petri is much more optimized than the OpenCL apps you'd be limited to using on the 690; the special app is about 3x faster. A single 1060 can yield about 40-50k RAC.

CUDA core counts aren't the whole story.


I gotta ask, Ian: are you really running 63 GPUs?

[63] NVIDIA GeForce GTX 1080 Ti (4095MB) driver: 410.66
ID: 1979004 · Report as offensive
Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 11658
Credit: 174,111,821
RAC: 119,474
Australia
Message 1979006 - Posted: 6 Feb 2019, 22:47:23 UTC - in response to Message 1978994.  

I looked at the 1060s, but they only have 1280 CUDA cores, while the GTX 690 is a dual GPU with 3072 CUDA cores and a 512-bit memory bus.

Some things to keep in mind for future purchases.

It's not just the number of cores, but the type of cores. There have been a lot of architectural improvements since the GTX 600 series. And even with a wider memory bus, dual GPUs on a single card tend to be at a disadvantage in memory bandwidth compared to a single GPU of the same type, even one with a narrower memory bus (the memory bus on the GTX 690 is really 256-bit: one 256-bit bus for each GPU = "512-bit" in marketing speak). And once again, there have been considerable improvements over the years since the GTX 600 series came out.

The GTX 690 is basically two GTX 680s, but with GTX 670 clock speeds.

Looking at Shaggie's graphs, the GTX 1060 3GB puts out 400+ credits per hour (stock) at 120 W.
A GTX 680 does around 300, so the GTX 690 would be around 600 at 300 W.
This is with Windows on the SoG application. Running Linux and the special application, the GTX 1060 3GB would produce 3-4 times as much credit, for 120 W or less.
(The GTX 1050 Ti puts out the same amount of work as a GTX 680, but for less power. For the same power usage as one GTX 690, you could run four GTX 1050 Tis and put out more than double the work.)

Shaggie's graphs show the work produced for the power used; the GTX 680 is one of the poorest performers (the GTX 690 would rate even lower). The GTX 1060 3GB is #10 in the top ten for efficiency (although it will eventually get bumped lower now that the RTX 2060 has been released).
A more recent card might cost a lot more upfront to buy, but it will cost a lot, lot less to run.
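The efficiency comparison above works out like this. The credits-per-hour figures and the ~3x special-app speedup are the approximate values from this thread, not measurements:

```python
# Credits per hour per watt, using the rough figures from this thread.
SPECIAL_APP_SPEEDUP = 3  # Linux special app vs. Windows SoG (approximate)

cards = {
    "GTX 690 (SoG, Windows)":        {"credits_per_hour": 600, "watts": 300},
    "GTX 1060 3GB (SoG, Windows)":   {"credits_per_hour": 400, "watts": 120},
    "GTX 1060 3GB (special, Linux)": {"credits_per_hour": 400 * SPECIAL_APP_SPEEDUP,
                                      "watts": 120},
}


def efficiency(card):
    """Credits per hour produced per watt drawn."""
    return card["credits_per_hour"] / card["watts"]


for name, card in cards.items():
    print(f"{name}: {efficiency(card):.1f} credits/hour per watt")
```

Even on the same Windows SoG app the 1060 is well ahead per watt, and the special app roughly triples that gap.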
Grant
Darwin NT
ID: 1979006 · Report as offensive
©2019 University of California

SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.