AMD 290X vs RX 480 for seti/ DUAL NVIDIA 1070 vs Single 1080

Message boards : Number crunching : AMD 290X vs RX 480 for seti/ DUAL NVIDIA 1070 vs Single 1080

elec999 Project Donor
Joined: 24 Nov 02
Posts: 375
Credit: 416,969,548
RAC: 141
Canada
Message 1837537 - Posted: 23 Dec 2016, 16:22:05 UTC

Good Day,
I am currently seeking to get either of these cards. Which card would be better for seti?
Cheers
ID: 1837537
qbit
Volunteer tester
Joined: 19 Sep 04
Posts: 630
Credit: 6,868,528
RAC: 0
Austria
Message 1837544 - Posted: 23 Dec 2016, 17:29:00 UTC

No idea about the ATIs, but 2x1070 should perform better than a single 1080.
ID: 1837544
Profile Zalster Special Project $250 donor
Volunteer tester
Joined: 27 May 99
Posts: 5517
Credit: 528,817,460
RAC: 242
United States
Message 1837545 - Posted: 23 Dec 2016, 17:38:38 UTC - in response to Message 1837544.  

If you go with any of the 10x0 series, make sure they have more than just a 6-pin power connector. The few GPUs with only a 6-pin seem to be underpowered. It shouldn't be a problem, as almost all manufacturers are now using 8-pin, 8+6-pin, or 8+8-pin connectors.
ID: 1837545
bluestar
Joined: 5 Sep 12
Posts: 7018
Credit: 2,084,789
RAC: 3
Message 1840954 - Posted: 9 Jan 2017, 4:11:55 UTC - in response to Message 1837537.  
Last modified: 9 Jan 2017, 4:15:27 UTC

Oh, perhaps something else.

Make it that of possible credit perhaps.

And not necessarily that of a given science, because most likely we already do have the answer.

Right now making a fool of myself.
ID: 1840954
Profile Brent Norman Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 1 Dec 99
Posts: 2786
Credit: 685,657,289
RAC: 835
Canada
Message 1840963 - Posted: 9 Jan 2017, 5:03:09 UTC - in response to Message 1837544.  
Last modified: 9 Jan 2017, 5:54:52 UTC

No idea about the ATIs, but 2x1070 should perform better than a single 1080.

I'm not so sure about that; comparing my 1070 to Petrie's 1080s, his are performing at least 2 times better.

EDIT: Maybe I'm wrong there. According to an OpenCL performance chart someone posted, 2 x 1060s should outperform a 1080.
ID: 1840963
Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13732
Credit: 208,696,464
RAC: 304
Australia
Message 1840980 - Posted: 9 Jan 2017, 7:15:14 UTC - in response to Message 1840963.  
Last modified: 9 Jan 2017, 7:16:18 UTC

No idea about the ATIs, but 2x1070 should perform better than a single 1080.

I'm not so sure about that; comparing my 1070 to Petrie's 1080s, his are performing at least 2 times better.

You need to compare the same things: hardware, application, and application settings.
Petrie's application is very, very custom, and his settings ultra-aggressive.

You need to compare systems with a GTX 1050Ti, GTX 1060, GTX 1070, and a GTX 1080, all running the same OS, same driver, same application, and equivalent command line settings (i.e. ones best suited to the particular hardware) to get a truly accurate comparison.
I suspect a single GTX 1050Ti running Petrie's application with the appropriate tweaks would outperform a GTX 1080 running CUDA50.
Grant
Darwin NT
ID: 1840980
Profile Brent Norman Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 1 Dec 99
Posts: 2786
Credit: 685,657,289
RAC: 835
Canada
Message 1840984 - Posted: 9 Jan 2017, 7:34:32 UTC - in response to Message 1840980.  

Grant, I've been running Ubuntu with a version of the special code, it's not exactly the same but close. It's cool to see tasks finish in under 2 minutes :)
ID: 1840984
Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13732
Credit: 208,696,464
RAC: 304
Australia
Message 1840985 - Posted: 9 Jan 2017, 7:43:11 UTC - in response to Message 1840984.  

Grant, I've been running Ubuntu with a version of the special code, it's not exactly the same but close. It's cool to see tasks finish in under 2 minutes :)

Equivalent command line settings?
The extra performance of the GTX 1080 could be due to those extra Compute Units, and the special code taking advantage of them.
Grant
Darwin NT
ID: 1840985
Profile Michel Makhlouta
Volunteer tester
Joined: 21 Dec 03
Posts: 169
Credit: 41,799,743
RAC: 0
Lebanon
Message 1840992 - Posted: 9 Jan 2017, 8:29:01 UTC

I added 2x 1070s around 10 days ago and I'm running with default settings (no command lines, 1 WU per GPU).

RAC is at 32K and still climbing. If there's someone with the same setup (Win10, default settings, etc.) running a 1080, maybe we can compare...
ID: 1840992
baron_iv
Volunteer tester
Joined: 4 Nov 02
Posts: 109
Credit: 104,905,241
RAC: 0
United States
Message 1841080 - Posted: 9 Jan 2017, 20:09:58 UTC

It all depends on what you're doing. If you want to run Nvidia, I'd recommend the 2x 1070s (you can see my 2x 1070 computer; it's currently in the top 25 of all computers). If you want to go AMD, I'd recommend getting 2 Sapphire R9 Furys from Newegg; they're on sale for $239 after rebate ($259 before rebate). I bought two myself and they're amazing for crunching. I'm approaching 50k RAC with that computer. It's newer, so it hasn't had the time to build up like my 1070 system, but I don't expect it to be faster. However, it is SIGNIFICANTLY cheaper.

Two 1070s will cost you around $800. You can get three Furys for less than that price, and 3 Furys WILL out-produce two 1070s (the two systems are neck and neck under Win10, and I think the Fury COULD be faster with some further optimizations, considering it's 7168 cores on the two Furys vs under 4000 for the 2x 1070s).

If you have a motherboard that can do triple CrossFire, you have one heck of a gaming machine too, but you will need a monstrous PSU to power three of them, so keep that in mind (I wouldn't run 3 with anything less than 1000W, and even that might be too little). My main computer (6700K, 32GB RAM, and 2x R9 Fury) plays everything I throw at it at 1440p at 60fps or better, and it's superb for SETI too. If that motherboard would play nice with 3 Furys I'd put 3 in there, but the way it's set up, I don't think three would work. The 1070s will also use less power, by quite a bit; Pascal is very power efficient. My Fury system isn't even running 24/7, because that's also my gaming and daily-use machine, so 50k RAC is pretty astonishing, imo.

That being said, I have found that my 1070 system is on the order of 35-40% faster under Linux with the "special sauce" app vs the same setup under Windows 10.

TL;DR: If you go with AMD, get 2 R9 Fury GPUs; if you want Nvidia, get 2x 1070s. Personally, I wouldn't even consider the 290/290X or 480 given how cheap Fury cards are right now; it's a no-brainer.
-baron_iv
Proud member of:
GPU Users Group
ID: 1841080
elec999 Project Donor
Joined: 24 Nov 02
Posts: 375
Credit: 416,969,548
RAC: 141
Canada
Message 1842876 - Posted: 18 Jan 2017, 15:25:49 UTC
Last modified: 18 Jan 2017, 15:26:11 UTC

Here's my final dilemma. I've saved up enough to buy two GTX 1080s, four RX 480s, or three 1070s. Also, for the RX 480, does 4GB vs 8GB make a big difference?
What should I do?
ID: 1842876
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1842940 - Posted: 18 Jan 2017, 23:10:04 UTC - in response to Message 1842876.  
Last modified: 18 Jan 2017, 23:18:38 UTC

Here's my final dilemma. I've saved up enough to buy two GTX 1080s, four RX 480s, or three 1070s. Also, for the RX 480, does 4GB vs 8GB make a big difference?
What should I do?

Either 4GB or 8GB will be fine for SETI@home work. The most I have used for a SETI@home task on my R9 390X is ~400MB, and that was when using command line options that increased the memory usage of the app.

Whether you go with Nvidia or Radeon, you can pick between better FLOPS per watt or FLOPS per dollar.
On paper the specs look something like this:
GTX 1080 8228 GFLOPs, 180W, $700 x2 = 16456 GFLOPs, 360W, $1400 or 45.71 GFLOPs/W, 11.75 GFLOPs/$
GTX 1070 5783 GFLOPs, 150W, $450 x 3 = 17349 GFLOPs, 450W, $1350 or 38.55 GFLOPs/W, 12.85 GFLOPs/$
RX 480 5161 GFLOPs, 150W, $200 x 4 = 20644 GFLOPs, 600W, $800 or 34.40 GFLOPs/W, 25.80 GFLOPs/$

In practice there isn't a direct correlation between FLOPS and SETI@home work done, especially between different vendors. Sometimes it isn't even useful to compare different generations of GPUs this way.
You can have a look at Shaggie's GPU FLOPS: Theory vs Reality thread to see a performance comparison using credit.

This doesn't tell the complete story for all GPUs, as there are so many different manufacturers and app configuration options.
Comparing my R9 390X to hosts with an R9 Fury X on tasks with a similar AR, they are pretty much on par, with the worst case being my R9 390X about 9% slower than a host with highly optimized settings.

EDIT:
I was just comparing the run times of Michel Makhlouta's 1070s to my R9 390X. It looks like my 390X completes similar tasks about 25% faster, and the 390X is only rated about 2% higher in GFLOPS: GTX 1070 - 5783 GFLOPS vs R9 390X - 5913 GFLOPS.
SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the [url=http://tinyurl.com/8y46zvu]BP6/VP6 User Group[/url]
ID: 1842940
Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13732
Credit: 208,696,464
RAC: 304
Australia
Message 1842977 - Posted: 19 Jan 2017, 5:01:26 UTC - in response to Message 1842940.  

I was just comparing the run times of Michel Makhlouta's 1070s to my R9 390X. It looks like my 390X completes similar tasks about 25% faster, and the 390X is only rated about 2% higher in GFLOPS: GTX 1070 - 5783 GFLOPS vs R9 390X - 5913 GFLOPS.

How well the application is able to make use of the hardware is what really determines whether your card will produce a lot of work. It doesn't matter how much capability the card may have if the software can't take advantage of it.
Grant
Darwin NT
ID: 1842977
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1842983 - Posted: 19 Jan 2017, 5:21:42 UTC - in response to Message 1842977.  

I was just comparing the run times of Michel Makhlouta's 1070s to my R9 390X. It looks like my 390X completes similar tasks about 25% faster, and the 390X is only rated about 2% higher in GFLOPS: GTX 1070 - 5783 GFLOPS vs R9 390X - 5913 GFLOPS.

How well the application is able to make use of the hardware is what really determines whether your card will produce a lot of work. It doesn't matter how much capability the card may have if the software can't take advantage of it.

Indeed. I added that to emphasize the comment I made a few lines before that.
In practice there isn't a direct correlation between FLOPS and doing SETI@home work. Especially between different vendors. Sometimes it isn't even useful to compare different generations of GPUs this way.

SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the [url=http://tinyurl.com/8y46zvu]BP6/VP6 User Group[/url]
ID: 1842983
Profile M_M
Joined: 20 May 04
Posts: 76
Credit: 45,752,966
RAC: 8
Serbia
Message 1842988 - Posted: 19 Jan 2017, 6:39:22 UTC
Last modified: 19 Jan 2017, 6:41:44 UTC

If you are building a dedicated SETI cruncher, at this moment the best performance/watt and performance/$ will come from 2x GTX 1070 on Linux with the optimized CUDA apps... No such highly optimized apps are available for AMD.
ID: 1842988
Profile Shaggie76
Joined: 9 Oct 09
Posts: 282
Credit: 271,858,118
RAC: 196
Canada
Message 1843094 - Posted: 19 Jan 2017, 20:25:07 UTC

Perhaps these graphs can help -- they certainly convinced me to put 1070s in all my new builds, but I'll let you make up your own mind.
ID: 1843094
elec999 Project Donor
Joined: 24 Nov 02
Posts: 375
Credit: 416,969,548
RAC: 141
Canada
Message 1843098 - Posted: 19 Jan 2017, 21:07:49 UTC

Now that I'm convinced on Nvidia: should I still get 4 1070s or go with 2 1080s? I'm also thinking of space; it's easier to install 2 1080s than 4 1070s. Will the 2 1080s be 50% slower?
ID: 1843098
Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13732
Credit: 208,696,464
RAC: 304
Australia
Message 1843157 - Posted: 20 Jan 2017, 4:09:14 UTC - in response to Message 1843098.  

Now that I'm convinced on Nvidia: should I still get 4 1070s or go with 2 1080s? I'm also thinking of space; it's easier to install 2 1080s than 4 1070s. Will the 2 1080s be 50% slower?

4x GTX 1070s will out-produce 2x GTX 1080s. But if you don't have the physical space, CPU cores, or power supply necessary to drive 4 cards, then go with the 2x GTX 1080s.
Grant
Darwin NT
ID: 1843157
elec999 Project Donor
Joined: 24 Nov 02
Posts: 375
Credit: 416,969,548
RAC: 141
Canada
Message 1845046 - Posted: 29 Jan 2017, 3:10:12 UTC - in response to Message 1842940.  
Last modified: 29 Jan 2017, 3:12:40 UTC

Here's my final dilemma. I've saved up enough to buy two GTX 1080s, four RX 480s, or three 1070s. Also, for the RX 480, does 4GB vs 8GB make a big difference?
What should I do?

Either 4GB or 8GB will be fine for SETI@home work. The most I have used for a SETI@home task on my R9 390X is ~400MB, and that was when using command line options that increased the memory usage of the app.

Whether you go with Nvidia or Radeon, you can pick between better FLOPS per watt or FLOPS per dollar.
On paper the specs look something like this:
GTX 1080 8228 GFLOPs, 180W, $700 x2 = 16456 GFLOPs, 360W, $1400 or 45.71 GFLOPs/W, 11.75 GFLOPs/$
GTX 1070 5783 GFLOPs, 150W, $450 x 3 = 17349 GFLOPs, 450W, $1350 or 38.55 GFLOPs/W, 12.85 GFLOPs/$
RX 480 5161 GFLOPs, 150W, $200 x 4 = 20644 GFLOPs, 600W, $800 or 34.40 GFLOPs/W, 25.80 GFLOPs/$

In practice there isn't a direct correlation between FLOPS and SETI@home work done, especially between different vendors. Sometimes it isn't even useful to compare different generations of GPUs this way.
You can have a look at Shaggie's GPU FLOPS: Theory vs Reality thread to see a performance comparison using credit.

This doesn't tell the complete story for all GPUs, as there are so many different manufacturers and app configuration options.
Comparing my R9 390X to hosts with an R9 Fury X on tasks with a similar AR, they are pretty much on par, with the worst case being my R9 390X about 9% slower than a host with highly optimized settings.

EDIT:
I was just comparing the run times of Michel Makhlouta's 1070s to my R9 390X. It looks like my 390X completes similar tasks about 25% faster, and the 390X is only rated about 2% higher in GFLOPS: GTX 1070 - 5783 GFLOPS vs R9 390X - 5913 GFLOPS.


Looking at the cards, your PC vs Michel's:
46,247.89 [2] NVIDIA GeForce GTX 1070 (4095MB) driver: 376.33 OpenCL: 1.2
/2 = 23,123.945 per card
vs
AMD Radeon R9 390X (Grenada XT) (8192MB) driver: 1912.5 OpenCL: 2.0
21,884.42

The AMD card is very close to the Nvidia card, but the AMD card costs about 60% less, or am I missing something here?
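The per-card arithmetic above is just the host RAC divided by the card count; a small sketch (RAC figures as quoted, and this assumes credit splits evenly between the two 1070s):

```python
# RAC figures quoted above
dual_1070_host_rac = 46_247.89   # host with 2x GTX 1070
r9_390x_rac = 21_884.42          # host with 1x R9 390X

per_card_1070 = dual_1070_host_rac / 2   # assume an even split per card
advantage = (per_card_1070 / r9_390x_rac - 1) * 100

print(f"Per GTX 1070: {per_card_1070:,.3f}")          # ~23,123.945
print(f"1070 ahead of the 390X by about {advantage:.1f}%")
```

On these numbers a single 1070 leads the 390X by only a few percent, which is why the purchase price and power draw end up deciding the comparison.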
ID: 1845046
Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13732
Credit: 208,696,464
RAC: 304
Australia
Message 1845052 - Posted: 29 Jan 2017, 4:14:13 UTC - in response to Message 1845046.  

The AMD card is very close to the Nvidia card, but the AMD card costs about 60% less, or am I missing something here?

The R9 390X appears to be rated for around 250W. The GTX 1070 is rated at 150W, up to 180W for overclocked versions.
I'd rather pay more up front for a card that will cost a lot less to run, than pay less up front for a card that will cost a lot more to run. Particularly when they produce similar amounts of work.
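To put rough numbers on that running-cost argument: at an assumed electricity rate (the $0.15/kWh here is purely illustrative, not from the thread), the wattage gap compounds over a year of 24/7 crunching:

```python
# Hypothetical running-cost comparison; the electricity rate is an assumption.
RATE_PER_KWH = 0.15        # USD per kWh, illustrative only
HOURS_PER_YEAR = 24 * 365  # 24/7 crunching

def yearly_cost(watts: float) -> float:
    """Electricity cost of running a card at the given draw for a year."""
    return watts / 1000 * HOURS_PER_YEAR * RATE_PER_KWH

r9_390x = yearly_cost(250)   # rated draw quoted above
gtx_1070 = yearly_cost(150)  # rated draw quoted above

print(f"R9 390X: ${r9_390x:.0f}/yr, GTX 1070: ${gtx_1070:.0f}/yr, "
      f"gap: ${r9_390x - gtx_1070:.0f}/yr")
```

At that rate the 100W gap is roughly $130 a year, which can close the purchase-price difference within the card's useful life.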
Grant
Darwin NT
ID: 1845052


 
©2024 University of California
 
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.