Finally! GPU wars 2012 - GTX 650 Ti reviews

John Clark · Volunteer tester
Joined: 29 Sep 99 · Posts: 16515 · Credit: 4,418,829 · RAC: 0 · United Kingdom
Message 1218375 - Posted: 14 Apr 2012, 19:58:27 UTC

The AMD Radeon HD 7990 will be released to the wild on 17th April @ $849

Expensive ... I wonder if there are takers?
It's good to be back amongst friends and colleagues



.clair.
Joined: 4 Nov 04 · Posts: 1300 · Credit: 55,390,408 · RAC: 69 · United Kingdom
Message 1218550 - Posted: 15 Apr 2012, 2:29:13 UTC - in response to Message 1218375.  

The AMD Radeon HD 7990 will be released to the wild on 17th April @ $849

Expensive ... I wonder if there are takers?


Not me,
It will not fit in the case.
Grant (SSSF) · Volunteer tester
Joined: 19 Aug 99 · Posts: 13720 · Credit: 208,696,464 · RAC: 304 · Australia
Message 1218565 - Posted: 15 Apr 2012, 2:57:26 UTC - in response to Message 1218550.  


The current high prices & lack of availability for the latest GPUs can be blamed on TSMC and their problems producing 28nm wafers.
And because production & supply is so limited for the new GPUs, the older ones continue to remain (relatively) expensive.
Don't expect things to improve until late this year. 3rd Quarter at best.
TSMC news,
Same news, different source.
Grant
Darwin NT
shizaru · Volunteer tester
Joined: 14 Jun 04 · Posts: 1130 · Credit: 1,967,904 · RAC: 0 · Greece
Message 1222316 - Posted: 23 Apr 2012, 20:22:26 UTC
Last modified: 23 Apr 2012, 20:24:23 UTC

AnandTech - The Intel Ivy Bridge (Core i7 3770K) Review

22 pages and I couldn't find a single AVX reference for you guys:( But I did find this on page 19:

" ... Ivy Bridge is the first true compute capable GPU from Intel. This marks an interesting step in the evolution of Intel's GPUs, as originally projects such as Larrabee Prime were supposed to help Intel bring together CPU and GPU computing by creating an x86 based GPU. With Larrabee Prime canceled however, that task falls to the latest rendition of Intel's GPU architecture.

With Ivy Bridge Intel will be supporting both DirectCompute 5—which is dictated by DX11—but also the more general compute focused OpenCL 1.1. Intel has backed OpenCL development for some time and currently offers an OpenCL 1.1 runtime for their CPUs, however an OpenCL runtime for Ivy Bridge will not be available at launch. As a result Ivy Bridge is limited to DirectCompute for the time being, which limits just what kind of compute performance testing we can do with Ivy Bridge."


Intel Announces 3rd Generation Core "Ivy Bridge" Processor Family

also
NVIDIA to launch 7-Billion Transistor Kepler GPGPU "Tesla" Boards on May 14?
NVIDIA GeForce GTX 690 Pictured (tba April 28)
doug · Volunteer tester
Joined: 10 Jul 09 · Posts: 202 · Credit: 10,828,067 · RAC: 0 · United States
Message 1223055 - Posted: 25 Apr 2012, 4:40:12 UTC - in response to Message 1222316.  

AnandTech - The Intel Ivy Bridge (Core i7 3770K) Review

22 pages and I couldn't find a single AVX reference for you guys:( But I did find this on page 19:

" ... Ivy Bridge is the first true compute capable GPU from Intel. This marks an interesting step in the evolution of Intel's GPUs, as originally projects such as Larrabee Prime were supposed to help Intel bring together CPU and GPU computing by creating an x86 based GPU. With Larrabee Prime canceled however, that task falls to the latest rendition of Intel's GPU architecture.

With Ivy Bridge Intel will be supporting both DirectCompute 5—which is dictated by DX11—but also the more general compute focused OpenCL 1.1. Intel has backed OpenCL development for some time and currently offers an OpenCL 1.1 runtime for their CPUs, however an OpenCL runtime for Ivy Bridge will not be available at launch. As a result Ivy Bridge is limited to DirectCompute for the time being, which limits just what kind of compute performance testing we can do with Ivy Bridge."


Intel Announces 3rd Generation Core "Ivy Bridge" Processor Family

also
NVIDIA to launch 7-Billion Transistor Kepler GPGPU "Tesla" Boards on May 14?
NVIDIA GeForce GTX 690 Pictured (tba April 28)

That's too bad. I would have thought they would have gone a bit further to support OpenCL. Larrabee was pulled, but it lives on in the Knights whatever series. That's when things will get more interesting: head to head with NVidia in GPGPU computing. They better have their OpenCL act together by then. The GPU on Ivy Bridge, while interesting, doesn't compete with the massive independent coprocessor units of NVidia and ATI. That may change though. Getting the data to where it gets crunched is the real bottleneck.
Grant (SSSF) · Volunteer tester
Joined: 19 Aug 99 · Posts: 13720 · Credit: 208,696,464 · RAC: 304 · Australia
Message 1223062 - Posted: 25 Apr 2012, 4:57:25 UTC - in response to Message 1223055.  
Last modified: 25 Apr 2012, 4:58:55 UTC

Getting the data to where it gets crunched is the real bottleneck.

PCIe 3 is now here, and I wouldn't consider 984 MB/s per lane a bottleneck. If it is, an x16 link gives almost 15.8 GB/s.
Grant
Darwin NT
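Grant's figures follow directly from the PCIe 3.0 signalling rate (8 GT/s per lane) and its 128b/130b encoding; a quick sketch of the arithmetic (the function name is just for illustration):

```python
# PCIe usable bandwidth = transfer rate x encoding efficiency / 8 bits per byte.
# PCIe 3.0 signals at 8 GT/s per lane with 128b/130b encoding.
def pcie_bandwidth_mb_s(gt_per_s, payload_bits, total_bits, lanes=1):
    """Usable bandwidth in MB/s (decimal megabytes)."""
    return gt_per_s * 1e9 * (payload_bits / total_bits) / 8 / 1e6 * lanes

print(round(pcie_bandwidth_mb_s(8, 128, 130)))               # -> 985 MB/s per lane
print(round(pcie_bandwidth_mb_s(8, 128, 130, 16) / 1e3, 2))  # -> 15.75 GB/s for x16
```

The 128b/130b encoding is why gen 3 nearly doubles gen 2 despite the raw rate only going from 5 to 8 GT/s: gen 1/2 used 8b/10b, losing 20% to encoding overhead instead of ~1.5%.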
shizaru · Volunteer tester
Joined: 14 Jun 04 · Posts: 1130 · Credit: 1,967,904 · RAC: 0 · Greece
Message 1224859 - Posted: 29 Apr 2012, 11:43:38 UTC
Last modified: 29 Apr 2012, 11:44:30 UTC

kittyman · Volunteer tester
Joined: 9 Jul 00 · Posts: 51468 · Credit: 1,018,363,574 · RAC: 1,004 · United States
Message 1224895 - Posted: 29 Apr 2012, 13:38:17 UTC

Kitties want........
"Freedom is just Chaos, with better lighting." Alan Dean Foster

Dave
Joined: 29 Mar 02 · Posts: 778 · Credit: 25,001,396 · RAC: 0 · United Kingdom
Message 1224949 - Posted: 29 Apr 2012, 16:52:14 UTC

It's 2 680s on the same board with some nice features, but bear in mind that with it being 680s their DP performance is going to be the same as, erm, a 680.
DoubleTop · Volunteer tester
Joined: 6 Jul 02 · Posts: 28 · Credit: 8,022,048 · RAC: 0 · United Kingdom
Message 1226802 - Posted: 3 May 2012, 19:42:48 UTC
Last modified: 3 May 2012, 19:43:12 UTC

Release date today for the 690, price wise = dedicated cruncher for sure!

Anyone care to guess at the numbers in terms of output? It kind of appeals to kick out one card that will beat (and then some) my entire old diskless SETI farm from back in the day :)

https://www.aria.co.uk/Products/Components/Graphics+Cards+-+NVIDIA/GeForce+GTX+690

3072 CUDA cores is just insane; that's pretty much got to be getting close to the realms of the CPU not being able to feed it, or saturating one of the transports? I dunno, but I'm excited about the card. Despite the cash outlay I am seriously considering a dedicated 24/7 cruncher in the datacentre rack; just got to hope I have enough headroom in amps by turning off the current machines.

DT.
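For a rough idea of what 3072 cores buys, peak single-precision throughput is usually estimated as cores × 2 FLOPs (one fused multiply-add per clock) × clock speed; a sketch, assuming the GTX 690's 915 MHz base clock:

```python
# Rule-of-thumb peak single-precision throughput for an NVIDIA GPU:
# each CUDA core can retire one fused multiply-add (2 FLOPs) per clock.
def peak_sp_gflops(cuda_cores, clock_ghz):
    return cuda_cores * 2 * clock_ghz

# GTX 690: 2 x GK104 = 3072 cores at a 915 MHz base clock (figures assumed here).
print(round(peak_sp_gflops(3072, 0.915)))  # -> 5622 GFLOPS, theoretical peak
```

Real workloads see a fraction of that, of course; memory bandwidth and how well the kernels keep the cores busy matter far more than the headline number.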
Grant (SSSF) · Volunteer tester
Joined: 19 Aug 99 · Posts: 13720 · Credit: 208,696,464 · RAC: 304 · Australia
Message 1227034 - Posted: 4 May 2012, 5:25:35 UTC - in response to Message 1224949.  

It's 2 680s on the same board with some nice features, but bear in mind that with it being 680s their DP performance is going to be the same as, erm, a 680.

Seti doesn't require or make use of Double Precision, so it's not a factor.
Grant
Darwin NT
Grant (SSSF) · Volunteer tester
Joined: 19 Aug 99 · Posts: 13720 · Credit: 208,696,464 · RAC: 304 · Australia
Message 1227036 - Posted: 4 May 2012, 5:31:33 UTC - in response to Message 1226802.  
Last modified: 4 May 2012, 5:32:23 UTC

just got to hope I have enough headroom in AMPs by turning off the current machines.

The new GPUs are very low power users compared to what went before.
A GTX690 under full load draws less power than a single GTX580 card under the same load. A single GTX690 actually draws less power than 2 GTX680s- about 85W less.
Grant
Darwin NT
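On the amps-headroom question, wall current can be estimated from board power, PSU efficiency, and mains voltage; a sketch using assumed figures (300 W TDP for a GTX 690, ~195 W per GTX 680, 230 V UK mains, 88% PSU efficiency):

```python
# Wall current for a given DC load. The PSU efficiency and mains voltage
# here are assumptions, not measured values.
def wall_amps(load_watts, mains_volts=230.0, psu_efficiency=0.88):
    """Amps drawn at the wall for a given DC load in watts."""
    return load_watts / psu_efficiency / mains_volts

print(round(wall_amps(300), 2))      # one GTX 690 at its 300 W TDP -> ~1.48 A
print(round(wall_amps(2 * 195), 2))  # two GTX 680s at ~195 W each  -> ~1.93 A
```

On nameplate TDPs that's roughly a 90 W difference in favour of the single 690, which lines up with Grant's ~85 W measured figure.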
kittyman · Volunteer tester
Joined: 9 Jul 00 · Posts: 51468 · Credit: 1,018,363,574 · RAC: 1,004 · United States
Message 1227069 - Posted: 4 May 2012, 7:11:21 UTC - in response to Message 1227036.  

just got to hope I have enough headroom in AMPs by turning off the current machines.

The new GPUs are very low power users compared to what went before.
A GTX690 under full load draws less power than a single GTX580 card under the same load. A single GTX690 actually draws less power than 2 GTX680s- about 85W less.

OK...when the kitties win the lottery, they pledge to replace all current GPUs with 690s. Thereby saving the planet.

Don't I wish.
"Freedom is just Chaos, with better lighting." Alan Dean Foster

DoubleTop · Volunteer tester
Joined: 6 Jul 02 · Posts: 28 · Credit: 8,022,048 · RAC: 0 · United Kingdom
Message 1227215 - Posted: 4 May 2012, 16:03:57 UTC - in response to Message 1227036.  

just got to hope I have enough headroom in AMPs by turning off the current machines.

The new GPUs are very low power users compared to what went before.
A GTX690 under full load draws less power than a single GTX580 card under the same load. A single GTX690 actually draws less power than 2 GTX680s- about 85W less.


The current machines in there are a little old now, so it might be time for an end-of-tax-year refresh. If I can pull more points with less leccy usage in the 1/4 rack, all the better. Just me, or does there seem to be very little point in CPU crunching now?

Not a single GPU is in the rack, all the GPU power is in offices under desks heating the rooms!
Josef W. Segur · Volunteer developer · Volunteer tester
Joined: 30 Oct 99 · Posts: 4504 · Credit: 1,414,761 · RAC: 0 · United States
Message 1227240 - Posted: 4 May 2012, 17:09:55 UTC - in response to Message 1227215.  

...
Just me, or does there seem to be very little point in CPU crunching now?
...

Unfortunately you still need a CPU, and might as well get more use out of it than simply feeding the GPU. Doing CPU crunching with mature technologies and compilers also provides a very good level of quality control. My estimate is that more than 60% of the work is still being done on CPUs, and they are far more reliable than GPUs overall.

But certainly anyone building a system intended mainly for crunching ought to concentrate on GPU capability. With the GPUUG efforts and probable future download improvements it may even become sensible again to aim for the most crunching capability achievable in a single system; for now, two or three lesser systems are more likely to be able to maintain full productivity.
                                                                   Joe
arkayn · Volunteer tester
Joined: 14 May 99 · Posts: 4438 · Credit: 55,006,323 · RAC: 0 · United States
Message 1227258 - Posted: 4 May 2012, 17:53:33 UTC

I am down to just my 3 GPUs crunching.

The GTX 460 & GTX 560 are doing SETI and my ancient HD 4830 is doing Milkyway.

The HD 7750 is the screen runner and gamer.

shizaru · Volunteer tester
Joined: 14 Jun 04 · Posts: 1130 · Credit: 1,967,904 · RAC: 0 · Greece
Message 1229985 - Posted: 10 May 2012, 12:40:43 UTC
Last modified: 10 May 2012, 13:14:34 UTC

shizaru · Volunteer tester
Joined: 14 Jun 04 · Posts: 1130 · Credit: 1,967,904 · RAC: 0 · Greece
Message 1230036 - Posted: 10 May 2012, 15:47:23 UTC

I wonder if nVidia will show up to give Seti a hand...

NVIDIA and the Folding@Home group have sent over a benchmarkable version of the client with preliminary optimizations for GK104. Folding@Home and similar initiatives are still one of the most popular consumer compute workloads, so it’s something NVIDIA wants their GPUs to do well at.

...and does so by using fewer watts than a 560 Ti! :o

At 317W from the wall it's 45W less than the GTX 680 for roughly 90% of the gaming performance, and this is in fact lower power consumption than anything except the Radeon HD 7870. Even the GTX 560 Ti (which isn't in this chart) is higher at 333W, reflecting the fact that GK104 is the true successor to GF114, which would make GTX 670 the successor to GTX 560 from a design perspective.

AnandTech - Power, Temps & Noise
kittyman · Volunteer tester
Joined: 9 Jul 00 · Posts: 51468 · Credit: 1,018,363,574 · RAC: 1,004 · United States
Message 1230048 - Posted: 10 May 2012, 16:27:57 UTC

GTX670s in stock at the egg....
$399.00 and free shipping...
Wish the kitties had just a bit more kibble in the bank.
"Freedom is just Chaos, with better lighting." Alan Dean Foster

jason_gee · Volunteer developer · Volunteer tester
Joined: 24 Nov 06 · Posts: 7489 · Credit: 91,093,184 · RAC: 0 · Australia
Message 1230061 - Posted: 10 May 2012, 16:48:57 UTC - in response to Message 1230036.  
Last modified: 10 May 2012, 16:51:08 UTC

I wonder if nVidia will show up to give Seti a hand...


Been working for some time quite closely with their registered development program. Einstein devs' & my own reports apparently got some bug fixes in play, and Kepler intro went smoothly as far as I'm concerned.

Newer builds (than the x41g release) in testing are scaling quite nicely on Kepler GPUs, at 1.5-2x stock 6.10 on a GTX 680 ;), though indications are the GPU is still underutilised, and it'll take some time to push the architecture as far as possible.

It's looking like x41x (in closed Pre-Alpha) might be Stock Cuda multibeam V7, though only a Cuda 3.2 build for widest driver compatibility. Cuda 4.1 (at least, maybe optionally 4.2 for Keplers) will be available third party.

Still got some loose ends to tie up before pestering Eric with an updated build to install at Beta. The V7 multibeam introduction should IMO go fairly smoothly, and see some further technological jumps afterwards.

Jason
"Living by the wisdom of computer science doesn't sound so bad after all. And unlike most advice, it's backed up by proofs." -- Algorithms to live by: The computer science of human decisions.


 
©2024 University of California
 
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.