Finally! GPU wars 2012 - GTX 650 Ti reviews

ivan
Volunteer tester
Joined: 5 Mar 01
Posts: 783
Credit: 348,560,338
RAC: 223
United Kingdom
Message 1230904 - Posted: 12 May 2012, 10:41:21 UTC - in response to Message 1230877.  

Apple co-founder Woz weighs in against tech giant on price discrimination

Should this question be applied world wide?

I can't speak for 'world wide', but certainly Australians have been gouged on prices (at least for electronics) for a very long time.

Surely not! I only paid $2000 for my dual-floppy drive. Mind you, that was tax-exempt as I was "exporting" it to Antarctica. In 1979...
razamatraz
Joined: 23 Oct 07
Posts: 142
Credit: 27,815,748
RAC: 0
Canada
Message 1231153 - Posted: 12 May 2012, 19:49:16 UTC
Last modified: 12 May 2012, 19:59:24 UTC

@BENT, I suspect having SLI on is why it is using so much memory. I haven't gone to more than 2 units yet, but when running 2, only about 480 MB of memory is used on mine.

Edit: Verified, I can easily run 4 on each card. The card with no displays is using just under 1 GB and the one with the displays is using 1100 MB.

Edit 2: Although it handles it, one of the units on the display card seems to struggle. I'm dropping back to 2 units per card for a bit to test before I go to 3.
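
A quick way to sanity-check how many tasks should fit: divide the VRAM left after driver/display overhead by the observed per-task footprint. A minimal Python sketch, using the ~480 MB-for-two-tasks figure above; the 2 GB card size and the reserve figures are only assumptions:

    # Rough estimate of how many GPU tasks fit in VRAM.
    # Illustrative figures: ~480 MB observed with two tasks running,
    # so roughly 240 MB per task; card size and reserves are assumptions.
    def max_concurrent_tasks(total_vram_mb, reserved_mb, per_task_mb):
        usable = total_vram_mb - reserved_mb
        return usable // per_task_mb

    per_task = 480 // 2  # ~240 MB per work unit, from the observation above
    print(max_concurrent_tasks(2048, 300, per_task))  # card driving displays -> 7
    print(max_concurrent_tasks(2048, 100, per_task))  # headless card -> 8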
-BeNt-
Joined: 17 Oct 99
Posts: 1234
Credit: 10,116,112
RAC: 0
United States
Message 1231485 - Posted: 13 May 2012, 10:01:15 UTC - in response to Message 1231153.  
Last modified: 13 May 2012, 10:02:34 UTC

@BENT, I suspect having SLI on is why it is using so much memory. I haven't gone to more than 2 units yet, but when running 2, only about 480 MB of memory is used on mine.

Edit: Verified, I can easily run 4 on each card. The card with no displays is using just under 1 GB and the one with the displays is using 1100 MB.

Edit 2: Although it handles it, one of the units on the display card seems to struggle. I'm dropping back to 2 units per card for a bit to test before I go to 3.


Nah, I don't think that was it. I'm running 3 on each card, and for some reason it has straightened itself out at 1660 MB with 3 units on each card (6 total). 3 is the max on these cards, because running just 3 on each has both of them pegged at 90%+ usage. I have no clue what was happening before.

(Keep in mind that with SLI on, the video memory is mirrored between the cards; at least with games it works that way.)
Traveling through space at ~67,000mph!
shizaru
Volunteer tester
Joined: 14 Jun 04
Posts: 1130
Credit: 1,967,904
RAC: 0
Greece
Message 1242108 - Posted: 6 Jun 2012, 10:16:03 UTC
Last modified: 6 Jun 2012, 10:17:06 UTC

Links for Laptop Crunchers:

AnandTech - NVIDIA GeForce GTX 680M: Kepler GK104 Goes Mobile
geforce.com - Introducing The GeForce GTX 680M Mobile GPU
notebookcheck.net - NVIDIA GeForce GTX 680M
Hot Hardware - Nvidia Announces GeForce GTX 680M, Unveils Obtuse Product SKUs
"In the past, we've criticized AMD for choosing to split the HD 7000 family (both desktop and mobile) between 40nm, rebranded HD 6000 hardware and new 28nm parts built on the company's Graphics Core Next. Unfortunately, Nvidia decided to one-up its competitor in the worst way possible. When AMD split the HD 7000 Mobility family, it did so by model number -- HD 7700 - HD 7900 cards all use 28nm technology, everything below that is built on 40nm.
Nvidia, in contrast, is all over the map. The GTX 680M is a 28nm Kepler part. The GTX 675M and 670M are both 40nm Fermi, while the 660M, 650M, and 640M are all Kepler. The 640M LE family is listed as both Kepler (28nm) and Fermi (40nm) while the 635M, 630M, and 620M include 28nm Fermi options as well as the standard 40nm flavor."

-Hot Hardware


Intel Thunderbolt (you know... in case we're ever able to affordably plug an external GPU into a laptop for crunching)
Intel Thunderbolt Briefing at Computex - New motherboards and devices!

And a couple of GK110 whitepapers for the ubergeeks:)
NVIDIA® KEPLER GK110 NEXT-GENERATION CUDA® COMPUTE ARCHITECTURE
Whitepaper - NVIDIA’s Next Generation CUDA Compute Architecture: Kepler GK110
Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13727
Credit: 208,696,464
RAC: 304
Australia
Message 1242111 - Posted: 6 Jun 2012, 10:42:39 UTC - in response to Message 1242108.  

And a couple of GK110 whitepapers for the ubergeeks:)

Hyper-Q looks very promising.
Grant
Darwin NT
W-K 666
Volunteer tester
Joined: 18 May 99
Posts: 19048
Credit: 40,757,560
RAC: 67
United Kingdom
Message 1245758 - Posted: 14 Jun 2012, 6:15:31 UTC

Do we know when the 650 and 660 GPUs will be released?
arkayn
Volunteer tester
Joined: 14 May 99
Posts: 4438
Credit: 55,006,323
RAC: 0
United States
Message 1245764 - Posted: 14 Jun 2012, 6:34:50 UTC - in response to Message 1245758.  

Do we know when the 650 and 660 GPUs will be released?


I have heard rumors of the 25th for the 660.

W-K 666
Volunteer tester
Joined: 18 May 99
Posts: 19048
Credit: 40,757,560
RAC: 67
United Kingdom
Message 1245801 - Posted: 14 Jun 2012, 8:21:57 UTC - in response to Message 1245764.  

Do we know when the 650 and 660 GPUs will be released?


I have heard rumors of the 25th for the 660.

That would be good. I have one son, and a friend of my other son, who need new GPUs; they're mainly games players but cannot afford a 670 or better. At the right price I might just replace mine if I can do it with my present PSU.
shizaru
Volunteer tester
Joined: 14 Jun 04
Posts: 1130
Credit: 1,967,904
RAC: 0
Greece
Message 1245814 - Posted: 14 Jun 2012, 10:57:28 UTC - in response to Message 1245758.  
Last modified: 14 Jun 2012, 11:25:32 UTC

Do we know when the 650 and 660 GPUs will be released?


Nobody's got a clue! :) They don't even know if it's going to be a 660 AND a 660 Ti, or just one of the two. The Acer leak about the 25th could unfortunately be an OEM part. Hope I'm wrong... A few others are claiming an August timeframe.

One thing everybody seems to agree on is the $300 price tag. So maybe ~260 quid?

Edit: If you are talking about the PC with the GT 240 in it, that card is 70W. So you might want to have a look at the GT 640, which is already out.
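
On the ~260 quid guess: US MSRPs are quoted before sales tax, so the usual path to a UK shelf price is convert, add VAT, add a retailer margin. A minimal sketch; the exchange rate and margin are assumptions:

    # Hypothetical conversion of a US MSRP to an expected UK shelf price.
    usd_msrp    = 300.0
    usd_per_gbp = 1.55  # assumed mid-2012 exchange rate
    uk_vat      = 0.20  # UK VAT; US MSRPs exclude sales tax
    margin      = 0.10  # placeholder retailer/import margin
    gbp = usd_msrp / usd_per_gbp * (1 + uk_vat) * (1 + margin)
    print(f"~GBP {gbp:.0f}")  # -> ~GBP 255, close to the ~260 quid guess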
Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13727
Credit: 208,696,464
RAC: 304
Australia
Message 1245820 - Posted: 14 Jun 2012, 11:18:17 UTC - in response to Message 1245801.  

At the right price I might just replace mine if I can do it with my present PSU.

The new series of GPUs use considerably less power than the previous series.
From memory, at full load the GTX680 uses 100W less than the GTX580.

Ah, here we go,
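
For a machine crunching 24/7, that 100W difference is easy to put a number on. A minimal sketch; the tariff is an assumption, so substitute your own:

    # Annual cost of an extra 100 W on a machine crunching 24/7.
    watts_saved    = 100
    hours_per_year = 24 * 365
    kwh    = watts_saved / 1000 * hours_per_year  # 876 kWh per year
    tariff = 0.15                                 # assumed $/kWh
    print(f"{kwh:.0f} kWh/year, ~${kwh * tariff:.0f}/year")  # -> 876 kWh, ~$131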

Grant
Darwin NT
tbret
Volunteer tester
Joined: 28 May 99
Posts: 3380
Credit: 296,162,071
RAC: 40
United States
Message 1246074 - Posted: 14 Jun 2012, 19:57:02 UTC - in response to Message 1245814.  



Edit: If you are talking about the PC with the GT 240 in it, that card is 70W. So you might want to have a look at the GT 640, which is already out.



I'm fooling around with a GT640 now.

My advice is to hold off a little while and let me get some good numbers. I've got numbers, but I have little confidence in them.

Don't quote me, but it initially looked like a single, unexceptional WU run under the Lunatics' app was completing in the 30-minute range, give or take 10%.

The GT640 is *not* going to blow anyone away, even after some setup optimization.

I'm not saying it's a bad card. I'm saying it isn't a miracle.

First impression is that it will replace the GT240 and might be as much as 25% better at crunching, but that "25%" number is subject to broad revision.

I've been changing some things on it and I've pushed it too hard. (I have not tried overclocking it; that's not the setting I'm fooling with.) It's been producing errors. I've just dropped back from 2 WUs at a time to 1 WU at a time. If I continue to get errors, I'll back off something else.

If you go looking at my machines, just understand that whatever you see there for the GT640 is subject to a lot of change.
W-K 666
Volunteer tester
Joined: 18 May 99
Posts: 19048
Credit: 40,757,560
RAC: 67
United Kingdom
Message 1246304 - Posted: 15 Jun 2012, 5:07:20 UTC - in response to Message 1245814.  
Last modified: 15 Jun 2012, 5:15:56 UTC

Do we know when the 650 and 660 GPUs will be released?


Nobody's got a clue! :) They don't even know if it's going to be a 660 AND a 660 Ti, or just one of the two. The Acer leak about the 25th could unfortunately be an OEM part. Hope I'm wrong... A few others are claiming an August timeframe.

One thing everybody seems to agree on is the $300 price tag. So maybe ~260 quid?

Edit: If you are talking about the PC with the GT 240 in it, that card is 70W. So you might want to have a look at the GT 640, which is already out.


Yes and no. The primary upgrade is for the E6600, which has an old 7950 card in it. That card is rated at 82W, but the computer has a 650W Corsair PSU, so it can take a more powerful card; the owner is just cash-limited, i.e. he cannot afford what his brother spent on a 670.

The friend's computer has two 8800 GTXs in SLI mode, but that was funded by his parents when he was at university; he now has to fund his own upgrade, which he says is £250 max.

My computer with the GT 240 can supply more power. Since I retired it has gone from four HDDs down to an SSD plus a 2.5" HDD. I've also taken out the DVD drive and the card reader, and now use an external USB Blu-ray drive and microSD cards with a USB adapter if needed. So there has to be at least another 100W available, and a card at ~150W might be on the cards.
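
That estimate is really just a power budget: PSU capacity, derated a little, minus what the rest of the box draws. A minimal sketch; every figure here is an illustrative assumption, not a measurement:

    # Power-budget sketch: how big a GPU the remaining PSU headroom allows.
    psu_rating = 450   # hypothetical PSU, watts
    derate     = 0.80  # don't plan to run a PSU at 100% load
    base_load  = {     # assumed draws for the rest of the box, watts
        "cpu": 95, "board_and_ram": 40, "ssd": 5, "hdd_2.5in": 5, "fans_usb": 15,
    }
    headroom = psu_rating * derate - sum(base_load.values())
    print(f"GPU budget: ~{headroom:.0f} W")  # -> ~200 W with these numbers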
BetelgeuseFive
Volunteer tester
Joined: 6 Jul 99
Posts: 158
Credit: 17,117,787
RAC: 19
Netherlands
Message 1246475 - Posted: 15 Jun 2012, 16:54:05 UTC - in response to Message 1246074.  



I'm fooling around with a GT640 now. [...] Don't quote me, but it initially looked like a single, unexceptional WU run under the Lunatics' app was completing in the 30-minute range, give or take 10%. [...]


Please post some more details when more tasks have finished. 30 minutes is a bit disappointing. My GT240 (GDDR5) does shorties in 5 minutes and normal units in about 20 minutes. I was considering the GT640 as an upgrade, but I guess I'll have to wait for the GTX650.


rob smith
Volunteer moderator
Volunteer tester
Joined: 7 Mar 03
Posts: 22189
Credit: 416,307,556
RAC: 380
United Kingdom
Message 1246558 - Posted: 15 Jun 2012, 20:19:58 UTC - in response to Message 1246475.  

The figures you quote are for ONE WU at a time - my '460 achieves very similar figures, but does two at a time and uses less power than the '250 it replaced.
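
The general point: RAC tracks throughput, which is tasks running concurrently divided by wall time per task. A minimal sketch using the illustrative 20-minute figure quoted above:

    # Throughput = tasks running concurrently / wall time per task.
    def tasks_per_hour(concurrent, minutes_per_task):
        return concurrent * 60 / minutes_per_task

    print(tasks_per_hour(1, 20))  # one 20-minute task at a time -> 3.0/hour
    print(tasks_per_hour(2, 20))  # two at a time at similar per-task times -> 6.0/hour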
Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
tbret
Volunteer tester
Joined: 28 May 99
Posts: 3380
Credit: 296,162,071
RAC: 40
United States
Message 1246850 - Posted: 16 Jun 2012, 8:27:07 UTC - in response to Message 1246475.  


Please post some more details when more tasks have finished. 30 minutes is a bit disappointing. My GT240 (GDDR5) does shorties in 5 minutes and normal units in about 20 minutes. I was considering the GT640 as an upgrade, but I guess I'll have to wait for the GTX650.



You've heard of being "touched by an angel." I haven't been that lucky, but I'm being "guided by a guru" which is almost as good.

What I am posting, below, is NOT some sort of be-all, end-all comparison or reliable report of anything at all. Do not take this as written in stone. Do not make a decision based on what you see here. Do not believe what you see with your own eyes.

I have pushed the GT640 pretty hard with a not-for-prime-time release of a Lunatics' application **and** then messed with its settings (as instructed) until I pushed it over the edge.

AND these times include the CPU times, of course; the CPU is an AMD FX-8120 with 667MHz DDR3 RAM. So someone running DDR3 RAM at 800MHz and a 4GHz Intel processor would trim these times a little.

AND I may be able to find settings for the card that will let me apply a little overclocking, OR I might be able to underclock the card slightly and push the envelope on the process priority.

So, realize that what you are looking at is general information; a first look; a very broad accounting.

These numbers appear in a group of tasks my 560Ti is doing in 430-450 seconds, roughly.

EVGA DDR3 GT640, Lunatics preview application:


WU true angle range is : 0.364782 - 1,959.61sec
WU true angle range is : 0.443223 - 1,436.04sec
WU true angle range is : 0.364666 - 1,960.60sec
WU true angle range is : 0.364666 - 1,960.17sec
WU true angle range is : 0.364709 - 1,963.62sec
WU true angle range is : 0.430923 - 1,507.10sec
WU true angle range is : 0.430923 - 1,508.32sec
WU true angle range is : 0.444777 - 1,429.71sec
WU true angle range is : 0.444496 - 1,432.61sec
WU true angle range is : 0.414738 - 1,522.51sec
WU true angle range is : 0.414738 - 1,522.82sec



For comparison: ASUS GT240, Lunatics' current release, on an even slower Athlon II 250:

WU true angle range is : 0.410321 - 2,011.59
WU true angle range is : 0.414895 - 1,926.31
WU true angle range is : 0.423764 - 1,886.21
WU true angle range is : 0.414856 - 1,934.74
WU true angle range is : 0.414856 - 1,942.31


So... the closest I can come to a meaningful comparison is that the GT640 is in the 20% faster range. How much of that is the card? I don't know. How much of that is the application? I don't know. How much of that is the CPU or bus speed or something else? I don't know that, either.
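
For anyone who wants to redo this comparison as more results come in, the averaging is mechanical. A minimal sketch that parses lines like the ones above and compares mean runtimes within a narrow angle-range window (runtime varies strongly with AR, and the two hosts also have different CPUs, so this is not a pure card-vs-card number):

    import re

    # Parse "WU true angle range is : <AR> - <seconds>" lines and average
    # runtimes, keeping only a comparable angle-range (AR) window.
    PAT = re.compile(r":\s*([\d.]+)\s*-\s*([\d,]+(?:\.\d+)?)")

    def mean_runtime(lines, ar_lo=0.40, ar_hi=0.45):
        times = [float(m.group(2).replace(",", ""))
                 for m in map(PAT.search, lines)
                 if m and ar_lo <= float(m.group(1)) <= ar_hi]
        return sum(times) / len(times)

    gt640 = ["WU true angle range is : 0.443223 - 1,436.04sec",
             "WU true angle range is : 0.430923 - 1,507.10sec"]  # etc.
    gt240 = ["WU true angle range is : 0.414895 - 1,926.31",
             "WU true angle range is : 0.423764 - 1,886.21"]     # etc.
    print(f"GT640 faster by ~{mean_runtime(gt240) / mean_runtime(gt640) - 1:.0%}")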

What I think I DO know is that the GT640 will replace a GT240 "pretty evenly" and that's not too shabby when you're talking about a card that will draw power from the PCIe slot. BUT, if money is the issue and you've already got enough power supply, for $10-20 more a GTX550Ti will beat its RAC with no problem.

Yes, the 550Ti uses more power and needs more fan to stay cool. So, if it's a power issue, get a GT640. If it's a money issue, get a GTX550Ti. If it's a RAC issue, get four GTX690s.

Let me put it another way... this is the first card I've known about since the GT240 was new that would keep up with a GT240 while only drawing power from the PCIe slot.

That makes it, in my opinion, a great choice as a video card to go into things like office computers where you don't want a great whining fan, you don't want to produce all that heat, and you don't have a whole lot of excess power supply to play around with.

It produces a really good image (as far as I can tell), it's quiet, it's cool running, and it's available for about $110. If you're getting a 5,000 RAC now on a GT240, you might get a 6,250 RAC with a GT640.

I'm still "playing" and adjusting to the card. Things could still go either way. I might have to cripple its hardware settings with this Lunatics build to make it stable, or I might be able to squeeze another 20% out of it goosing the voltage and overclocking slightly, or by doing some combination of those things and changing process priority settings, or... who knows what else.

But I don't think we're going to turn the GT640 into a GTX550Ti no matter what we do.

And that's fine. This card has its place and can do useful SETI work.
BetelgeuseFive
Volunteer tester
Joined: 6 Jul 99
Posts: 158
Credit: 17,117,787
RAC: 19
Netherlands
Message 1246865 - Posted: 16 Jun 2012, 8:59:57 UTC - in response to Message 1246850.  



You've heard of being "touched by an angel." [...] So... the closest I can come to a meaningful comparison is that the GT640 is in the 20% faster range. How much of that is the card? I don't know. [...]


Thanks for the extra info. I'm fully aware that I should not base my conclusions on the results from a single machine. I have one other question though: is your GT240 a DDR3 version or a GDDR5 version? I checked some of your results, but they only contain the GPU clock and not the memory clock. I have noticed in the past that this makes quite a big difference. I get the impression that none of the GT x40 models has been able to perform as well as the GT240 GDDR5. Shame the GT640 is (currently?) available only with DDR3 memory.

tbret
Volunteer tester
Joined: 28 May 99
Posts: 3380
Credit: 296,162,071
RAC: 40
United States
Message 1246886 - Posted: 16 Jun 2012, 11:26:44 UTC - in response to Message 1246865.  


Thanks for the extra info. I'm fully aware that I should not base my conclusions on the results from a single machine. I have one other question though: is your GT240 a DDR3 version or a GDDR5 version? I checked some of your results, but they only contain the GPU clock and not the memory clock. I have noticed in the past that this makes quite a big difference. I get the impression that none of the GT x40 models has been able to perform as well as the GT240 GDDR5. Shame the GT640 is (currently?) available only with DDR3 memory.


You're welcome. I'm glad it was useful.

My GT240 is definitely a DDR3 version.

You aren't the first person to say that the DDR3 vs DDR5 issue makes a big difference.

Okay, like how big?

I just wonder: in this day and age of DDR5 everywhere, if the use of DDR3 was going to cripple the performance (and therefore the competitive advantage) of their card, why would anyone use it?

My guess is that the difference DDR5 makes over DDR3 becomes more evident as the whole device gets faster and therefore more RAM-speed restricted. Usually it takes Gigabyte about 48 hours to bring out a newer, better, faster, Super Over-Clock, monster-goosed hot-plate of a component, but I haven't seen one for the GT640... although I read a rumor.

What do you suppose DDR5 would do for it? 20% boost? So a 20% boost to a 20% advantage is a 4% addition to a real advantage. How much would that "additional" 20% cost?
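
One way to make that arithmetic concrete: a memory-clock boost only speeds up the memory-bandwidth-bound share of the runtime, Amdahl's-law style. A minimal sketch; the 25% memory-bound fraction is purely an assumed figure for illustration:

    # Amdahl-style estimate: a 20% memory speedup only helps the
    # memory-bandwidth-bound fraction of the runtime.
    def overall_speedup(mem_fraction, mem_speedup):
        return 1 / ((1 - mem_fraction) + mem_fraction / mem_speedup)

    print(f"{overall_speedup(0.25, 1.20) - 1:.1%}")  # -> 4.3% if 25% is memory-bound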

I'm not arguing. You make a good and valid point. It's one I did take into consideration when I ordered my GT640. I looked for a DDR5 model and it apparently doesn't exist.

I'm thinkin' if it got any closer to the GTX550Ti's price (only another $10-15), it had better perform almost twice as well as it does. I just don't believe the addition of DDR5 and boosting the clock 20% would result in GTX550Ti performance, but it likely would result in the GT640s being less price competitive.

Kind of like the ATI HD 6450 I got with DDR3. There's a DDR5 version that's better, but it's within pocket change of the price of a DDR5 6670. So why would I buy that souped-up 6450?

The 6670 with DDR3 (if such a thing exists) would probably still whip the 6450 with DDR5.
W-K 666
Volunteer tester
Joined: 18 May 99
Posts: 19048
Credit: 40,757,560
RAC: 67
United Kingdom
Message 1246894 - Posted: 16 Jun 2012, 12:03:03 UTC - in response to Message 1246886.  

Okay, like how big?


About 20% in most cases, but the DDR5 version is limited to 512 MB and it cannot be overclocked. In fact it may be underclocked.

The reason is that it is limited to 75W of power.

Mine is also a DDR3, but I don't care, as it was a leftover component after I upgraded a computer for someone who doesn't know how easy it is. So it cost me about 30 minutes' work.
BetelgeuseFive
Volunteer tester
Joined: 6 Jul 99
Posts: 158
Credit: 17,117,787
RAC: 19
Netherlands
Message 1246952 - Posted: 16 Jun 2012, 15:04:15 UTC - in response to Message 1246894.  

Okay, like how big?


About 20% in most cases, but the DDR5 version is limited to 512 MB and it cannot be overclocked. In fact it may be underclocked.

The reason is that it is limited to 75W of power.

Mine is also a DDR3, but I don't care, as it was a leftover component after I upgraded a computer for someone who doesn't know how easy it is. So it cost me about 30 minutes' work.


The GT240 GDDR5 can be overclocked. I'm running mine at a 594 MHz core clock and a 1446.4 MHz shader clock. The memory was running at 1802 MHz until I upgraded to the 301.42 driver; with that version I have problems with EVGA Precision at startup (for some weird reason it downclocks the memory until I restart BOINC).
I think it is fair to say that it is 20 to 25 percent faster than the DDR3 version. Check my results (only one computer, all GT240 results) if you want to compare them.
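
Against the GT240 GDDR5 reference clocks, assumed here to be 550/1340/1700 MHz core/shader/memory (check your own card's stock values), those figures work out to roughly a 6-8% overclock:

    # Overclock percentages versus (assumed) GT240 GDDR5 reference clocks.
    reference = {"core": 550, "shader": 1340, "memory": 1700}    # assumed stock MHz
    running   = {"core": 594, "shader": 1446.4, "memory": 1802}  # clocks quoted above

    for domain in reference:
        print(f"{domain}: +{running[domain] / reference[domain] - 1:.1%}")
    # -> core: +8.0%, shader: +7.9%, memory: +6.0%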