Just wondering if there's any good reason to upgrade anything?

Message boards : Number crunching : Just wondering if there's any good reason to upgrade anything?

Al
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1230385 - Posted: 11 May 2012, 3:50:33 UTC
Last modified: 11 May 2012, 3:56:59 UTC

Well, my cache is finally cleared out. What is the procedure for helping with the beta testing? Is it a big commitment, or just install & run and report back with any errors/issues found? I presume I'll have to do another upgrade, this time to 7.x; would I need to update the video drivers as well, or can I stay with these for the time being?

ID: 1230385

Claggy
Volunteer tester
Joined: 5 Jul 99
Posts: 4654
Credit: 47,537,079
RAC: 4
United Kingdom
Message 1230455 - Posted: 11 May 2012, 8:24:09 UTC - in response to Message 1230385.  

Well, my cache is finally cleared out. What is the procedure for helping with the beta testing? Is it a big commitment, or just install & run and report back with any errors/issues found? I presume I'll have to do another upgrade, this time to 7.x; would I need to update the video drivers as well, or can I stay with these for the time being?

Just attach to SETI Beta; you can keep the same drivers and the same BOINC version. Have a read of the "Please read first" post for info about Beta testing.

Claggy
ID: 1230455

Al
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1233066 - Posted: 18 May 2012, 22:07:57 UTC

Well, since my last post I have decided to up the ante a little. I ordered a couple of GTX 560 Ti 448 Core Ultra cards, and got the first one installed on the first day of the outage. Of course the drivers I had loaded didn't recognize the card, so after the existing ones failed, and after spending way too much time trying to use the drivers that came on the CD packaged with the new card, I finally ended up nuking the drivers I had just gotten working late last month and upgraded to the latest beta version. I finally got all the cards seen properly by the OS (as far as I can tell), but of course have one more issue: BOINC won't use the 3 existing GTX 260 cards. I don't know exactly what may have caused this, as I didn't change anything in BOINC, just the vid driver. My startup screen reads:

5/18/2012 4:45:17 PM Starting BOINC client version 6.10.60 for windows_x86_64
5/18/2012 4:45:17 PM log flags: file_xfer, sched_ops, task
5/18/2012 4:45:17 PM Libraries: libcurl/7.19.7 OpenSSL/0.9.8l zlib/1.2.3
5/18/2012 4:45:17 PM Data directory: C:\Documents and Settings\All Users\Application Data\BOINC
5/18/2012 4:45:17 PM Running under account Administrator
5/18/2012 4:45:17 PM Processor: 8 GenuineIntel Intel(R) Core(TM) i7 CPU 950 @ 3.07GHz [Family 6 Model 26 Stepping 5]
5/18/2012 4:45:17 PM Processor: 256.00 KB cache
5/18/2012 4:45:17 PM Processor features: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss htt tm pni ssse3 cx16 sse4_1 sse4_2 syscall nx lm vmx tm2 popcnt pbe
5/18/2012 4:45:17 PM OS: Microsoft Windows XP: Professional x64 Edition, Service Pack 2, (05.02.3790.00)
5/18/2012 4:45:17 PM Memory: 5.99 GB physical, 7.61 GB virtual
5/18/2012 4:45:17 PM Disk: 232.88 GB total, 209.86 GB free
5/18/2012 4:45:17 PM Local time is UTC -5 hours
5/18/2012 4:45:17 PM NVIDIA GPU 0: GeForce GTX 560 Ti (driver version 30124, CUDA version 4020, compute capability 2.0, 1280MB, 1679 GFLOPS peak)
5/18/2012 4:45:17 PM NVIDIA GPU 1 (not used): GeForce GTX 260 (driver version 30124, CUDA version 4020, compute capability 1.3, 896MB, 630 GFLOPS peak)
5/18/2012 4:45:17 PM NVIDIA GPU 2 (not used): GeForce GTX 260 (driver version 30124, CUDA version 4020, compute capability 1.3, 896MB, 630 GFLOPS peak)
5/18/2012 4:45:17 PM NVIDIA GPU 3 (not used): GeForce GTX 260 (driver version 30124, CUDA version 4020, compute capability 1.3, 896MB, 630 GFLOPS peak)
5/18/2012 4:45:17 PM SETI@home Found app_info.xml; using anonymous platform
5/18/2012 4:45:25 PM SETI@home URL http://setiathome.berkeley.edu/; Computer ID 5873188; resource share 100
5/18/2012 4:45:25 PM SETI@home General prefs: from SETI@home (last modified 26-May-2007 00:12:51)
5/18/2012 4:45:25 PM SETI@home Computer location: home
5/18/2012 4:45:25 PM SETI@home General prefs: no separate prefs for home; using your defaults
5/18/2012 4:45:25 PM Reading preferences override file
5/18/2012 4:45:25 PM Preferences:
5/18/2012 4:45:25 PM max memory usage when active: 3067.44MB
5/18/2012 4:45:25 PM max memory usage when idle: 6073.53MB
5/18/2012 4:45:25 PM max disk usage: 100.00GB
5/18/2012 4:45:25 PM (to change preferences, visit the web site of an attached project, or select Preferences in the Manager)
5/18/2012 4:45:25 PM Not using a proxy

If you guys have any idea how to get BOINC to use those other cards, I'd appreciate it. Hopefully it's something fairly easy to correct, maybe a setting somewhere? Thanks!


ID: 1233066

Wembley
Volunteer tester
Joined: 16 Sep 09
Posts: 429
Credit: 1,844,293
RAC: 0
United States
Message 1233075 - Posted: 18 May 2012, 22:25:16 UTC - in response to Message 1233066.  

You need to add an option to your cc_config.xml file. Here is the wiki page: http://boinc.berkeley.edu/wiki/Client_configuration
Specifically, you want the <use_all_gpus> option.
ID: 1233075

Al
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1233096 - Posted: 18 May 2012, 22:46:52 UTC - in response to Message 1233075.  

Thanks for the heads up. I looked around but couldn't find my BOINC data dir, so I thought I'd be clever and just do a quick search for the cc_config.xml file on my C:\ drive. Surprise, it didn't find it. I've been running this rig as is for a while now, and either it somehow got wiped out, or it never had one. Any idea as to what to do? Do you need any more info about this rig to help you determine the next step?

ID: 1233096

Gundolf Jahn
Joined: 19 Sep 00
Posts: 3184
Credit: 446,358
RAC: 0
Germany
Message 1233101 - Posted: 18 May 2012, 22:53:57 UTC - in response to Message 1233096.  
Last modified: 18 May 2012, 22:54:45 UTC

Surprise, it didn't find it.

No surprise if you never created it. ;-) It isn't created by default.

Search for client_state.xml instead to find the data directory.

Gruß,
Gundolf
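For anyone else hunting for the data directory, that search can be sketched as a short script (a minimal sketch; the XP-era path in the comment is just an example, so adjust the root to your own drive):

```python
import os

def find_boinc_data_dir(root):
    """Walk the tree under `root` and return the first directory
    that contains client_state.xml, or None if nothing is found."""
    for dirpath, _dirnames, filenames in os.walk(root):
        if "client_state.xml" in filenames:
            return dirpath
    return None

# Example (the XP default location is an assumption; use your own root):
# find_boinc_data_dir(r"C:\Documents and Settings\All Users\Application Data")
```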
ID: 1233101

Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13727
Credit: 208,696,464
RAC: 304
Australia
Message 1233104 - Posted: 18 May 2012, 22:55:36 UTC - in response to Message 1233096.  
Last modified: 18 May 2012, 22:56:22 UTC

Surprise, it didn't find it.

No surprise, you won't have one unless you make it.

For me it's located here:
C:\ProgramData\boinc\

This is what I've got in mine:

<cc_config>
    <options>
        <save_stats_days>3700</save_stats_days>
        <max_file_xfers>4</max_file_xfers>
        <max_file_xfers_per_project>4</max_file_xfers_per_project>
    </options>
</cc_config>
Grant
Darwin NT
ID: 1233104

Al
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1233112 - Posted: 18 May 2012, 23:07:05 UTC - in response to Message 1233104.  
Last modified: 18 May 2012, 23:07:48 UTC

Ahh, found my data dir by searching for a partial task name. Mine is in my C:\Documents and Settings\All Users\Application Data\BOINC\projects folder. I remember creating one of these a few years ago, and it had a bit more stuff in it than yours did. How does someone who isn't well versed in the ins and outs of cc_config file optimization get the scoop on creating one that will work with my current setup? I tried reading the http://boinc.berkeley.edu/wiki/Client_configuration page, and it was fairly Greek to me. If I could get an example file that has most of the options I would need, that would be very helpful.

ID: 1233112

Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13727
Credit: 208,696,464
RAC: 304
Australia
Message 1233114 - Posted: 18 May 2012, 23:12:13 UTC - in response to Message 1233112.  


From the link:
<use_all_gpus>0|1</use_all_gpus> If 1, use all GPUs (otherwise only the most capable ones are used)


So

<cc_config>
    <options>
        <use_all_gpus>1</use_all_gpus>
    </options>
</cc_config>

should do it.

Open a new file with Notepad, put in the bits above & save it as
cc_config.xml
in your BOINC data directory.
Exit & restart BOINC.
Grant
Darwin NT
ID: 1233114

Al
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1233117 - Posted: 18 May 2012, 23:19:01 UTC - in response to Message 1233114.  
Last modified: 18 May 2012, 23:25:35 UTC

Thanks, created & saved it, then restarted BOINC. Still says:

5/18/2012 6:14:31 PM Starting BOINC client version 6.10.60 for windows_x86_64
5/18/2012 6:14:31 PM log flags: file_xfer, sched_ops, task
5/18/2012 6:14:31 PM Libraries: libcurl/7.19.7 OpenSSL/0.9.8l zlib/1.2.3
5/18/2012 6:14:31 PM Data directory: C:\Documents and Settings\All Users\Application Data\BOINC
5/18/2012 6:14:31 PM Running under account Administrator
5/18/2012 6:14:31 PM Processor: 8 GenuineIntel Intel(R) Core(TM) i7 CPU 950 @ 3.07GHz [Family 6 Model 26 Stepping 5]
5/18/2012 6:14:31 PM Processor: 256.00 KB cache
5/18/2012 6:14:31 PM Processor features: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss htt tm pni ssse3 cx16 sse4_1 sse4_2 syscall nx lm vmx tm2 popcnt pbe
5/18/2012 6:14:31 PM OS: Microsoft Windows XP: Professional x64 Edition, Service Pack 2, (05.02.3790.00)
5/18/2012 6:14:31 PM Memory: 5.99 GB physical, 7.61 GB virtual
5/18/2012 6:14:31 PM Disk: 232.88 GB total, 209.86 GB free
5/18/2012 6:14:31 PM Local time is UTC -5 hours
5/18/2012 6:14:31 PM NVIDIA GPU 0: GeForce GTX 560 Ti (driver version 30124, CUDA version 4020, compute capability 2.0, 1280MB, 1692 GFLOPS peak)
5/18/2012 6:14:31 PM NVIDIA GPU 1 (not used): GeForce GTX 260 (driver version 30124, CUDA version 4020, compute capability 1.3, 896MB, 630 GFLOPS peak)
5/18/2012 6:14:31 PM NVIDIA GPU 2 (not used): GeForce GTX 260 (driver version 30124, CUDA version 4020, compute capability 1.3, 896MB, 630 GFLOPS peak)
5/18/2012 6:14:31 PM NVIDIA GPU 3 (not used): GeForce GTX 260 (driver version 30124, CUDA version 4020, compute capability 1.3, 896MB, 630 GFLOPS peak)
5/18/2012 6:14:31 PM SETI@home Found app_info.xml; using anonymous platform
5/18/2012 6:14:38 PM SETI@home URL http://setiathome.berkeley.edu/; Computer ID 5873188; resource share 100
5/18/2012 6:14:38 PM SETI@home General prefs: from SETI@home (last modified 26-May-2007 00:12:51)
5/18/2012 6:14:38 PM SETI@home Computer location: home
5/18/2012 6:14:38 PM SETI@home General prefs: no separate prefs for home; using your defaults
5/18/2012 6:14:38 PM Reading preferences override file
5/18/2012 6:14:38 PM Preferences:
5/18/2012 6:14:38 PM max memory usage when active: 3067.44MB
5/18/2012 6:14:38 PM max memory usage when idle: 6073.53MB
5/18/2012 6:14:38 PM max disk usage: 100.00GB
5/18/2012 6:14:38 PM (to change preferences, visit the web site of an attached project, or select Preferences in the Manager)
5/18/2012 6:14:39 PM Not using a proxy


I put it in the C:\Documents and Settings\All Users\Application Data\BOINC\projects\setiathome.berkeley.edu folder, and just checked to make sure it is still there. Are there any formatting requirements I should be aware of? Mine is all against the left margin; should descending lines be indented, or doesn't that matter? Also curious why I need this file now with this vid card added and driver updated, when it was working fine with all 4 GTX 260s and the optimized app? Is it the use of multiple types of cards, the new driver requiring it, or something else? I would think if it worked before, it should work fine now, but then again, what do I know? ;-) That's why I ask the experts here!

ID: 1233117

Richard Haselgrove
Volunteer tester
Joined: 4 Jul 99
Posts: 14650
Credit: 200,643,578
RAC: 874
United Kingdom
Message 1233121 - Posted: 18 May 2012, 23:22:38 UTC - in response to Message 1233117.  

I put it in the C:\Documents and Settings\All Users\Application Data\BOINC\projects\setiathome.berkeley.edu folder, and just checked to make sure it is still there. Are there any formatting requirements I should be aware of? Mine is all against the left margin; should descending lines be indented, or doesn't that matter?

It's a BOINC file, not a SETI file - move it two folders further towards the root.

White space doesn't matter, provided it's a plain text file.
ID: 1233121

Wembley
Volunteer tester
Joined: 16 Sep 09
Posts: 429
Credit: 1,844,293
RAC: 0
United States
Message 1233124 - Posted: 18 May 2012, 23:26:47 UTC

This line in your log shows you where your data directory is:
5/18/2012 6:14:31 PM Data directory: C:\Documents and Settings\All Users\Application Data\BOINC

Put the cc_config.xml file in that directory.
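Putting it together, the finished file in that directory might look like this (only <use_all_gpus> is needed to wake up the 260s; the other options are the optional extras Grant posted earlier):

```xml
<cc_config>
    <options>
        <use_all_gpus>1</use_all_gpus>
        <save_stats_days>3700</save_stats_days>
        <max_file_xfers>4</max_file_xfers>
        <max_file_xfers_per_project>4</max_file_xfers_per_project>
    </options>
</cc_config>
```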

ID: 1233124

Wembley
Volunteer tester
Joined: 16 Sep 09
Posts: 429
Credit: 1,844,293
RAC: 0
United States
Message 1233126 - Posted: 18 May 2012, 23:28:47 UTC - in response to Message 1233117.  

Also curious why I need this file now with this vid card added and driver updated, when it was working fine with all 4 GTX 260s and the optimized app? Is it the use of multiple types of cards, the new driver requiring it, or something else? I would think if it worked before, it should work fine now, but then again, what do I know? ;-) That's why I ask the experts here!

Your new 560 Ti is a lot faster than your old 260s. By default, BOINC will only use the most capable cards in a mixed system.
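That default can be pictured with a tiny sketch (an illustration of the behaviour described, not BOINC's actual selection code, which also weighs compute capability and memory; the card names and GFLOPS figures are taken from the log above):

```python
def usable_gpus(gpus, use_all_gpus=False):
    """Pick the GPUs to use: by default only the most capable model,
    or every detected card when <use_all_gpus>1</use_all_gpus> is set."""
    if use_all_gpus:
        return list(gpus)
    best = max(g["gflops"] for g in gpus)
    return [g for g in gpus if g["gflops"] == best]

cards = [
    {"name": "GTX 560 Ti", "gflops": 1692},
    {"name": "GTX 260", "gflops": 630},
    {"name": "GTX 260", "gflops": 630},
    {"name": "GTX 260", "gflops": 630},
]
# default: only the 560 Ti is used; use_all_gpus=True keeps all four
```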
ID: 1233126

Al
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1233129 - Posted: 18 May 2012, 23:32:29 UTC

Working now, thanks guys. Still weird that it was needed at all, when it was working with all 4 before the latest driver update. Oh well, as long as it's working, I'm all good!

5/18/2012 6:26:25 PM Starting BOINC client version 6.10.60 for windows_x86_64
5/18/2012 6:26:25 PM Config: use all coprocessors
5/18/2012 6:26:25 PM log flags: file_xfer, sched_ops, task
5/18/2012 6:26:25 PM Libraries: libcurl/7.19.7 OpenSSL/0.9.8l zlib/1.2.3
5/18/2012 6:26:25 PM Data directory: C:\Documents and Settings\All Users\Application Data\BOINC
5/18/2012 6:26:25 PM Running under account Administrator
5/18/2012 6:26:25 PM Processor: 8 GenuineIntel Intel(R) Core(TM) i7 CPU 950 @ 3.07GHz [Family 6 Model 26 Stepping 5]
5/18/2012 6:26:25 PM Processor: 256.00 KB cache
5/18/2012 6:26:25 PM Processor features: fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush dts acpi mmx fxsr sse sse2 ss htt tm pni ssse3 cx16 sse4_1 sse4_2 syscall nx lm vmx tm2 popcnt pbe
5/18/2012 6:26:25 PM OS: Microsoft Windows XP: Professional x64 Edition, Service Pack 2, (05.02.3790.00)
5/18/2012 6:26:25 PM Memory: 5.99 GB physical, 7.61 GB virtual
5/18/2012 6:26:25 PM Disk: 232.88 GB total, 209.94 GB free
5/18/2012 6:26:25 PM Local time is UTC -5 hours
5/18/2012 6:26:25 PM NVIDIA GPU 0: GeForce GTX 560 Ti (driver version 30124, CUDA version 4020, compute capability 2.0, 1280MB, 1692 GFLOPS peak)
5/18/2012 6:26:25 PM NVIDIA GPU 1: GeForce GTX 260 (driver version 30124, CUDA version 4020, compute capability 1.3, 896MB, 630 GFLOPS peak)
5/18/2012 6:26:25 PM NVIDIA GPU 2: GeForce GTX 260 (driver version 30124, CUDA version 4020, compute capability 1.3, 896MB, 630 GFLOPS peak)
5/18/2012 6:26:25 PM NVIDIA GPU 3: GeForce GTX 260 (driver version 30124, CUDA version 4020, compute capability 1.3, 896MB, 630 GFLOPS peak)
5/18/2012 6:26:25 PM SETI@home Found app_info.xml; using anonymous platform
5/18/2012 6:26:31 PM SETI@home URL http://setiathome.berkeley.edu/; Computer ID 5873188; resource share 100
5/18/2012 6:26:31 PM SETI@home General prefs: from SETI@home (last modified 26-May-2007 00:12:51)
5/18/2012 6:26:31 PM SETI@home Computer location: home
5/18/2012 6:26:31 PM SETI@home General prefs: no separate prefs for home; using your defaults
5/18/2012 6:26:31 PM Reading preferences override file
5/18/2012 6:26:31 PM Preferences:
5/18/2012 6:26:31 PM max memory usage when active: 3067.44MB
5/18/2012 6:26:31 PM max memory usage when idle: 6073.53MB
5/18/2012 6:26:31 PM max disk usage: 100.00GB
5/18/2012 6:26:31 PM (to change preferences, visit the web site of an attached project, or select Preferences in the Manager)
5/18/2012 6:26:32 PM Not using a proxy

It's kind of funny: I took a look at my EVGA Precision software, and it shows that even after OCing my 560 to what I have read is a fairly high OC for this card, the temp on that card is around 47-49°C right now, but my 260s are all at 66-72°C depending on the slot. It's around 85°F here today, so hotter than usual in this room, but I am quite surprised at those temp differences. Happy, but surprised.

ID: 1233129

Al
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1233994 - Posted: 20 May 2012, 11:38:57 UTC

Time for another update, this time to get some ideas as to what's going on with a couple of strange issues. First of all, it appears to be crunching fine, best as I can tell, with all the new cards I installed this week. The strangest thing is the temp readings I am getting on my new 560 Ti cards. I started them off running at stock speeds, but then slowly bumped them up. I was running them at 930 core and 2033 memory, which from what I read is a very good OC on that card, and the temps are reading 32-33°C, and it's below 60°F outside this morning here. My last 260 is currently running at 57°C and my 285 is running at 60-61°C. Just for grins, I bumped them up to a 940 core clock, and nothing happened to the temps. Something weird is going on with these cards; they seem to be almost at idling temps. These are the first 500-series cards I've used; previously they've all been 200 series or earlier. I'm using EVGA Precision 2.0.3, if that makes any difference. While I've been writing this, I've been trying different settings, even taking them up to a core clock of 960, and this didn't seem to make a difference on temps either. Something is up, but I can't figure it out; they seem to have a mind of their own, at least going by temp readings.

The other strange thing is that on my active computers page http://setiathome.berkeley.edu/hosts_user.php?sort=rpc_time&rev=0&show_all=0&userid=7810325 it shows under GPU that I have [4] NVIDIA GeForce GTX 560 Ti (1279MB) driver: 301.24, when actually I have 2 of those, one GTX 260, and one GTX 285. Not a big deal, but I thought it was strange that it showed the wrong cards for 2 of the 4. Any thoughts on either of those issues?

ID: 1233994

Slavac
Volunteer tester
Joined: 27 Apr 11
Posts: 1932
Credit: 17,952,639
RAC: 0
United States
Message 1233998 - Posted: 20 May 2012, 11:43:07 UTC - in response to Message 1233994.  

That sounds like a downclock to me. Give it a restart and see what the temps do?


Executive Director GPU Users Group Inc. -
brad@gpuug.org
ID: 1233998

Al
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1234035 - Posted: 20 May 2012, 13:01:17 UTC - in response to Message 1233998.  
Last modified: 20 May 2012, 13:20:42 UTC

Hmm, they jumped about 15 degrees. So, what was that, what caused it, and how do I turn it off so it doesn't happen again? Is there a way to disable it? I'd prefer not to have to reboot my machine if possible. Thoughts, and Thanks!

*edit* One interesting thing I noticed was the performance difference between my cards now. It shows:
5/20/2012 8:08:07 AM NVIDIA GPU 0: GeForce GTX 560 Ti (driver version 30124, CUDA version 4020, compute capability 2.0, 1280MB, 1665 GFLOPS peak)
5/20/2012 8:08:07 AM NVIDIA GPU 1: GeForce GTX 560 Ti (driver version 30124, CUDA version 4020, compute capability 2.0, 1280MB, 1665 GFLOPS peak)
5/20/2012 8:08:07 AM NVIDIA GPU 2: GeForce GTX 260 (driver version 30124, CUDA version 4020, compute capability 1.3, 896MB, 630 GFLOPS peak)
5/20/2012 8:08:07 AM NVIDIA GPU 3: GeForce GTX 285 (driver version 30124, CUDA version 4020, compute capability 1.3, 1024MB, 773 GFLOPS peak)

My biggest surprise is the apparently small difference, at least in the GFLOPS rating, between the 260 and the 285 cards. I thought there was a larger difference in making that perceived big jump from the mid-range to the top-of-the-line 200-series single-GPU card, but looking at reviews from when the card came out in early 2009, I guess they weren't that far apart. Just goes to show you how technology changes, usually for the better.

ID: 1234035

Wiggo
Joined: 24 Jan 00
Posts: 34744
Credit: 261,360,520
RAC: 489
Australia
Message 1234041 - Posted: 20 May 2012, 13:16:08 UTC - in response to Message 1234035.  

Hmm, they jumped about 15 degrees. So, what was that, what caused it, and how do I turn it off so it doesn't happen again? Is there a way to disable it? I'd prefer not to have to reboot my machine if possible. Thoughts, and Thanks!

Likely your overclock caused the downclock. ;)

Cheers.
ID: 1234041

Slavac
Volunteer tester
Joined: 27 Apr 11
Posts: 1932
Credit: 17,952,639
RAC: 0
United States
Message 1234042 - Posted: 20 May 2012, 13:17:04 UTC - in response to Message 1234035.  

Hmm, they jumped about 15 degrees. So, what was that, what caused it, and how do I turn it off so it doesn't happen again? Is there a way to disable it? I'd prefer not to have to reboot my machine if possible. Thoughts, and Thanks!


Long story short, you can't turn off downclocking. It was triggered as a result of your overclocking. You'll have to find a happy medium between the OC and avoiding the downclocks.


Executive Director GPU Users Group Inc. -
brad@gpuug.org
ID: 1234042

Al
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1234043 - Posted: 20 May 2012, 13:22:59 UTC - in response to Message 1234042.  

So it is something implemented in the silicon? No firmware hack to disable it? Or I suppose if there was, it would void the warranty? Grr, the games we're forced to play... ;-)

ID: 1234043

©2024 University of California
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.