Suspending computation triggers GTX560 ti "failsafe mode"?
Jopj (FIN) | Joined: 19 Dec 07 | Posts: 4 | Credit: 1,124,569 | RAC: 0
I have just replaced my 8800 GTS with a 560 Ti, as the former sadly reached the end of its life. While running SETI, the 560 has far superior performance compared to the 8800, but it seems to suffer from a glitch with its "failsafe mode", downclocking to 405 MHz.

Normally I'd think the card was unstable and the failsafe was triggering as intended. However, this happens at stock frequencies, with the card passing stress tests (MemtestCL, OCCT GPU test with artifact detection) with flying colours on long runs, working flawlessly with SETI unless I suspend computation at any point, and generally showing no signs of instability at all. Could it be that the sudden stop of computation caused by suspending SETI makes the card (or the driver, whichever is responsible for the failsafe) think the card has failed those calculations? After I reboot the PC, resetting the failsafe, SETI once again works without a hitch until I suspend computing, which leaves the GPU stuck at 405 MHz until the next reboot.

The GPU is not overheating, running at a maximum of 64 °C at full SETI load. Nor is this a case of insufficient power to the card: the 650 W Corsair PSU I'm using is quite adequate and has successfully run far greater loads than it does now. I run SETI very sporadically on this PC because of its power consumption and the way it heats up my room, so getting suspend to work well would be great. Thanks for any pointers on this.
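One way to confirm the downclock without rebooting is to read the current graphics clock and compare it against the 405 MHz failsafe value. A minimal sketch of the checking logic (the 405 MHz constant is from the post; on a recent driver the clock line itself could come from `nvidia-smi --query-gpu=clocks.gr --format=csv,noheader`, which may not exist on 2011-era drivers):

```python
# Sketch: spot the 405 MHz "failsafe" downclock from a reported clock string.
# The parsing is standalone; how you obtain the string depends on your driver.

FAILSAFE_MHZ = 405  # clock the driver falls back to after a detected "failure"

def parse_clock_mhz(line: str) -> int:
    """Parse a line like '405 MHz' into an integer MHz value."""
    return int(line.strip().split()[0])

def is_failsafe(line: str) -> bool:
    """True if the reported graphics clock equals the failsafe clock."""
    return parse_clock_mhz(line) == FAILSAFE_MHZ

# Example with captured clock lines:
print(is_failsafe("405 MHz"))  # True: card is stuck in failsafe
print(is_failsafe("830 MHz"))  # False: running at the stock 3D clock
```

Polling this periodically would show exactly when the downclock happens relative to suspending computation.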
Gundolf Jahn | Joined: 19 Sep 00 | Posts: 3184 | Credit: 446,358 | RAC: 0
Which driver version are you using? There have been reports of downclocking with the newest drivers (270.??) on the Number crunching subforum.

Regards, Gundolf
(Computers aren't everything in life. Just a little joke.)
SETI@home classic workunits: 3,758 | SETI@home classic CPU time: 66,520 hours
Jopj (FIN) | Joined: 19 Dec 07 | Posts: 4 | Credit: 1,124,569 | RAC: 0
Those are the drivers I'm using (270.61), thanks. Someone over there mentioned the Lunatics optimised GPU apps causing some trouble, so I'll uninstall those and see where that leads. Unfortunately, rolling back drivers isn't an option I like, as the previous drivers for the 560 are not unified and do not support the 7500 LE I also have in my PC :D That seems to be the problem I'm having, so I guess I'll just have to wait for new drivers and hope they resolve the issue, or for some other fix.
f_n_t | Joined: 30 Apr 02 | Posts: 6 | Credit: 77,305,292 | RAC: 3
The VDDC on the Gigabyte GeForce GTX 560 Ti (GV-N560OC-1GI) is not steady enough for a 900 MHz core clock, because it is low at 1.000 V. I measured the stable operating frequency limit of SETI@home CUDA at each voltage with MSI Afterburner. Based on the list below, either raise VDDC to 1.025 V (for 900 MHz), or lower the GPU core clock to 850.5 MHz (at 1.000 V) for stable operation.

VDDC | GPU Core Clock (GPU-Z 0.5.3)
1.100 V | 1012.5 MHz
1.087 V | 996.9 MHz
1.075 V | 981.3 MHz
1.062 V | 967.5 MHz
1.050 V | 950.2 MHz
1.037 V | 934.6 MHz
1.025 V | 918.0 MHz
1.012 V | 904.5 MHz
1.000 V | 885.9 MHz
0.987 V | 872.3 MHz
0.975 V | 850.5 MHz
0.962 V | 837.0 MHz
0.950 V | 810.0 MHz

(Apologies if this is hard to read; it was machine-translated from Japanese.)
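The measured table can also be read the other way: for a given VDDC, what is the highest clock that was measured stable? A small sketch using the values from the post (the lookup helpers are illustrative, not part of anyone's tool):

```python
# Measured stable GPU core clock per VDDC for this GTX 560 Ti (values from the post).
STABLE_CLOCKS = [  # (VDDC in volts, max stable core clock in MHz)
    (0.950, 810.0), (0.962, 837.0), (0.975, 850.5), (0.987, 872.3),
    (1.000, 885.9), (1.012, 904.5), (1.025, 918.0), (1.037, 934.6),
    (1.050, 950.2), (1.062, 967.5), (1.075, 981.3), (1.087, 996.9),
    (1.100, 1012.5),
]

def max_stable_clock(vddc: float) -> float:
    """Highest clock measured stable at or below the given voltage."""
    candidates = [clk for v, clk in STABLE_CLOCKS if v <= vddc + 1e-9]
    if not candidates:
        raise ValueError(f"no measurement at or below {vddc} V")
    return max(candidates)

def is_within_limit(vddc: float, clock_mhz: float) -> bool:
    """True if the requested clock is within the measured stability limit."""
    return clock_mhz <= max_stable_clock(vddc)

# Jopj's stock settings, 830 MHz at 0.987 V, sit comfortably inside the limit:
print(max_stable_clock(0.987))        # 872.3
print(is_within_limit(0.987, 830.0))  # True
print(is_within_limit(1.000, 900.0))  # False: 900 MHz needs ~1.025 V here
```

These limits were measured on one particular card, so they are a guideline, not a guarantee for other samples.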
Jopj (FIN) | Joined: 19 Dec 07 | Posts: 4 | Credit: 1,124,569 | RAC: 0
If I understand that table correctly, it is a guideline of voltage to core frequency. Mine is running at 830 MHz with 0.987 V at stock settings, so it should be fine.

The problem seems to be solved for now, but not very satisfactorily. I pulled the 7500 LE out to see if it was causing problems (it has, on occasion) and connected the second display to the 560. This results in the 560 being pegged at its maximum 3D clock at all times, even though this multi-monitor power-saving issue was supposedly fixed a few driver versions ago. Well, the card isn't downclocking now, because it isn't downclocking at all. I did like the cool temperatures and low power consumption of the 51 MHz idle clocks, though. Do I have to trade working power saving against random downclocking? Thanks.
f_n_t | Joined: 30 Apr 02 | Posts: 6 | Credit: 77,305,292 | RAC: 3
To get power saving with a multi-monitor setup, use NVIDIA Inspector 1.95:
http://blog.orbmu2k.de/tools/nvidia-inspector-tool
Multi Display Power Saver:
http://www.3dcenter.org/artikel/nvidia-inspector-bringt-komfortables-multi-display-power-saving/multi-display-power-saver

In NVIDIA Inspector, right-click the "Show Overclocking" button to open Multi Display Power Saver.
Setting that switches clocks P0⇔P12 or P8⇔P12 according to load: http://uploda.jisakupc.info/file/35.png
Setting P0⇔P8 that never drops to P12: http://uploda.jisakupc.info/file/36.png

P0 is the rated clocks, P8 is 405 MHz, and P12 is 51 MHz. In P8 and P12 the memory clock is dropped to 1000 MHz and two thirds of the shader circuits sleep, giving a low-power mode roughly 50 W below the maximum: 75 W / 3 × 2 = 50 W.
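The ~50 W figure at the end follows from the post's own assumptions, which can be made explicit (the 75 W shader-related draw and the two-thirds sleeping fraction are the post's estimates, not measured values for every card):

```python
# Sketch of the power-saving arithmetic from the post above.
SHADER_POWER_W = 75.0       # assumed shader-related draw at full P0 clocks
SLEEPING_FRACTION = 2 / 3   # fraction of shader circuits that sleep in P8/P12

# Rough clock picture of the three performance states involved:
P_STATE_CLOCKS_MHZ = {"P0": "rated (e.g. 830)", "P8": 405, "P12": 51}

saving_w = SHADER_POWER_W * SLEEPING_FRACTION
print(f"Estimated saving in P8/P12: {saving_w:.0f} W")  # 50 W
```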
Jopj (FIN) | Joined: 19 Dec 07 | Posts: 4 | Credit: 1,124,569 | RAC: 0
Thanks, I got NVIDIA Inspector working, but I'm not keen on having more background utilities. I'll see if I can get the 7500 playing nicely so that I can offload my second display onto it. SETI seems happy now, though I don't know whether the problem went away with the Lunatics GPU apps, or is merely masked by the GPU either not entering power saving at all or having it enforced by a third-party utility.
BilBg | Joined: 27 May 07 | Posts: 3720 | Credit: 9,385,827 | RAC: 0
http://blog.orbmu2k.de/tools/nvidia-inspector-tool
Google Translate: http://translate.google.bg/translate?hl=en&sl=auto&tl=en&u=http%3A%2F%2Fblog.orbmu2k.de%2Ftools%2Fnvidia-inspector-tool

The text states that this behaviour is by design (to avoid flicker):

"Saving energy with multiple monitors: since NVIDIA graphics cards use GDDR5 as video memory, there is a limitation on the power-management features. If more than one display with differing resolutions/timings is connected to the graphics card, the driver prevents the automatic downshift into the energy-saving Performance States (P-States). According to NVIDIA, the reason is a hardware limitation in the use of GDDR5 memory, which can cause flicker when changing P-states. This issue is therefore not limited to NVIDIA graphics cards. Unfortunately, the driver does not let the user choose to accept the flicker in exchange for saving power. This fact led to the new 'Multi Display Power Saver' feature."

- ALF - "Find out what you don't do well ..... then don't do it!" :)