Message boards :
Number crunching :
SETI@home v8.22 Windows GPU applications support thread
Author | Message |
---|---|
TouchuvGrey Send message Joined: 30 May 04 Posts: 10 Credit: 52,541,731 RAC: 39 |
Hello Stephen: I reinstalled Afterburner, waited a few minutes, and all is good now. Because the universe is just too big for us to be alone in. |
Javajav Send message Joined: 5 Feb 19 Posts: 4 Credit: 6,474 RAC: 0 |
Sorry i took so long to reply. I managed to fix it myself. Turns out my GPU is so outdated that the new drivers just made it crap out. So I installed a legacy driver and now it seems to be working fine. |
Stephen "Heretic" Send message Joined: 20 Sep 12 Posts: 5557 Credit: 192,787,363 RAC: 628 |
Sorry i took so long to reply. I managed to fix it myself. Turns out my GPU is so outdated that the new drivers just made it crap out. So I installed a legacy driver and now it seems to be working fine. . . Maybe there is a glitch in Boinc/Seti but it only shows one rig, an i7-7700 with no GPU that has been sitting on 26 WUs for several weeks ... Stephen ? ? |
Gammagoat Send message Joined: 19 Jul 07 Posts: 8 Credit: 936,684 RAC: 0 |
Seems I messed up when using Lunatics installer, I selected the Cuda app over the CL. Is there enough of a difference for me to make the change? Currently taking about 17-18 mins to complete one wu at a time on my 1080 Ti. |
Jimbocous Send message Joined: 1 Apr 13 Posts: 1853 Credit: 268,616,081 RAC: 1,349 |
Seems I messed up when using Lunatics installer, I selected the Cuda app over the CL. I think you'll find a difference. The good news is you should be able to just rerun the installer. |
Gammagoat Send message Joined: 19 Jul 07 Posts: 8 Credit: 936,684 RAC: 0 |
Seems I messed up when using Lunatics installer, I selected the Cuda app over the CL. Thank you, should I set Seti to no more work and rerun installer after current wu's are finished? I assume by rerunning installer it will overwrite what I have now? Sad watching my RX 580's wipe the floor with my 1080ti and they are only running 12 hrs a day. |
Jimbocous Send message Joined: 1 Apr 13 Posts: 1853 Credit: 268,616,081 RAC: 1,349 |
Thank you, should I set Seti to no more work and rerun installer after current wu's are finished? I assume by rerunning installer it will overwrite what I have now? Yes, running it dry would be prudent. Though I think the old executable will remain, it may no longer be referenced in the app_info.xml. Could leave a couple there and see what happens. Would be interesting to know. l8r, Jim ... |
Stephen "Heretic" Send message Joined: 20 Sep 12 Posts: 5557 Credit: 192,787,363 RAC: 628 |
Thank you, should I set Seti to no more work and rerun installer after current wu's are finished? I assume by rerunning installer it will overwrite what I have now? . . The Lunatics installer creates a very large app_info.xml with every required form of app, so the existing WUs will not bomb out. They will still show as cuda but will be processed by the selected app; SoG r3557 does good work. Stephen :) |
Gammagoat Send message Joined: 19 Jul 07 Posts: 8 Credit: 936,684 RAC: 0 |
Good Lord, that's better, like about 1/3rd the time as before. I have a machine with a 1060 and it only got a single CPU wu; project says out of GPU work? OK, but why send me CPU work? Ran the installer, chose wisely this time, no GPU work? I get this:

6/11/2019 4:20:04 PM | SETI@home | Finished upload of 25jl12ae.15192.21335.5.32.169_0_r1979829862_0
6/11/2019 4:20:23 PM | SETI@home | update requested by user
6/11/2019 4:20:26 PM | SETI@home | Sending scheduler request: Requested by user.
6/11/2019 4:20:26 PM | SETI@home | Reporting 1 completed tasks
6/11/2019 4:20:26 PM | SETI@home | Requesting new tasks for NVIDIA GPU
6/11/2019 4:20:27 PM | SETI@home | Scheduler request completed: got 0 new tasks
6/11/2019 4:20:27 PM | SETI@home | No tasks sent
6/11/2019 4:20:27 PM | SETI@home | No tasks are available for SETI@home v8
6/11/2019 4:20:27 PM | SETI@home | Tasks for CPU are available, but your preferences are set to not accept them
6/11/2019 4:20:27 PM | SETI@home | Tasks for Intel GPU are available, but your preferences are set to not accept them
6/11/2019 4:20:27 PM | SETI@home | This computer has finished a daily quota of 1 tasks
6/11/2019 4:25:31 PM | SETI@home | Sending scheduler request: To fetch work.
6/11/2019 4:25:31 PM | SETI@home | Requesting new tasks for NVIDIA GPU
6/11/2019 4:25:32 PM | SETI@home | Scheduler request completed: got 0 new tasks
6/11/2019 4:25:32 PM | SETI@home | No tasks sent
6/11/2019 4:25:32 PM | SETI@home | No tasks are available for SETI@home v8
6/11/2019 4:25:32 PM | SETI@home | Tasks for CPU are available, but your preferences are set to not accept them
6/11/2019 4:25:32 PM | SETI@home | Tasks for Intel GPU are available, but your preferences are set to not accept them
6/11/2019 4:25:32 PM | SETI@home | This computer has finished a daily quota of 1 tasks

Also, on my 1080ti, how would I run two wu's? I know that I'll have to add something to app_config.xml, but that file is now pretty intimidating. Does it matter where I place code in file? |
Zalster Send message Joined: 27 May 99 Posts: 5517 Credit: 528,817,460 RAC: 242 |
6/11/2019 4:20:27 PM | SETI@home | This computer has finished a daily quota of 1 tasks

This means that computer is in the penalty box and is restricted in how many tasks it will get in a day. Either there were a lot of errors or invalids that caused the server to think something was wrong with the computer. As you complete and get validated tasks, the limit will slowly increase. But this will take time.

Also, on my 1080ti, how would I run two wu's? I know that I'll have to add something to app_config.xml, but that file is now pretty intimidating. Does it matter where I place code in file?

You will need to make an app_config.xml. I don't have access to my old files with the required app_config, but I'm sure someone will post it here for you. You will need to place it in the Seti@home folder, then shut down BOINC and restart it. If all goes well, you should see that BOINC recognizes the app_config.xml and implements the changes. |
Stephen "Heretic" Send message Joined: 20 Sep 12 Posts: 5557 Credit: 192,787,363 RAC: 628 |
Good Lord, that's better, like about 1/3rd the time as before.

. . OK, did you select a GPU for crunching when you ran the Lunatics installer? Also, does the video driver on that machine have support for OpenCL? M$ stuffs things up constantly by "updating" video drivers with their own version which does not have OpenCL support. If you open your BOINC event log, go right back to the beginning and confirm that it sees drivers for OpenCL. If that has been overwritten because BOINC has been running for longer than the buffer holds, close BOINC and re-open it, then check the log. Or, as Zalster wrote, your host has been sent to the 'sin bin' due to too many errors; have you had a large number of errors or trashed WUs?

Also, on my 1080ti, how would I run two wu's? I know that I'll have to add something to app_config.xml, but that file is now pretty intimidating. Does it matter where I place code in file?

. . Actually you don't need to touch app_info.xml. There is a nice little file called app_config.xml which you can edit with Notepad. If it is empty, add the following ...

<app_config>
  <app_version>
    <app_name>setiathome_v8</app_name>
    <plan_class>opencl_nvidia_SoG</plan_class>
    <avg_ncpus>0.25</avg_ncpus>
    <ngpus>0.49</ngpus>
  </app_version>
  <app_version>
    <app_name>astropulse_v7</app_name>
    <plan_class>opencl_nvidia_100</plan_class>
    <avg_ncpus>0.25</avg_ncpus>
    <ngpus>0.99</ngpus>
  </app_version>
</app_config>

. . If there is something similar there already, then change the line <ngpus>1</ngpus> from 1 to 0.5 (I like to use 0.49 to ensure some headroom). Stephen :) |
Jimbocous Send message Joined: 1 Apr 13 Posts: 1853 Credit: 268,616,081 RAC: 1,349 |
..., on my 1080ti, how would I run two wu's, I know that I'll have to add something to app_config.xml, but that file is now pretty intimidating. Does it matter where I place code in file?

Look in your app_info.xml. (With Lunatics, you're using app_info rather than app_config.) For each app, to run 1 job per GPU, you'll see

<coproc>
  <type>CUDA</type>
  <count>1.0</count>
</coproc>

To do 2 jobs per GPU, you'd want

<coproc>
  <type>CUDA</type>
  <count>0.5</count>
</coproc>

For three it would be 0.33, etc. Perhaps someone with 1080s can tell you if it's better to do more than one, given the nature of the SoG app. My understanding was no. |
Grant (SSSF) Send message Joined: 19 Aug 99 Posts: 13727 Credit: 208,696,464 RAC: 304 |
Command line settings such as

-tt 1500 -hp -period_iterations_num 1 -high_perf -high_prec_timer -cpu_lock -sbs 2048 -spike_fft_thresh 4096 -tune 1 64 1 4 -oclfft_tune_gr 256 -oclfft_tune_lr 16 -oclfft_tune_wg 256 -oclfft_tune_ls 512 -oclfft_tune_bn 64 -oclfft_tune_cw 64

in the mb_cmdline_win_x86_SSE3_OpenCL_NV_SoG.txt in your Seti project directory would produce a lot more work per hour. Some say 2 at a time produces more work on high-end hardware; I've personally found 1 at a time to produce the most work on my cards. All you can do is try the different settings & monitor the results, making sure to compare similar WUs. Grant Darwin NT |
Grant (SSSF) Send message Joined: 19 Aug 99 Posts: 13727 Credit: 208,696,464 RAC: 304 |
Look in your app_info.xml (With Lunatics, you're using App_info rather than app_config.)

It doesn't matter; app_config.xml is best, as if you make a mistake you won't trash your entire cache of work. However, app_config.xml won't work with older BOINC managers.

<app_config>
  <app>
    <name>setiathome_v8</name>
    <gpu_versions>
      <gpu_usage>1.0</gpu_usage>
      <cpu_usage>1.0</cpu_usage>
    </gpu_versions>
  </app>
  <app>
    <name>astropulse_v7</name>
    <gpu_versions>
      <gpu_usage>0.5</gpu_usage>
      <cpu_usage>1.0</cpu_usage>
    </gpu_versions>
  </app>
</app_config>

works for me. Grant Darwin NT |
Jimbocous Send message Joined: 1 Apr 13 Posts: 1853 Credit: 268,616,081 RAC: 1,349 |
It doesn't matter- app_config.xml is best as if you make a mistake you won't trash your entire cache of work. Interesting. Somehow I had it in my head it was an either/or proposition. So I'm assuming the hierarchy is that app_config overrides anything present in app_info? |
Grant (SSSF) Send message Joined: 19 Aug 99 Posts: 13727 Credit: 208,696,464 RAC: 304 |
So I'm assuming the hierarchy is that app_config overrides anything present in app_info? I think so... (it's been years since I've had to wrestle with app_info.xml). Other than not trashing your cache, the other good thing about it is you don't need to exit & restart BOINC for any changes to take effect, just "Options, Read config files" Grant Darwin NT |
Stephen "Heretic" Send message Joined: 20 Sep 12 Posts: 5557 Credit: 192,787,363 RAC: 628 |
..., on my 1080ti, how would I run two wu's, I know that I'll have to add something to app_config.xml, but that file is now pretty intimidating. Does it matter where I place code in file? . . Except he is now running SoG ... :( Stephen <shrug> |
Gammagoat Send message Joined: 19 Jul 07 Posts: 8 Credit: 936,684 RAC: 0 |
Stephen "Heretic", Grant (SSSF), Jimbocous, Zalster, Thank you all very much! Haven't had a chance to mess with .xml or cmd options yet but I will, once again thanks! |
Michael Christoffersen Send message Joined: 12 Nov 02 Posts: 3 Credit: 1,640,207 RAC: 0 |
Call me insane (I'm ok with that!) But... I run 8 WU's on my 7850/2gb (8 x 40min) and I run 4 WU's on my GTX 680/2gb (4 x 90min) Surely we're not going to argue which card does the job better. As far as I have read (like anywhere) Team green is capped at 4 WU's, and Team red is capped at 8-12 WU's. But I can be wrong, so correct me if I am wrong :) Anyway, have a nice day and all that :) |
Tom M Send message Joined: 28 Nov 02 Posts: 5124 Credit: 276,046,078 RAC: 462 |
Call me insane (I'm ok with that!) You did confirm that each increase in running GPU tasks in parallel increased your total production on the card? Sometimes you will notice that when you go from, say, 1 task to 2 tasks, the amount of time each task takes to run is more than double the time it takes 1 task to run. In that case, running tasks in parallel actually lowers your production. Each system/card is different. Experimentation and recording results have the final say. Tom A proud member of the OFA (Old Farts Association). |
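Tom's point comes down to comparing tasks per hour, not per-task run time, at each concurrency level. A quick sketch of the arithmetic (the run times below are hypothetical illustrations, not measured values from any card in this thread):

```python
def throughput_per_hour(concurrent_tasks: int, minutes_per_task: float) -> float:
    """Tasks completed per hour when `concurrent_tasks` WUs run side by side,
    each taking `minutes_per_task` wall-clock minutes to finish."""
    return concurrent_tasks * 60.0 / minutes_per_task

# Hypothetical baseline: one task alone takes 17 minutes.
single = throughput_per_hour(1, 17.0)       # about 3.5 tasks/hour

# If running two at once stretches each task to 30 min (less than 2 x 17),
# total throughput goes up despite the slower per-task time:
double_good = throughput_per_hour(2, 30.0)  # 4.0 tasks/hour

# But if each task slows to 40 min (more than 2 x 17), parallel running
# actually lowers production, exactly as Tom describes:
double_bad = throughput_per_hour(2, 40.0)   # 3.0 tasks/hour

print(single, double_good, double_bad)
```

The only reliable test is the one Tom and Grant both suggest: time similar WUs at each setting and plug the numbers in.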
©2024 University of California
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.