SETI@home v8.22 Windows GPU applications support thread

TouchuvGrey
Volunteer tester
Joined: 30 May 04
Posts: 10
Credit: 52,541,731
RAC: 39
Australia
Message 1979682 - Posted: 10 Feb 2019, 8:32:59 UTC - in response to Message 1978582.  

Hello Stephen:

I reinstalled Afterburner, waited a few minutes, and all is good now.



Because the universe is just too big for us to be alone in.
Javajav
Joined: 5 Feb 19
Posts: 4
Credit: 6,474
RAC: 0
Message 1982199 - Posted: 25 Feb 2019, 20:26:47 UTC - in response to Message 1979238.  

Sorry I took so long to reply. I managed to fix it myself. Turns out my GPU is so outdated that the new drivers just made it crap out, so I installed a legacy driver and now it seems to be working fine.
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Avatar

Send message
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 1982203 - Posted: 25 Feb 2019, 21:27:10 UTC - in response to Message 1982199.  

Sorry I took so long to reply. I managed to fix it myself. Turns out my GPU is so outdated that the new drivers just made it crap out, so I installed a legacy driver and now it seems to be working fine.


. . Maybe there is a glitch in BOINC/SETI, but it only shows one rig, an i7-7700 with no GPU, which has been sitting on 26 WUs for several weeks ...

Stephen

? ?
Gammagoat
Joined: 19 Jul 07
Posts: 8
Credit: 936,684
RAC: 0
United States
Message 1997809 - Posted: 11 Jun 2019, 18:48:06 UTC

Seems I messed up when using the Lunatics installer; I selected the CUDA app over the OpenCL one.

Is there enough of a difference for me to make the change? Currently it's taking about 17-18 mins to complete one WU at a time on my 1080 Ti.
Jimbocous
Volunteer tester
Joined: 1 Apr 13
Posts: 1849
Credit: 268,616,081
RAC: 1,349
United States
Message 1997811 - Posted: 11 Jun 2019, 19:00:30 UTC - in response to Message 1997809.  

Seems I messed up when using the Lunatics installer; I selected the CUDA app over the OpenCL one.

Is there enough of a difference for me to make the change? Currently it's taking about 17-18 mins to complete one WU at a time on my 1080 Ti.

I think you'll find a difference. The good news is you should be able to just rerun the installer.
Gammagoat
Joined: 19 Jul 07
Posts: 8
Credit: 936,684
RAC: 0
United States
Message 1997839 - Posted: 11 Jun 2019, 22:06:43 UTC - in response to Message 1997811.  

Seems I messed up when using the Lunatics installer; I selected the CUDA app over the OpenCL one.

Is there enough of a difference for me to make the change? Currently it's taking about 17-18 mins to complete one WU at a time on my 1080 Ti.

I think you'll find a difference. The good news is you should be able to just rerun the installer.



Thank you. Should I set SETI to 'no more work' and rerun the installer after the current WUs are finished? I assume rerunning the installer will overwrite what I have now?

It's sad watching my RX 580s wipe the floor with my 1080 Ti, and they are only running 12 hrs a day.
Jimbocous
Volunteer tester
Joined: 1 Apr 13
Posts: 1849
Credit: 268,616,081
RAC: 1,349
United States
Message 1997841 - Posted: 11 Jun 2019, 22:13:00 UTC - in response to Message 1997839.  

Thank you. Should I set SETI to 'no more work' and rerun the installer after the current WUs are finished? I assume rerunning the installer will overwrite what I have now?

Yes, running it dry would be prudent. I think the old executable will remain, though it may no longer be referenced in the app_info.xml.
You could leave a couple there and see what happens. Would be interesting to know.
l8r, Jim ...
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Avatar

Send message
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 1997845 - Posted: 11 Jun 2019, 22:34:35 UTC - in response to Message 1997841.  

Thank you. Should I set SETI to 'no more work' and rerun the installer after the current WUs are finished? I assume rerunning the installer will overwrite what I have now?

Yes, running it dry would be prudent. I think the old executable will remain, though it may no longer be referenced in the app_info.xml.
You could leave a couple there and see what happens. Would be interesting to know.
l8r, Jim ...


. . The Lunatics installer creates a very large app_info.xml with every required form of the app, so the existing WUs will not bomb out. They will still show as CUDA but will be processed by the selected app; SoG r3557 does good work.

Stephen

:)
Gammagoat
Joined: 19 Jul 07
Posts: 8
Credit: 936,684
RAC: 0
United States
Message 1997856 - Posted: 11 Jun 2019, 23:26:42 UTC
Last modified: 11 Jun 2019, 23:32:37 UTC

Good Lord, that's better: about 1/3rd the time it took before.

I have a machine with a 1060 and it only got a single CPU WU; the project says it's out of GPU work? OK, but why send me CPU work? I ran the installer and chose wisely this time, so why no GPU work?

I get this:
6/11/2019 4:20:04 PM | SETI@home | Finished upload of 25jl12ae.15192.21335.5.32.169_0_r1979829862_0
6/11/2019 4:20:23 PM | SETI@home | update requested by user
6/11/2019 4:20:26 PM | SETI@home | Sending scheduler request: Requested by user.
6/11/2019 4:20:26 PM | SETI@home | Reporting 1 completed tasks
6/11/2019 4:20:26 PM | SETI@home | Requesting new tasks for NVIDIA GPU
6/11/2019 4:20:27 PM | SETI@home | Scheduler request completed: got 0 new tasks
6/11/2019 4:20:27 PM | SETI@home | No tasks sent
6/11/2019 4:20:27 PM | SETI@home | No tasks are available for SETI@home v8
6/11/2019 4:20:27 PM | SETI@home | Tasks for CPU are available, but your preferences are set to not accept them
6/11/2019 4:20:27 PM | SETI@home | Tasks for Intel GPU are available, but your preferences are set to not accept them
6/11/2019 4:20:27 PM | SETI@home | This computer has finished a daily quota of 1 tasks
6/11/2019 4:25:31 PM | SETI@home | Sending scheduler request: To fetch work.
6/11/2019 4:25:31 PM | SETI@home | Requesting new tasks for NVIDIA GPU
6/11/2019 4:25:32 PM | SETI@home | Scheduler request completed: got 0 new tasks
6/11/2019 4:25:32 PM | SETI@home | No tasks sent
6/11/2019 4:25:32 PM | SETI@home | No tasks are available for SETI@home v8
6/11/2019 4:25:32 PM | SETI@home | Tasks for CPU are available, but your preferences are set to not accept them
6/11/2019 4:25:32 PM | SETI@home | Tasks for Intel GPU are available, but your preferences are set to not accept them
6/11/2019 4:25:32 PM | SETI@home | This computer has finished a daily quota of 1 tasks


Also, on my 1080 Ti, how would I run two WUs? I know that I'll have to add something to app_config.xml, but that file is now pretty intimidating. Does it matter where I place the code in the file?
Zalster
Volunteer tester
Joined: 27 May 99
Posts: 5517
Credit: 528,817,460
RAC: 242
United States
Message 1997858 - Posted: 11 Jun 2019, 23:50:54 UTC - in response to Message 1997856.  

6/11/2019 4:20:27 PM | SETI@home | This computer has finished a daily quota of 1 tasks
6/11/2019 4:25:31 PM | SETI@home | Sending scheduler request: To fetch work.
6/11/2019 4:25:31 PM | SETI@home | Requesting new tasks for NVIDIA GPU
6/11/2019 4:25:32 PM | SETI@home | Scheduler request completed: got 0 new tasks
6/11/2019 4:25:32 PM | SETI@home | No tasks sent


This means that computer is in the penalty box and is restricted in how many tasks it will get in a day. Either there were a lot of errors or invalids that caused the server to think something was wrong with the computer. As you complete tasks and they validate, the limit will slowly increase. But this will take time.

Also, on my 1080 Ti, how would I run two WUs? I know that I'll have to add something to app_config.xml, but that file is now pretty intimidating. Does it matter where I place the code in the file?


You will need to make an app_config.xml. I don't have access to my old files with the required app_config, but I'm sure someone will post it here for you. You will need to place it in the SETI@home project folder, then shut down BOINC and restart it. If all goes well, you should see that BOINC recognizes the app_config.xml and implements the changes.
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Avatar

Send message
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 1997860 - Posted: 11 Jun 2019, 23:59:28 UTC - in response to Message 1997856.  
Last modified: 12 Jun 2019, 0:05:27 UTC

Good Lord, that's better: about 1/3rd the time it took before.

I have a machine with a 1060 and it only got a single CPU WU; the project says it's out of GPU work? OK, but why send me CPU work? I ran the installer and chose wisely this time, so why no GPU work?

I get this:
6/11/2019 4:20:04 PM | SETI@home | Finished upload of 25jl12ae.15192.21335.5.32.169_0_r1979829862_0
6/11/2019 4:20:23 PM | SETI@home | update requested by user
6/11/2019 4:20:26 PM | SETI@home | Sending scheduler request: Requested by user.
6/11/2019 4:20:26 PM | SETI@home | Reporting 1 completed tasks
6/11/2019 4:20:26 PM | SETI@home | Requesting new tasks for NVIDIA GPU
6/11/2019 4:20:27 PM | SETI@home | Scheduler request completed: got 0 new tasks
6/11/2019 4:20:27 PM | SETI@home | No tasks sent
6/11/2019 4:20:27 PM | SETI@home | No tasks are available for SETI@home v8
6/11/2019 4:20:27 PM | SETI@home | Tasks for CPU are available, but your preferences are set to not accept them
6/11/2019 4:20:27 PM | SETI@home | Tasks for Intel GPU are available, but your preferences are set to not accept them
6/11/2019 4:20:27 PM | SETI@home | This computer has finished a daily quota of 1 tasks
6/11/2019 4:25:31 PM | SETI@home | Sending scheduler request: To fetch work.
6/11/2019 4:25:31 PM | SETI@home | Requesting new tasks for NVIDIA GPU
6/11/2019 4:25:32 PM | SETI@home | Scheduler request completed: got 0 new tasks
6/11/2019 4:25:32 PM | SETI@home | No tasks sent
6/11/2019 4:25:32 PM | SETI@home | No tasks are available for SETI@home v8
6/11/2019 4:25:32 PM | SETI@home | Tasks for CPU are available, but your preferences are set to not accept them
6/11/2019 4:25:32 PM | SETI@home | Tasks for Intel GPU are available, but your preferences are set to not accept them
6/11/2019 4:25:32 PM | SETI@home | This computer has finished a daily quota of 1 tasks

. . OK, did you select a GPU for crunching when you ran the Lunatics installer? Also, does the video driver on that machine have support for OpenCL? M$ constantly stuffs things up by "updating" video drivers with their own version, which does not have OpenCL support. If you open your BOINC event log, go right back to the beginning and confirm that it sees drivers for OpenCL. If that has been overwritten because BOINC has been running for longer than the buffer holds, close BOINC and re-open it, then check the log. Or, as Zalster wrote, your host may have been sent to the 'sin bin' due to too many errors. Have you had a large number of errors or trashed WUs?
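
. . For reference, if the driver does have OpenCL support, the first few lines of the event log after startup will contain GPU detection entries along these lines (illustrative values only, not taken from this host):

CUDA: NVIDIA GPU 0: GeForce GTX 1060 6GB (driver version 430.86, CUDA version 10.1, compute capability 6.1, 6144MB, 5963MB available, 4372 GFLOPS peak)
OpenCL: NVIDIA GPU 0: GeForce GTX 1060 6GB (driver version 430.86, device version OpenCL 1.2 CUDA, 6144MB, 5963MB available, 4372 GFLOPS peak)

. . If the OpenCL line is missing for the card, the OpenCL (SoG) app has nothing to run on.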

Also, on my 1080 Ti, how would I run two WUs? I know that I'll have to add something to app_config.xml, but that file is now pretty intimidating. Does it matter where I place the code in the file?

. . Actually you don't need to touch app_info.xml. There is a nice little file called app_config.xml, which you can edit with Notepad. If it is empty, add the following ...

<app_config>
    <app_version>
    <app_name>setiathome_v8</app_name>
    <plan_class>opencl_nvidia_SoG</plan_class>
    <avg_ncpus>0.25</avg_ncpus>
    <ngpus>0.49</ngpus>
    </app_version>
    <app_version>
    <app_name>astropulse_v7</app_name>
    <plan_class>opencl_nvidia_100</plan_class>
    <avg_ncpus>0.25</avg_ncpus>
    <ngpus>0.99</ngpus>
    </app_version>
    </app_config>


. . If there is something similar there already, then change the line <ngpus>1</ngpus> from 1 to 0.5 (I like to use 0.49 to ensure some headroom).

Stephen

:)
Jimbocous
Volunteer tester
Joined: 1 Apr 13
Posts: 1849
Credit: 268,616,081
RAC: 1,349
United States
Message 1997886 - Posted: 12 Jun 2019, 2:58:12 UTC - in response to Message 1997856.  
Last modified: 12 Jun 2019, 2:58:42 UTC

..., on my 1080 Ti, how would I run two WUs? I know that I'll have to add something to app_config.xml, but that file is now pretty intimidating. Does it matter where I place the code in the file?


Look in your app_info.xml (with Lunatics, you're using app_info rather than app_config).
For each app:
To run 1 job per GPU, you'll see
<coproc>
<type>CUDA</type>
<count>1.0</count>
</coproc>
To do 2 jobs per GPU, you'd want
<coproc>
<type>CUDA</type>
<count>0.5</count>
</coproc>
For three it would be 0.33, etc.

Perhaps someone with 1080s can tell you if it's better to do more than one, given the nature of the SoG app. My understanding was no.
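
For orientation, each <coproc> block sits inside an <app_version> entry in app_info.xml; one entry looks roughly like this (a sketch only; the version number, plan class, and executable name will differ in your file):

<app_version>
    <app_name>setiathome_v8</app_name>
    <version_num>800</version_num>
    <plan_class>opencl_nvidia_SoG</plan_class>
    <avg_ncpus>0.04</avg_ncpus>
    <coproc>
        <type>CUDA</type>
        <count>0.5</count>  <!-- 0.5 = two jobs per GPU -->
    </coproc>
    <file_ref>
        <file_name>MB8_win_x86_SSE3_OpenCL_NV_SoG_r3557.exe</file_name>
        <main_program/>
    </file_ref>
</app_version>

Only the <count> values need to change; leave the rest of the file alone.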
Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13720
Credit: 208,696,464
RAC: 304
Australia
Message 1997898 - Posted: 12 Jun 2019, 4:56:05 UTC

Command line settings such as
-tt 1500 -hp -period_iterations_num 1 -high_perf -high_prec_timer -cpu_lock -sbs 2048 -spike_fft_thresh 4096 -tune 1 64 1 4 -oclfft_tune_gr 256 -oclfft_tune_lr 16 -oclfft_tune_wg 256 -oclfft_tune_ls 512 -oclfft_tune_bn 64 -oclfft_tune_cw 64

in the
mb_cmdline_win_x86_SSE3_OpenCL_NV_SoG.txt
in your Seti project directory would produce a lot more work per hour.
Some say 2 at a time produces more work on high-end hardware; I've personally found 1 at a time produces the most work on my cards. All you can do is try the different settings & monitor the results, making sure to compare similar WUs.
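(If the file layout is unclear: it is a one-line plain-text file in the project directory, e.g. C:\ProgramData\BOINC\projects\setiathome.berkeley.edu\mb_cmdline_win_x86_SSE3_OpenCL_NV_SoG.txt on a default install, with all of the switches on that single line. Each task reads it as it starts, so changes apply from the next task onward.)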
Grant
Darwin NT
Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13720
Credit: 208,696,464
RAC: 304
Australia
Message 1997899 - Posted: 12 Jun 2019, 4:58:58 UTC - in response to Message 1997886.  
Last modified: 12 Jun 2019, 5:03:11 UTC

Look in your app_info.xml (with Lunatics, you're using app_info rather than app_config).

It doesn't matter; app_config.xml is best, as a mistake there won't trash your entire cache of work.
However, app_config.xml won't work with older BOINC managers.


<app_config>
 <app>
  <name>setiathome_v8</name>
  <gpu_versions>
  <gpu_usage>1.0</gpu_usage>
  <cpu_usage>1.0</cpu_usage>
  </gpu_versions>
 </app>
 <app>
  <name>astropulse_v7</name>
  <gpu_versions>
  <gpu_usage>0.5</gpu_usage>
  <cpu_usage>1.0</cpu_usage>
  </gpu_versions>
 </app>
</app_config>

works for me.
Grant
Darwin NT
Jimbocous
Volunteer tester
Joined: 1 Apr 13
Posts: 1849
Credit: 268,616,081
RAC: 1,349
United States
Message 1997902 - Posted: 12 Jun 2019, 5:13:49 UTC - in response to Message 1997899.  
Last modified: 12 Jun 2019, 5:17:51 UTC

It doesn't matter; app_config.xml is best, as a mistake there won't trash your entire cache of work.
However, app_config.xml won't work with older BOINC managers.

Interesting. Somehow I had it in my head it was an either/or proposition.
So I'm assuming the hierarchy is that app_config overrides anything present in app_info?
Grant (SSSF)
Volunteer tester
Joined: 19 Aug 99
Posts: 13720
Credit: 208,696,464
RAC: 304
Australia
Message 1997903 - Posted: 12 Jun 2019, 5:22:36 UTC - in response to Message 1997902.  

So I'm assuming the hierarchy is that app_config overrides anything present in app_info?

I think so... (it's been years since I've had to wrestle with app_info.xml).
Other than not trashing your cache, the other good thing about it is that you don't need to exit & restart BOINC for changes to take effect; just use "Options, Read config files".
Grant
Darwin NT
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Avatar

Send message
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 1997931 - Posted: 12 Jun 2019, 15:01:29 UTC - in response to Message 1997886.  
Last modified: 12 Jun 2019, 15:02:00 UTC

..., on my 1080 Ti, how would I run two WUs? I know that I'll have to add something to app_config.xml, but that file is now pretty intimidating. Does it matter where I place the code in the file?


Look in your app_info.xml (with Lunatics, you're using app_info rather than app_config).
For each app:
To run 1 job per GPU, you'll see
<coproc>
<type>CUDA</type>
<count>1.0</count>
</coproc>
To do 2 jobs per GPU, you'd want
<coproc>
<type>CUDA</type>
<count>0.5</count>
</coproc>
For three it would be 0.33, etc.

Perhaps someone with 1080s can tell you if it's better to do more than one, given the nature of the SoG app. My understanding was no.


. . Except he is now running SoG ... :(

Stephen

<shrug>
Gammagoat
Joined: 19 Jul 07
Posts: 8
Credit: 936,684
RAC: 0
United States
Message 1998111 - Posted: 13 Jun 2019, 22:58:54 UTC

Stephen "Heretic", Grant (SSSF), Jimbocous, Zalster, Thank you all very much!

Haven't had a chance to mess with .xml or cmd options yet but I will, once again thanks!
Michael Christoffersen
Joined: 12 Nov 02
Posts: 3
Credit: 1,640,207
RAC: 0
Denmark
Message 1999906 - Posted: 27 Jun 2019, 12:25:30 UTC

Call me insane (I'm ok with that!)

But... I run 8 WUs on my 7850/2GB (8 x 40 min) and 4 WUs on my GTX 680/2GB (4 x 90 min).

Surely we're not going to argue which card does the job better.

As far as I have read (like, anywhere), Team Green is capped at 4 WUs and Team Red is capped at 8-12 WUs.
But I could be wrong, so correct me if I am :)

Anyway, have a nice day and all that :)
Tom M
Volunteer tester
Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 1999909 - Posted: 27 Jun 2019, 12:33:25 UTC - in response to Message 1999906.  
Last modified: 27 Jun 2019, 12:36:35 UTC

Call me insane (I'm ok with that!)

But... I run 8 WUs on my 7850/2GB (8 x 40 min) and 4 WUs on my GTX 680/2GB (4 x 90 min).

Surely we're not going to argue which card does the job better.

As far as I have read (like, anywhere), Team Green is capped at 4 WUs and Team Red is capped at 8-12 WUs.
But I could be wrong, so correct me if I am :)

Anyway, have a nice day and all that :)


You did confirm that each increase in GPU tasks running in parallel increased your total production on the card?
Sometimes, when you go from say 1 task to 2 tasks, the amount of time each task takes to run is more than double the time a single task takes.
In that case, running tasks in parallel actually lowers your production.

Each system/card is different. Experimentation and recording results have the final say.
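As an illustration with made-up numbers: if one task finishes in 15 minutes, the card completes 60/15 = 4 tasks per hour. If running two at a time stretches each task out to 35 minutes, the card completes 2 x (60/35) ≈ 3.4 tasks per hour, a net loss. Two-up only wins if each task then finishes in under 30 minutes.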

Tom
A proud member of the OFA (Old Farts Association).