More than one WU per GPU at the same time ?

Questions and Answers : GPU applications : More than one WU per GPU at the same time ?

Previous · 1 · 2 · 3 · 4 · 5 · Next

Profile BilBg
Volunteer tester
Avatar

Send message
Joined: 27 May 07
Posts: 3720
Credit: 9,385,827
RAC: 0
Bulgaria
Message 1115282 - Posted: 10 Jun 2011, 2:32:26 UTC - in response to Message 1111603.  

*EDIT* - I gave Open Hardware Monitor a try, and unfortunately, like GPU-Z, it doesn't display GPU memory consumption for ATI GPUs. Not the end of the world though... plus I like the interface, so I may end up using it. I'm starting to think that ATI's Catalyst drivers just aren't designed to expose such a parameter for GPU utilities to display, unlike Nvidia's GeForce drivers.


I have issues with Open Hardware Monitor Version 0.3.2 Beta:
http://setiathome.berkeley.edu/forum_thread.php?id=62044&nowrap=true#1115279


 


- ALF - "Find out what you don't do well ..... then don't do it!" :)
 
ID: 1115282 · Report as offensive
Profile Fred E.
Volunteer tester

Send message
Joined: 22 Jul 99
Posts: 768
Credit: 24,140,697
RAC: 0
United States
Message 1119815 - Posted: 21 Jun 2011, 22:13:13 UTC - in response to Message 1111732.  
Last modified: 21 Jun 2011, 22:30:38 UTC

You see, I don't actually have both the discrete GPU and the integrated GPU hooked up to the monitor. Only the integrated GPU is connected to the monitor, while the discrete GPU is simply attached to a dummy plug. By using my integrated 3300 GPU strictly for display purposes, and my discrete 5870 GPU strictly for crunching, I more or less eliminate the GUI lag so many people complain about when they try to crunch DC work on the same GPU that runs their display.


I want to do something similar. Instead of trying to make a dummy plug (my friends call me Ten Thumbs), can I dual-connect with VGA from the integrated GPU and DVI from the video card? The monitor has both inputs and an auto-detect button for analog vs. digital input, and I know how to set BIOS priorities. Should I buy the extra cable and try it, or is that hopeless? If hopeless, is there anywhere I can buy a dummy plug? I searched the net and several big online stores with no luck. Maybe it goes by another name?

BTW, as to the subject of this thread, I tried two at a time with my low-end GT240, but the results were not very good. I may try again later this week, as I just installed Lunatics v0.38 and it does speed up my GPU tasks.
Another Fred
Support SETI@home when you search the Web with GoodSearch or shop online with GoodShop.
ID: 1119815 · Report as offensive
Profile Sunny129
Avatar

Send message
Joined: 7 Nov 00
Posts: 190
Credit: 3,163,755
RAC: 0
United States
Message 1119837 - Posted: 21 Jun 2011, 22:59:29 UTC - in response to Message 1119815.  
Last modified: 21 Jun 2011, 23:00:47 UTC

I want to do something similar. Instead of trying to make a dummy plug (my friends call me Ten Thumbs), can I dual-connect with VGA from the integrated GPU and DVI from the video card? The monitor has both inputs and an auto-detect button for analog vs. digital input, and I know how to set BIOS priorities. Should I buy the extra cable and try it, or is that hopeless? If hopeless, is there anywhere I can buy a dummy plug? I searched the net and several big online stores with no luck. Maybe it goes by another name?

Let me start by saying that the reason I had to use a dummy plug is because my monitor has only two inputs (one VGA and one DVI), and the integrated GPU is on the DVI because, again, I actually use it for the display. That leaves only the VGA port for the discrete GPU, an HD 5870 2GB Eyefinity card with six miniDP outputs only... and of course none of the adapters that came with the card are miniDP-to-VGA. I searched online for such an adapter, but to no avail... hence the reason I ended up having to use a dummy plug.

Which brings me to my next point: I don't think you can buy a dummy plug; you have to make one. Don't worry, it's easy, especially if you don't end up having to jump through the hoops that I did. Assuming you have a more typical video card with some combination of the usual VGA/DVI/HDMI outputs, you won't have to spend time searching for a special adapter like I did for my card. All you really need is a standard VGA-to-DVI adapter and some resistors. This method requires you to put the resistors in the DVI end of the adapter and plug the VGA end into the VGA port on the video card, forcing you to use DVI for the integrated video... which is actually a good thing, because DVI has much better image quality. Since I don't recall the specifics, here's a link to the DIY dummy plug article that I used as a guide while making mine:

http://www.overclock.net/folding-home-guides-tutorials/384733-30-second-dummy-plug.html

Having said all that, I don't really know for sure whether connecting dual GPUs to the monitor will work, because I never had the adapter to try it with the video card I have. But my guess is that it would work, or is certainly worth a try at the very least. The idea behind the dummy plug is to trick the GPU into thinking it's connected to a display (by shorting specific pins with resistors on the display end of the adapter). So I don't think it matters whether you're connecting the video card to an active or an inactive input on the monitor, as long as it's a monitor. Look at it this way: my integrated GPU is using my monitor's DVI input for the actual display, and my video card is therefore not my primary GPU in the BIOS. Yet it thinks it's connected to a monitor (even though it's not), and works flawlessly while punching out ~220,000 RAC...

Give it a shot... I'll bet it works for you.
ID: 1119837 · Report as offensive
Profile Fred E.
Volunteer tester

Send message
Joined: 22 Jul 99
Posts: 768
Credit: 24,140,697
RAC: 0
United States
Message 1119846 - Posted: 21 Jun 2011, 23:29:45 UTC - in response to Message 1119837.  

Thanks for the comments. Card specs say:
EVGA 01G-P3-1235-LR GeForce GT 240 Video Card - 1024MB DDR3, PCI-Express 2.0, DVI, HDMI, VGA

I may try the dual-connect method first; it would also set me up to change things around if my grandson visits and wants to play games. I've seen the instructions for the dummy plug, and I can do that if I have to.


ID: 1119846 · Report as offensive
juan BFP Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Avatar

Send message
Joined: 16 Mar 07
Posts: 9786
Credit: 572,710,851
RAC: 3,799
Panama
Message 1147335 - Posted: 31 Aug 2011, 2:58:40 UTC - in response to Message 1078217.  

Where is the app_info.xml file?

I am using the Lunatics' Unified Installer v0.38.

Can I run 2 WUs on the GTX560?


ID: 1147335 · Report as offensive
Profile Gundolf Jahn

Send message
Joined: 19 Sep 00
Posts: 3184
Credit: 446,358
RAC: 0
Germany
Message 1147403 - Posted: 31 Aug 2011, 7:27:41 UTC - in response to Message 1147335.  

Where is the app_info.xml file?

Where the Lunatics' Unified Installer v0.38 has placed it. ;-)

In projects\setiathome.berkeley.edu beneath your BOINC data directory.

Can I run 2 WUs on the GTX560?

If you know how to edit your app_info.xml file, then yes, since 1023 MB seems to be sufficient GPU memory.
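For reference, a minimal sketch of the fragment involved, assuming the standard BOINC anonymous-platform app_info.xml layout; the app name and version number below are illustrative, so keep whatever the Lunatics installer wrote and change only the <count> value:

```xml
<!-- Fragment of app_info.xml (anonymous platform). Setting <count> to 0.5
     tells BOINC each task uses half a GPU, so two tasks run per GPU at once.
     App name and version number here are illustrative examples. -->
<app_version>
    <app_name>setiathome_enhanced</app_name>
    <version_num>608</version_num>
    <coproc>
        <type>CUDA</type>
        <count>0.5</count>
    </coproc>
</app_version>
```

After editing, restart the BOINC client so it rereads app_info.xml.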

Gruß,
Gundolf
ID: 1147403 · Report as offensive
juan BFP Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Avatar

Send message
Joined: 16 Mar 07
Posts: 9786
Credit: 572,710,851
RAC: 3,799
Panama
Message 1147431 - Posted: 31 Aug 2011, 10:24:37 UTC - in response to Message 1147403.  

I tried to find the app_info.xml file, but it does not appear in the directory

Z:\Boinc\projects\setiathome.berkeley.edu (my BOINC data directory)

It seems like my installer did not create the file.

Yes, I know how to edit the file; I just can't find where it is.

Thanks for the help.





ID: 1147431 · Report as offensive
juan BFP Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Avatar

Send message
Joined: 16 Mar 07
Posts: 9786
Credit: 572,710,851
RAC: 3,799
Panama
Message 1147447 - Posted: 31 Aug 2011, 11:26:07 UTC - in response to Message 1147431.  

Reinstalling the Lunatics' Unified Installer v0.38 fixed the problem.

The file now appears, and I am running 2 WUs on each card.

Thanks again



ID: 1147447 · Report as offensive
juan BFP Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Avatar

Send message
Joined: 16 Mar 07
Posts: 9786
Credit: 572,710,851
RAC: 3,799
Panama
Message 1149131 - Posted: 5 Sep 2011, 10:20:35 UTC

Does anybody know how many tasks a GTX560 or a GTS250 can really handle?
ID: 1149131 · Report as offensive
Profile perryjay
Volunteer tester
Avatar

Send message
Joined: 20 Aug 02
Posts: 3377
Credit: 20,676,751
RAC: 0
United States
Message 1149333 - Posted: 5 Sep 2011, 20:41:47 UTC - in response to Message 1149131.  

Your GTX 560 should run three easily (.33) and possibly four (.25). Your GTS 250 might run two, but keep an eye on the times; those cards aren't really great at running more than one at a time. You may find it takes more time running two at a time than it does running two of them one after the other.


PROUD MEMBER OF Team Starfire World BOINC
ID: 1149333 · Report as offensive
juan BFP Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Avatar

Send message
Joined: 16 Mar 07
Posts: 9786
Credit: 572,710,851
RAC: 3,799
Panama
Message 1149409 - Posted: 6 Sep 2011, 0:55:36 UTC

Thanks for the info.

Just to help anyone else who has the same issue:

On the GTX560:

2 WUs per GPU take +/- 18 min each to complete (90% GPU load)

3 WUs per GPU take +/- 23 min (99% GPU load)

4 WUs per GPU: not tried.

On the GTS250, 2 WUs per GPU take more time than one at a time.
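A quick back-of-envelope throughput check on the timings reported above (a sketch in plain Python; N concurrent tasks each finishing in T minutes yield N/T tasks per minute):

```python
# Compare completed work-units per hour for the reported GTX 560 timings.
def wu_per_hour(concurrent_tasks: int, minutes_each: float) -> float:
    """N tasks running together, each finishing in `minutes_each` minutes."""
    return concurrent_tasks / minutes_each * 60

two_up = wu_per_hour(2, 18)    # ~6.7 WU/h
three_up = wu_per_hour(3, 23)  # ~7.8 WU/h

# Even though each task takes longer at 3-up, total throughput is higher.
print(f"2-up: {two_up:.1f} WU/h, 3-up: {three_up:.1f} WU/h")
```

This matches perryjay's advice to watch the times: the concurrency only pays off when completed work per hour actually goes up, as it does here for the GTX 560 but not for the GTS 250.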

ID: 1149409 · Report as offensive
Profile Sutaru Tsureku
Volunteer tester

Send message
Joined: 6 Apr 07
Posts: 7105
Credit: 147,663,825
RAC: 5
Germany
Message 1149871 - Posted: 7 Sep 2011, 21:16:13 UTC - in response to Message 1149409.  
Last modified: 7 Sep 2011, 21:18:43 UTC

Yes, because the GTS250 is a non-Fermi graphics card.

2+ CUDA WUs per GPU work only with Fermi (e.g. GTX4xx/5xx) graphics cards.


- Best regards! - Sutaru Tsureku, team seti.international founder. - Optimize your PC for higher RAC. - SETI@home needs your help. -
ID: 1149871 · Report as offensive
Profile Edward Manteufel

Send message
Joined: 1 Jun 00
Posts: 27
Credit: 26,211,273
RAC: 11
United States
Message 1155523 - Posted: 24 Sep 2011, 4:42:51 UTC

Hi, I was wondering about a related issue.
I have 2 GTX280s and an AMD Phenom II X3,
and I notice I have five CUDA work units running: 4 at normal and 1 at high priority. And I don't see any CPU work units going.
Each CUDA unit says it is using 0.68 CPU.
OK, so I never edited any files myself, so why are there 5 work units running? Also, I used GPU-Z and saw that only my first card is being used, at 75%, and the other card is at 0%. Is it possible one card is running 5 WUs? And why isn't the second one doing anything?
Thanks
ID: 1155523 · Report as offensive
Profile Gundolf Jahn

Send message
Joined: 19 Sep 00
Posts: 3184
Credit: 446,358
RAC: 0
Germany
Message 1155593 - Posted: 24 Sep 2011, 11:01:18 UTC - in response to Message 1155523.  

You should restart your BOINC client (or reboot) and then post the content of your BOINC event log here (about 30 lines), to give us more information to work on.

Gruß,
Gundolf
ID: 1155593 · Report as offensive
Profile BilBg
Volunteer tester
Avatar

Send message
Joined: 27 May 07
Posts: 3720
Credit: 9,385,827
RAC: 0
Bulgaria
Message 1155852 - Posted: 25 Sep 2011, 4:46:31 UTC - in response to Message 1155523.  
Last modified: 25 Sep 2011, 5:17:46 UTC


Phenom II X3 has 3 cores

If "five CUDA work units running", BOINC will think this:
0.68 CPU * 5 CUDA tasks = 3.4 cores (CPUs) "needed",
so BOINC will not run any CPU tasks.

(In fact, 1 CUDA task will use only 5-10% of a CPU core (= 0.05-0.10 CPU), but the setting that it "needs" 0.68 CPU is defined by the SETI servers and sent to everybody using the standard CUDA apps.)
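The reservation arithmetic above can be sketched as a quick check (plain Python, numbers taken from this post):

```python
# Back-of-envelope sketch of the CPU-reservation arithmetic described above.
# Numbers come from the post: a 3-core Phenom II X3, 5 CUDA tasks,
# and the server-supplied estimate of 0.68 CPU per CUDA task.
cores = 3
cuda_tasks = 5
cpu_per_cuda_task = 0.68

reserved = cuda_tasks * cpu_per_cuda_task  # cores BOINC sets aside for GPU tasks
free_cores = cores - reserved              # budget left over for CPU tasks

print(f"reserved {reserved:.1f} of {cores} cores; {free_cores:.1f} left")
# reserved 3.4 of 3 cores -> nothing left, so BOINC starts no CPU tasks
```

Note that this uses the server's 0.68 estimate, not the real 5-10% usage, which is exactly why the CPU sits idle.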

But how BOINC decided to run 5 CUDA tasks at once on 2 GPUs without your intervention is beyond me.


Edit:
Stderr shows that 3 GPUs are detected:
setiathome_CUDA: Found 3 CUDA device(s):
   Device 1 : GeForce GTX 280 
           totalGlobalMem = 1058865152 
           sharedMemPerBlock = 16384 
           regsPerBlock = 16384 
           warpSize = 32 
           memPitch = 2147483647 
           maxThreadsPerBlock = 512 
           clockRate = 1296000 
           totalConstMem = 65536 
           major = 1 
           minor = 3 
           textureAlignment = 256 
           deviceOverlap = 1 
           multiProcessorCount = 30 
   Device 2 : GeForce GTX 280 
           totalGlobalMem = 1058865152 
           sharedMemPerBlock = 16384 
           regsPerBlock = 16384 
           warpSize = 32 
           memPitch = 2147483647 
           maxThreadsPerBlock = 512 
           clockRate = 1350000 
           totalConstMem = 65536 
           major = 1 
           minor = 3 
           textureAlignment = 256 
           deviceOverlap = 1 
           multiProcessorCount = 30 
   Device 3 : nForce 980a/780a SLI 
           totalGlobalMem = 521732096 
           sharedMemPerBlock = 16384 
           regsPerBlock = 8192 
           warpSize = 32 
           memPitch = 2147483647 
           maxThreadsPerBlock = 512 
           clockRate = 1200000 
           totalConstMem = 65536 
           major = 1 
           minor = 1 
           textureAlignment = 256 
           deviceOverlap = 0 
           multiProcessorCount = 1 
setiathome_CUDA: No device specified, determined to use CUDA device 1: GeForce GTX 280
SETI@home using CUDA accelerated device GeForce GTX 280


The strange part here is "No device specified" (meaning: BOINC started the CUDA task but "forgot" to tell the setiathome_CUDA app which device to run it on).
And because of this, it seems that many of your tasks show "determined to use CUDA device 1" (determined by the setiathome_CUDA app, not by BOINC).

(If you don't use the nForce 980a/780a SLI graphics integrated in the chipset, you can disable it in the BIOS.)


But sometimes the info is:
setiathome_CUDA: Found 1 CUDA device(s):
   Device 1 : GeForce GTX 280 
           totalGlobalMem = 1026228224 
           sharedMemPerBlock = 16384 
           regsPerBlock = 16384 
           warpSize = 32 
           memPitch = 2147483647 
           maxThreadsPerBlock = 512 
           clockRate = 1296000 
           totalConstMem = 65536 
           major = 1 
           minor = 3 
           textureAlignment = 256 
           deviceOverlap = 1 
           multiProcessorCount = 30 
setiathome_CUDA: CUDA Device 1 specified, checking...
   Device 1: GeForce GTX 280 is okay
SETI@home using CUDA accelerated device GeForce GTX 280


Do you often enable/disable GPUs or SLI?
(Or did you add the second GeForce GTX 280 recently?)


 


 
ID: 1155852 · Report as offensive
Profile bloodrain
Volunteer tester
Avatar

Send message
Joined: 8 Dec 08
Posts: 231
Credit: 28,112,547
RAC: 1
Antarctica
Message 1155869 - Posted: 25 Sep 2011, 6:32:35 UTC - in response to Message 1078159.  

Yes, but you need more than one video card.
ID: 1155869 · Report as offensive
Profile BilBg
Volunteer tester
Avatar

Send message
Joined: 27 May 07
Posts: 3720
Credit: 9,385,827
RAC: 0
Bulgaria
Message 1155870 - Posted: 25 Sep 2011, 6:47:29 UTC - in response to Message 1155869.  

Yes, but you need more than one video card.

NO, you don't need "more than one video card" (to run "More than one WU per GPU at the same time").

(And TOM's question was answered long ago.)


 


 
ID: 1155870 · Report as offensive
Profile Edward Manteufel

Send message
Joined: 1 Jun 00
Posts: 27
Credit: 26,211,273
RAC: 11
United States
Message 1155996 - Posted: 25 Sep 2011, 15:40:53 UTC - in response to Message 1155870.  

Yeah, I recently switched from 9800 GTX+ cards, but I do use the integrated chip for the main monitor.
ID: 1155996 · Report as offensive
Profile BilBg
Volunteer tester
Avatar

Send message
Joined: 27 May 07
Posts: 3720
Credit: 9,385,827
RAC: 0
Bulgaria
Message 1156137 - Posted: 26 Sep 2011, 0:16:30 UTC - in response to Message 1155996.  


You forgot to follow Gundolf's suggestion:
post the first ~40 lines from the BOINC Event Log.

Press Ctrl+Shift+E (or find "Event Log" in the menus),
mark the first ~40 lines (click ... Shift+click),
then there should be a [Copy selected] button.

Without that info we can't think much about your problem.


 


 
ID: 1156137 · Report as offensive
Aaron

Send message
Joined: 19 Jan 11
Posts: 17
Credit: 226,460
RAC: 0
Australia
Message 1172649 - Posted: 21 Nov 2011, 5:37:51 UTC

I have an Nvidia GeForce GTS 240. It currently does 1 WU at a time. I have followed this thread to see how to increase that to 2 WUs at a time. I have tried to do it using Lunatics and changing the app_info file to .5, but my GPU still only does 1 WU at a time.
Why is this happening? Any help would be appreciated.
ID: 1172649 · Report as offensive



 
©2024 University of California
 
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.