Multiple Video Cards

Questions and Answers : Windows : Multiple Video Cards

dlgray

Joined: 24 Jul 08
Posts: 3
Credit: 8,725,678
RAC: 1
United States
Message 1760131 - Posted: 29 Jan 2016, 0:16:13 UTC

I want to add an additional video card for greater computational ability and get SETI to use it. Does anyone know how to do this?
BilBg
Volunteer tester
Joined: 27 May 07
Posts: 3720
Credit: 9,385,827
RAC: 0
Bulgaria
Message 1760169 - Posted: 29 Jan 2016, 2:14:24 UTC - in response to Message 1760131.  
Last modified: 29 Jan 2016, 2:33:43 UTC

Does anyone know how to do this?

Not enough info (you don't need "advice" like "buy the card and put it in the computer", right?)

http://setiathome.berkeley.edu/show_host_detail.php?hostid=6191919

The best place to ask complex hardware questions is the Number Crunching forum.

Be prepared to give info like:
Your PSU - exact model or at least Watts, brand, ... (all you know about it)
Motherboard - exact model

If you pair the NVIDIA Quadro 4000 with the same card, BOINC will use both automatically.
If the second GPU/card is different, there is a setting to tell BOINC to use both GPUs:

cc_config.xml
<use_all_gpus>1</use_all_gpus>

(I think you may pair it with some GeForce, ideally the same or a similar generation as your NVIDIA Quadro 4000; if you want to pair it with an ATI/AMD card it will be more problematic, though possible)
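
A minimal cc_config.xml with just this option would look something like this (only a sketch, assuming you don't need any other options in there):

<cc_config>
  <options>
    <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config>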
 
 


- ALF - "Find out what you don't do well ..... then don't do it!" :)
 
Alienmoon
Joined: 14 Oct 13
Posts: 14
Credit: 386,618
RAC: 0
United Kingdom
Message 1760348 - Posted: 29 Jan 2016, 13:23:31 UTC - in response to Message 1760169.  

cc_config.xml
<use_all_gpus>1</use_all_gpus>



Thank you for that bit of information, mine was on "0" for some reason, sorted now.

Thanks again..

Gary
How can we introduce an Alien Race to the people of Earth, Without the power-hungry Governments of this Planet fighting for control & Technology? all because people Fear what they do NOT Understand!
Jord
Volunteer tester
Joined: 9 Jun 99
Posts: 15184
Credit: 4,362,181
RAC: 3
Netherlands
Message 1760355 - Posted: 29 Jan 2016, 13:59:24 UTC - in response to Message 1760348.  

cc_config.xml isn't available by default under BOINC; you have to make it by hand, or add a debug flag through the Event Log Options window, or set an exclusive CPU or GPU application, for a full cc_config.xml file to be written.

When it's written by BOINC Manager, all the values in it will be the default values. <use_all_gpus>0</use_all_gpus> is the default for this setting, because BOINC will only use the most capable GPU in the system by default. If all GPUs are the same, with the same compute capability, software version, available memory and speed, it'll use all of them.

If there are two of the same GPUs in the system but their compute capability, software version, available memory or speed differ, they're seen as two different GPUs and only the more capable of the two is used.
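
If you want to see what BOINC decides about your GPUs at startup, you can also enable the coproc_debug flag in the log_flags section, which logs extra detail about GPU detection in the Event Log. A sketch (assuming I remember the flag name correctly):

<cc_config>
  <log_flags>
    <coproc_debug>1</coproc_debug>
  </log_flags>
  <options>
    <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config>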
dlgray

Joined: 24 Jul 08
Posts: 3
Credit: 8,725,678
RAC: 1
United States
Message 1760392 - Posted: 29 Jan 2016, 15:54:51 UTC - in response to Message 1760169.  
Last modified: 29 Jan 2016, 15:55:32 UTC

Thanks,

You answered my question perfectly. I created the cc_config.xml file and saved it as a text file with the extension xml. One other question though: inside the xml file, does the statement <use_all_gpus>1</use_all_gpus> have the brackets at the ends? I have not seen brackets used like that before.
Jord
Volunteer tester
Joined: 9 Jun 99
Posts: 15184
Credit: 4,362,181
RAC: 3
Netherlands
Message 1760397 - Posted: 29 Jan 2016, 16:13:57 UTC - in response to Message 1760392.  
Last modified: 29 Jan 2016, 16:16:59 UTC

Your cc_config.xml file needs to have all of these entries:
<cc_config>
  <log_flags>
  </log_flags>
  <options>
    <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config>

When writing it by hand it is best done in Notepad or another plain-text editor, saving with "Save as type" set to "All Files" and the name cc_config.xml, in ANSI encoding (not Unicode or UTF-8). It needs to go in the BOINC data directory, by default a hidden directory at C:\ProgramData\BOINC\

Some will argue that the <log_flags></log_flags> pair isn't really needed in the above, but I always add it to teach people what the order of things is in the client configuration file.

Afterwards, to get it to be used, you need to exit BOINC completely and restart it. GPU detection decisions are only made at BOINC startup.

Any square or other brackets used on the client configuration manual page in the Wiki are only there to show that a setting is optional. Brackets of any sort aren't used in the file itself. Tag/flag names start with the less-than '<' sign and end with the greater-than '>' sign.
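
For example, where the manual page shows the optional parts of <exclude_gpu> in square brackets, the actual file would just contain the tags with no brackets at all. A purely hypothetical sketch (the project URL and device number here are made up for illustration; you don't need this setting for your setup):

<cc_config>
  <options>
    <exclude_gpu>
      <url>http://setiathome.berkeley.edu/</url>
      <device_num>1</device_num>
    </exclude_gpu>
    <use_all_gpus>1</use_all_gpus>
  </options>
</cc_config>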
dlgray

Joined: 24 Jul 08
Posts: 3
Credit: 8,725,678
RAC: 1
United States
Message 1760785 - Posted: 30 Jan 2016, 17:17:17 UTC - in response to Message 1760397.  

I finally got both GPUs up and running and was surprised by BOINC's compute rating. For the new Quadro 4000 it shows a compute capability of 2.0, and for my old Quadro 600 a slightly higher rating of 2.1. I would have thought the 4000 would run circles around the 600. My monitor is connected to the 4000 and I wonder if that is why.
BilBg
Volunteer tester
Joined: 27 May 07
Posts: 3720
Credit: 9,385,827
RAC: 0
Bulgaria
Message 1761189 - Posted: 1 Feb 2016, 1:49:35 UTC - in response to Message 1760785.  
Last modified: 1 Feb 2016, 2:13:09 UTC

For the new Quadro 4000 it shows a compute capability of 2.0, and for my old Quadro 600 a slightly higher rating of 2.1

This is "Compute Capability" and is not "rating" (i.e. do not show how fast GPU computes but what it can do)

The SETI@home cuda50 app shows the same (computeCap 2.0 vs 2.1):
http://setiathome.berkeley.edu/result.php?resultid=4689074722
setiathome_CUDA: Found 2 CUDA device(s):
nVidia Driver Version 354.56
  Device 1: Quadro 4000, 2048 MiB, regsPerBlock 32768
     computeCap 2.0, multiProcs 8 
     pciBusID = 1, pciSlotID = 0
     clockRate = 950 MHz
  Device 2: Quadro 600, 1024 MiB, regsPerBlock 32768
     computeCap 2.1, multiProcs 2 
     pciBusID = 2, pciSlotID = 0
     clockRate = 1280 MHz


(the speed depends on multiProcs and clockRate)
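
As a rough back-of-the-envelope estimate (assuming the usual Fermi core counts of 32 CUDA cores per multiprocessor for computeCap 2.0 and 48 per multiprocessor for computeCap 2.1):

  Quadro 4000:  8 multiProcs × 32 cores × 0.95 GHz ≈ 243
  Quadro 600:   2 multiProcs × 48 cores × 1.28 GHz ≈ 123

So despite the lower computeCap number, the Quadro 4000 should still be roughly twice as fast at crunching.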

First column here:
https://en.wikipedia.org/wiki/CUDA#Supported_GPUs

In "Feature support" the column is "2.x" so probably not much difference on "Features" between computeCap 2.0 and computeCap 2.1
https://en.wikipedia.org/wiki/CUDA#Version_features_and_specifications

The only differences are in the last table, "Architecture specifications".
 
 


- ALF - "Find out what you don't do well ..... then don't do it!" :)
 
