How do I control usage on GPUs?

Sverre

Joined: 28 Nov 13
Posts: 8
Credit: 11,250,926
RAC: 0
Norway
Message 1902875 - Posted: 25 Nov 2017, 15:05:10 UTC

I have two GTX 1080 Ti cards, and letting BOINC use 100% of them 24/7 will kill my budget. So I need to reduce the percentage used on them, but no one I've talked to seems to know how to do it. If there is no way either to prevent BOINC from using the GPUs or to limit their use, I'll just have to uninstall it.
Jord
Volunteer tester
Joined: 9 Jun 99
Posts: 15184
Credit: 4,362,181
RAC: 3
Netherlands
Message 1902879 - Posted: 25 Nov 2017, 15:17:20 UTC - in response to Message 1902875.  

No, there's no way to limit only GPU use; GPUs run at 100% or 0%.
But that doesn't mean you cannot set BOINC to run only when the computer is idle.
Or you can run BOINC only while you're at the keyboard and exit it when you leave.
Via a configuration file you can even tell BOINC to use only one GPU and ignore the other (a sketch follows this post).

I run Seti on my GPU only at weekends, and even then not all the time.
It won't run when I start to game; it auto-pauses when my games start.

There are enough options for that. Just ask.
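
A minimal sketch of the configuration-file approach mentioned above, using the BOINC client's cc_config.xml exclude_gpu option to keep one card out of SETI@home work. The project URL and device number shown here are assumptions to adjust for your own setup (check the GPU order in BOINC's event log and the project URL listed in BOINC Manager):

<cc_config>
  <options>
    <!-- keep device 1 (the second card) out of SETI@home work only -->
    <exclude_gpu>
      <url>http://setiathome.berkeley.edu/</url>
      <device_num>1</device_num>
    </exclude_gpu>
  </options>
</cc_config>

Save it in the BOINC data directory and restart the client so it is read.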
OzzFan
Volunteer tester
Joined: 9 Apr 02
Posts: 15691
Credit: 84,761,841
RAC: 28
United States
Message 1902939 - Posted: 26 Nov 2017, 1:17:31 UTC - in response to Message 1902875.  

Or, failing Ageless' suggestions, there's still no need to uninstall BOINC just because you can't control how much your GPUs are used. You could just turn off GPU processing and use your CPUs instead. No need to throw the baby out with the bathwater if you really want to participate.
Sverre

Joined: 28 Nov 13
Posts: 8
Credit: 11,250,926
RAC: 0
Norway
Message 1904106 - Posted: 1 Dec 2017, 14:52:40 UTC

Neither of the two answers is satisfactory for me, sadly. Or are you talking about CPUs?

GTX 1080 Ti cards are monsters: they eat power, and they burn themselves out if used too heavily for too long. Having two of these, at a cost of well over $2,000, plus an electricity bill that skyrockets, is just too expensive. If those who contribute to BOINC don't understand that, then it's just impossible for me to continue sharing my PCs, which I have done for 4 or 5 years on this account. It's not a big blow for Seti@Home, but it might turn out to be a problem if people with expensive graphics cards start to understand how much faster their GPUs deteriorate under such a high and constant load.
OzzFan
Volunteer tester
Joined: 9 Apr 02
Posts: 15691
Credit: 84,761,841
RAC: 28
United States
Message 1904122 - Posted: 1 Dec 2017, 15:30:23 UTC - in response to Message 1904106.  
Last modified: 1 Dec 2017, 15:43:44 UTC

I don't understand. You're saying you don't want to pay for crunching on your GPU because it is expensive. My suggestion was to simply turn off GPU crunching and just use your CPUs. Your first statement says that my answer is unsatisfactory. I don't understand what you want then. Do you want to continue contributing to SETI@home using your CPUs or not?


Also, electronics in use don't "deteriorate". That's sadly a commonly stated myth. People have been using their GPUs on this project since GPU crunching was introduced. Yes, the GPUs produce a lot of heat and increase electricity consumption. As long as the components are properly cooled and within operating spec, in no way does crunching on a GPU make the GPU less effective any more than using your CPU for crunching makes it less effective at performing normal operations. That's simply not how modern electronics work.
Sverre

Joined: 28 Nov 13
Posts: 8
Credit: 11,250,926
RAC: 0
Norway
Message 1904132 - Posted: 1 Dec 2017, 15:44:55 UTC - in response to Message 1904122.  

I don't understand. You're saying you don't want to pay for crunching on your GPU because it is expensive. My suggestion was to simply turn off GPU crunching and just use your CPUs. Your first statement says that my answer is unsatisfactory. I don't understand what you want then. Do you want to continue contributing to SETI@home using your CPUs or not?


Also, electronics in use don't "deteriorate". That's sadly a commonly stated myth. People have been using their GPUs on this project since GPU crunching was introduced. Yes, the GPUs produce a lot of heat and increase electricity consumption. But in no way does crunching on a GPU make the GPU less effective any more than using your CPU for crunching makes it less effective at performing normal operations. That's simply not how modern electronics work.


I have found nowhere, in the app, where I can turn GPU usage off. I can put it on pause, or whatever, but next morning they're running 100%.

As an electronic engineer I have to disagree with you on the deteriorating part. GPUs and CPUs that are squeezed for all they can give (i.e. overclocked) will deteriorate. Of course, there are people who buy state-of-the-art CPUs and GPUs without trying to overclock, but I'm not among them. I could turn the overclocking off when it's not needed, but I have a lousy short-term memory, and a reboot (which happens when Windows Update runs) turns it back on.

The thing is that I do not understand why no one has implemented the possibility to adjust GPU usage in the same way as CPU usage. It's like those who develop the app are living in a Unix world of the seventies, where graphics was something that came out on the typewriter.
Mr. Kevvy
Volunteer moderator
Volunteer tester
Joined: 15 May 99
Posts: 3776
Credit: 1,114,826,392
RAC: 3,319
Canada
Message 1904137 - Posted: 1 Dec 2017, 15:55:56 UTC - in response to Message 1904132.  
Last modified: 1 Dec 2017, 16:03:54 UTC

I have found nowhere, in the app, where I can turn GPU usage off.


Of course... this is because that isn't where it is. Click on your profile top-right, go into Preferences for this project, click Edit preferences and uncheck the Use * GPU boxes. They will never re-enable on their own.

Edit: I've also lost three video cards so far out of over twenty since I started, and all of them were under warranty and RMAed at no cost to me but shipping. This is running them at 100% capacity or as much as possible 24x7 for many years. Quality kit can stand the usage.
OzzFan
Volunteer tester
Joined: 9 Apr 02
Posts: 15691
Credit: 84,761,841
RAC: 28
United States
Message 1904138 - Posted: 1 Dec 2017, 15:56:44 UTC - in response to Message 1904132.  
Last modified: 1 Dec 2017, 15:57:55 UTC

I have found nowhere, in the app, where I can turn GPU usage off. I can put it on pause, or whatever, but next morning they're running 100%.


Did you look in your online preferences at all?

As an electronic engineer I have to disagree with you on the deteriorating part. GPUs and CPUs that are squeezed for all they can give (i.e. overclocked) will deteriorate.


While it's great you're an electronic engineer, that doesn't immediately qualify you as an expert on the subject. Do you have any case studies supporting your view, with peer-reviewed data that I can take a look at?
OzzFan
Volunteer tester
Joined: 9 Apr 02
Posts: 15691
Credit: 84,761,841
RAC: 28
United States
Message 1904151 - Posted: 1 Dec 2017, 16:15:56 UTC - in response to Message 1904137.  

This is running them at 100% capacity or as much as possible 24x7 for many years. Quality kit can stand the usage.


* Provided adequate cooling within the manufacturer's specs.
Mr. Kevvy
Volunteer moderator
Volunteer tester
Joined: 15 May 99
Posts: 3776
Credit: 1,114,826,392
RAC: 3,319
Canada
Message 1904154 - Posted: 1 Dec 2017, 16:25:51 UTC - in response to Message 1904151.  

* Provided adequate cooling within the manufacturer's specs.


Aye... I don't overclock. They were designed for a certain clock and temperature spec, so I defer to the expertise of the designers. RMAing is a pain, so I'd rather they lasted.
Sverre

Joined: 28 Nov 13
Posts: 8
Credit: 11,250,926
RAC: 0
Norway
Message 1904170 - Posted: 1 Dec 2017, 16:52:59 UTC - in response to Message 1904138.  

As an electronic engineer I have to disagree with you on the deteriorating part. GPUs and CPUs that are squeezed for all they can give (i.e. overclocked) will deteriorate.


While it's great you're an electronic engineer, that doesn't immediately qualify you as an expert on the subject. Do you have any case studies supporting your view, with peer-reviewed data that I can take a look at?


Are there any case studies the other way? I follow a few overclocking forums and see this warning from people I consider experts in the field. In addition, as an engineer (since 1981), I know that all electronic equipment will deteriorate over time, and if you squeeze more out of it than the manufacturer has guaranteed, even if it stays well inside the temperature limits (water cooling takes care of that part), it still puts strain on it. Also, I don't expect software I install to use all possible resources without even a tiny little notice. Something like "This program will use every last bit of juice it can squeeze out of your GPU, be warned (P.S. this can be adjusted in X, where you can tick/adjust/whatever Y)!" (the CPU use is self-explanatory; that's why Seti@Home wants our help). I have been overclocking for several years (running BOINC at the same time), but had no problem before the GTX 1080 Ti came out. I noticed it one morning, when the computer room was well above 30°C.
rob smith
Volunteer moderator
Volunteer tester
Joined: 7 Mar 03
Posts: 22160
Credit: 416,307,556
RAC: 380
United Kingdom
Message 1904186 - Posted: 1 Dec 2017, 18:37:06 UTC

A few comments about the use/abuse of GPUs.
All chipset designers (these days at least) take great care to specify the thermal conditions under which the chipset will operate "safely". If you decide to operate the chipset outside those conditions, then be it on your own head and nobody else's if the chipset fails early.
Overclocking will increase power consumption and thus operating temperature.
What you probably don't, or refuse to, understand is that the vast majority of so-called "serious overclockers" are gamers, and as such their GPUs only run at full load in short bursts; SETI@Home is designed to utilise GPUs to the maximum. I question why you bought such expensive GPUs and then only want to run them at reduced load while at the same time trying to push their performance to the ultimate.
A final comment: nobody within the project itself has asked, let alone demanded, that you push your power bill beyond what you can afford.
Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
David@home
Volunteer tester
Joined: 16 Jan 03
Posts: 755
Credit: 5,040,916
RAC: 28
United Kingdom
Message 1904201 - Posted: 1 Dec 2017, 20:15:09 UTC

In the client config file you can turn off GPU usage:

<ignore_nvidia_dev>N</ignore_nvidia_dev>
Ignore (don't use) a specific NVIDIA GPU. You can ignore more than one. Requires a client restart.
Example: <ignore_nvidia_dev>0</ignore_nvidia_dev> will ignore the first NVIDIA GPU in the system.
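
For context, this option goes inside the <options> section of cc_config.xml in the BOINC data directory. A minimal sketch, assuming a two-card system where only the first GPU should be used:

<cc_config>
  <options>
    <!-- ignore the second NVIDIA GPU; numbering follows the order in BOINC's startup messages -->
    <ignore_nvidia_dev>1</ignore_nvidia_dev>
  </options>
</cc_config>

As noted above, the client has to be restarted before the change takes effect.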
Sverre

Joined: 28 Nov 13
Posts: 8
Credit: 11,250,926
RAC: 0
Norway
Message 1904207 - Posted: 1 Dec 2017, 20:29:13 UTC - in response to Message 1904186.  

A few comments about the use/abuse of GPUs.
All chipset designers (these days at least) take great care to specify the thermal conditions under which the chipset will operate "safely". If you decide to operate the chipset outside those conditions, then be it on your own head and nobody else's if the chipset fails early.
Overclocking will increase power consumption and thus operating temperature.
What you probably don't, or refuse to, understand is that the vast majority of so-called "serious overclockers" are gamers, and as such their GPUs only run at full load in short bursts; SETI@Home is designed to utilise GPUs to the maximum. I question why you bought such expensive GPUs and then only want to run them at reduced load while at the same time trying to push their performance to the ultimate.
A final comment: nobody within the project itself has asked, let alone demanded, that you push your power bill beyond what you can afford.


WOW! Are you actually attacking me, verbally?

I am not saying I keep within the limits of the warranty.

Yes and no: OC'ing is for gaming, but OC'ing is also for fun, to see what you can achieve. I use my iPad most of the time; the PC is for gaming, or Word/Excel/etc. But since I wanted to contribute to a project I find interesting, I thought that all the time my PC isn't in use it could be used by Seti@Home. What do you mean by "I question why you bought such expensive GPUs and then only want to run them at reduced load while at the same time trying to push their performance to the ultimate."? Your final comment is kind of strange. Did I ever demand or ask that? Of course not. What I'm saying is that it is incredibly strange that an app used for something as important as this isn't better developed. Instead of getting answers like "Yeah, I hear you man!", something the majority of users (who are the ones Seti@Home depends on, not the few hundred who have several PCs doing nothing but Seti@Home work) would say, I get answers (this does not include all of those who answered) from people who sound like they never leave the campus, who haven't used a PC (unless it runs Linux, preferably an old version, without graphics), and who agreed with IBM, when they developed the PC, that graphics was of no interest, since PCs were meant for spreadsheets and word processing.
Sverre

Joined: 28 Nov 13
Posts: 8
Credit: 11,250,926
RAC: 0
Norway
Message 1904208 - Posted: 1 Dec 2017, 20:29:43 UTC - in response to Message 1904201.  

In the client config file you can turn off GPU usage:

<ignore_nvidia_dev>N</ignore_nvidia_dev>
Ignore (don't use) a specific NVIDIA GPU. You can ignore more than one. Requires a client restart.
Example: <ignore_nvidia_dev>0</ignore_nvidia_dev> will ignore the first NVIDIA GPU in the system.


Thanks!
OzzFan
Volunteer tester
Joined: 9 Apr 02
Posts: 15691
Credit: 84,761,841
RAC: 28
United States
Message 1904256 - Posted: 1 Dec 2017, 23:34:53 UTC - in response to Message 1904170.  
Last modified: 1 Dec 2017, 23:38:36 UTC

Are there any case studies the other way?


So I'll take that as a no and you're getting your information from unreliable sources.

I follow a few overclocking forums and see this warning from people I consider experts in the field.


Ah, I see I was right. People on the internet, even enthusiasts and hobbyists, are not considered experts in the field. Sounds like you've set the bar too low.

In addition, as an engineer (since 1981), I know that all electronic equipment will deteriorate over time, and if you squeeze more out of it than the manufacturer has guaranteed, even if it stays well inside the temperature limits (water cooling takes care of that part), it still puts strain on it.


That's absolutely and irrevocably not true. People right here at this project alone have been using CPUs since 1999 and GPUs since 2008/2009, and the only time something has gone wrong is when something got too hot and cooked itself. No one's system has deteriorated over time. My original AMD K6 233MHz CPU, Pentium 233MMX CPU, and various other CPUs I used to crunch on for years (but no longer do because I can't afford the electricity) all still perform the same as when I first bought them. These CPUs ran at 100% for almost 10 years before they were retired.

Also, I don't expect software I install to use all possible resources without even a tiny little notice. Something like "This program will use every last bit of juice it can squeeze out of your GPU, be warned (P.S. this can be adjusted in X, where you can tick/adjust/whatever Y)!" (the CPU use is self-explanatory; that's why Seti@Home wants our help).


That would be unnecessarily alarmist and unhelpful. And you should have expected SETI@home to consume all possible resources when you installed it; that's what it has always been touted as doing. Because it is so good at consuming all resources, many people use it for stress testing. Some even use it to test for solid overclocks.

I have been overclocking for several years (running BOINC at the same time), but had no problem before the GTX 1080 Ti came out. I noticed it one morning, when the computer room was well above 30°C.


Well yeah, overclocked 1080 Tis are going to have problems, considering how hot they run at base speeds.

By the way, my first overclock was running a 486SX 25MHz CPU at 33MHz back in 1992. I'm on an internet forum and I'm a hobbyist. Why am I not considered an expert in the field according to your standard?
OzzFan
Volunteer tester
Joined: 9 Apr 02
Posts: 15691
Credit: 84,761,841
RAC: 28
United States
Message 1904257 - Posted: 1 Dec 2017, 23:37:27 UTC - in response to Message 1904208.  

In the client config file you can turn off GPU usage:

<ignore_nvidia_dev>N</ignore_nvidia_dev>
Ignore (don't use) a specific NVIDIA GPU. You can ignore more than one. Requires a client restart.
Example: <ignore_nvidia_dev>0</ignore_nvidia_dev> will ignore the first NVIDIA GPU in the system.


Thanks!


Except this is the hard way of turning off GPU usage. Most everyone else uses the project preferences: simply uncheck all the GPU types, click Save at the bottom, then go into the BOINC app and perform an update. Note that this would need to be done at every project you use (if applicable).
OzzFan
Volunteer tester
Joined: 9 Apr 02
Posts: 15691
Credit: 84,761,841
RAC: 28
United States
Message 1904263 - Posted: 1 Dec 2017, 23:49:09 UTC - in response to Message 1904207.  

Your final comment is kind of strange. Did I ever demand or ask that? Of course not. What I'm saying is that it is incredibly strange that an app used for something as important as this isn't better developed.


The app is developed exactly as it is supposed to be. Don't you think that if you were on the right side of the technical argument, this would have been resolved long ago after people complained? Do you really think that you're the first to challenge the resource consumption of SETI@home? Don't you think that if there were any merit to your argument, the developers would have changed the code long ago?

Instead of getting answers like "Yeah, I hear you man!",


The fact that you are not getting those types of responses should tell you something. I know it's a typical human reaction, when met with opposition, to double down and stand your ground, and to cite your own credentials and those of others to shore up your credibility, but there's an entire sub-forum called Number Crunching that is dedicated to getting the most out of your system's crunching performance, including overclocking and running multiple GPUs, and to keeping it all cooled.

If you want a forum of experts (and I use that term respectfully but loosely), there you have it. I'm sure if systems actually deteriorated in performance, someone would have discovered it long ago and done something about it.

I know it runs contrary to a belief you hold - and you can still disable your GPU if you wish, no one's going to stop you - but you are wrong about the deterioration aspect. There's plenty of evidence in that Number Crunching forum to support what I'm saying.
