Questions and Answers : Windows : Equipment safety
Rei Umali (Joined: 23 Aug 13, Posts: 7, Credit: 163,008, RAC: 0)
Hello, I just started a few minutes ago and am running my CPU at 100%; the GPU looks like it's at >70%. I'm running an i7 2600K at stock clock with H100i cooling and a GTX 550 Ti video card, in a Fractal Design case with 2 intake fans and the H100i fans for exhaust. After 16 minutes, CPU temps are at 60°C and the GPU temp is at 71°C. Should I throttle down CPU and GPU usage to keep my rig from blowing up? I would like to keep this rig (minus the GPU) for as long as I can. Thanks.
rob smith (Joined: 7 Mar 03, Posts: 22218, Credit: 416,307,556, RAC: 380)
Your rig is comfortably inside its temperature limits, so don't bother with throttling. Mine have been sitting at that sort of temperature for weeks; indeed, one of my GPUs has been at that sort of temperature for well over a year with no harm.

Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
Rei Umali (Joined: 23 Aug 13, Posts: 7, Credit: 163,008, RAC: 0)
Thanks for the quick reply. Instead of creating another thread, I figured I'd ask here... I have 19 SETI@home tasks with "Ready to report" status. Will they upload by themselves? I have 9 currently with "Running" status and a lot with "Ready to start" status. Is any more input/action needed from me? Thanks again.
Jord (Joined: 9 Jun 99, Posts: 15184, Credit: 4,362,181, RAC: 3)
> I have 19 SETI@home tasks with "Ready to report" status. Will they upload by themselves?

These are already uploaded. The upload & report system is twofold. Uploading is just moving data from a hard drive on your computer, via the internet, to a hard drive on the server at Seti. Reporting uses the database, and it's been found that reporting e.g. 25 tasks takes about as much overhead as reporting 1 task. By overhead, we mean server CPU, memory and disk usage. So it's better to report multiple tasks at once, which is what BOINC tries to do. Completed work is reported at the first of:

1) 24 hours before deadline.
2) Connect Every X before deadline.
3) 24 hours after task completion.
4) Immediately, if the upload completes later than either 1, 2, or 3 upon completion of the task.
5) On a trickle-up message.
6) On a trickle-down request.
7) On a server-scheduled connection. Used, but I am not certain by which project.
8) On a request for new work.
9) When the user pushes the Update button.
10) On a request from an account manager.
11) Immediately for every task, if "No new tasks" is set.
12) Immediately, if a CPU or network time-of-day override starts in the next 30 minutes. (BOINC 7.0)
13) When the minimum work buffer is reached. (BOINC 7.0)

> I have 9 currently with "Running" status and a lot with "Ready to start" status.

No action needed; BOINC is essentially set-and-forget software. As long as you run stock software (no third-party science applications) and have set BOINC to run based on preferences or always, it'll do just that (as long as the project has work and no server trouble). It doesn't need any manual input from the human; it works even better without human intervention. :-)
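The "reported at the first of" rule can be sketched as a simple predicate. This is a hypothetical simplification for illustration only: the function name, arguments and thresholds are made up, only a few of the conditions are modelled, and it is not the actual BOINC source.

```python
# Illustrative sketch of BOINC-style "report at the first of" logic.
# Not real BOINC code; names and the modelled conditions are assumptions.
from datetime import datetime, timedelta

def should_report(now, deadline, completed_at=None,
                  new_work_requested=False, update_pressed=False):
    """Decide whether completed tasks should be reported on this contact."""
    if update_pressed or new_work_requested:
        return True  # the user hit Update, or a work request goes out anyway
    if now >= deadline - timedelta(hours=24):
        return True  # within 24 hours of the deadline
    if completed_at is not None and now - completed_at >= timedelta(hours=24):
        return True  # 24 hours after the task completed
    return False  # otherwise wait, so several tasks get reported in one batch
```

So a freshly completed task with a far-off deadline simply waits, unless something else (an update, a work fetch) triggers a scheduler contact first.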
Rei Umali (Joined: 23 Aug 13, Posts: 7, Credit: 163,008, RAC: 0)
Awesome. One last question... maybe :) Which is better: a multi-processor rig (a used auction server) without a video card, or a desktop rig (i5) with a video card (GTX 660)?
OzzFan (Joined: 9 Apr 02, Posts: 15691, Credit: 84,761,841, RAC: 28)
Desktop rig with vid card. GPUs are far more efficient at SETI work than any CPU out there.
Rei Umali (Joined: 23 Aug 13, Posts: 7, Credit: 163,008, RAC: 0)
Interesting. I guess I'm investing in a few then. Thanks!
Jord (Joined: 9 Jun 99, Posts: 15184, Credit: 4,362,181, RAC: 3)
> GPUs are far more efficient at SETI work than any CPU out there.

LOL, that's not what Eric said:

29 21/08/2013 1:38:30 PM NVIDIA GPU 0: GeForce GTX 780 (driver version 32641, CUDA version 5050, compute capability 3.5, 3072MB, 4698 GFLOPS peak)

;-)
soft^spirit (Joined: 18 May 99, Posts: 6497, Credit: 34,134,168, RAC: 0)
Look at the top machines. Also realize people have done the math of credit per watt. GPU wins.

Janice
OzzFan (Joined: 9 Apr 02, Posts: 15691, Credit: 84,761,841, RAC: 28)
> GPUs are far more efficient at SETI work than any CPU out there.

Actually, nowhere in there did Eric say that GPUs weren't more efficient than CPUs. All he said was that GPUs aren't as efficient as the manufacturer's GFLOPS claims suggest.
Rei Umali (Joined: 23 Aug 13, Posts: 7, Credit: 163,008, RAC: 0)
Seems like my 550 Ti may have died. Only a few hours at 70°C. Look at how few points I've accumulated; that's all I've done and my 550 Ti gave up. I'll take a closer look tomorrow, perhaps; too tired (or lazy) from work.
Jord (Joined: 9 Jun 99, Posts: 15184, Credit: 4,362,181, RAC: 3)
> Actually, nowhere in there did Eric say that GPUs weren't more efficient than CPUs. All he said was the GPUs aren't as efficient at the manufacturer's GFLOPS claims.

Earlier:

> For many projects there seems to be no relationship between credit granted and work done, especially GPU-only projects. CUDA5 GPUs are about 2.5% efficient when running SETI@home, and that's actually not that bad for GPU code. If we assumed GPUs were 100% efficient, and had a GPU-only app, we could boost your credit rate by 40X. But I prefer that they mean something. :-)
OzzFan (Joined: 9 Apr 02, Posts: 15691, Credit: 84,761,841, RAC: 28)
> Actually, nowhere in there did Eric say that GPUs weren't more efficient than CPUs. All he said was the GPUs aren't as efficient at the manufacturer's GFLOPS claims.

Again, all that says is that the hardware is only about 2.5% as efficient as the GFLOPS rating suggests, not that GPUs are only 2.5% efficient in total. A GPU at 2.5% efficiency is still faster than any CPU out there because of the way GPUs handle math. If Eric were claiming what you suggest, and if your argument that GPUs aren't as efficient as CPUs were true, then the entire top 100 list would be dominated by quad-socket servers with 16-core beasts in them. But that's not what Eric was claiming.
Jord (Joined: 9 Jun 99, Posts: 15184, Credit: 4,362,181, RAC: 3)
> Again, all that says is that the hardware is only about 2.5% as efficient as the GFLOPS rating suggests, not that they are only 2.5% efficient in total. A GPU at 2.5% efficiency is still faster than any CPU out there due to the way GPUs are able to handle math.

Even at 2.5% efficiency, GPUs would be better than CPUs, because they are faster, have a crap-load more cores to hack at the data simultaneously, and have way faster memory. But only if the GPUs were even better, at 100% efficiency, would the credit increase 40 times. At the moment the code used to port the science application to the GPU is not efficient: it does not use all of the GPU's (almost unique) hardware to the fullest, and thus is not as efficient as the CPU, which is used to its fullest capability/capacity.

If you had a CPU with 1536 cores, it would be as good as any middle/higher-end GPU if all cores were used simultaneously. It's just that a CPU has 1, 2, 3, 4, maybe 6, maybe 8, at the top maybe 12 cores, hyper-threading excluded. And they don't use these cores through a multi-threaded application. Perhaps if Seti had an MT application, there'd be a better comparison. So right now it's always 1 core per task on the CPU versus all cores per task on the GPU, which is why the GPUs come out so much better. That doesn't mean the actual work they do is more efficient on a 100%-by-100% comparison, CPU vs GPU. In this case it's CPU = 100% efficient on a single core, versus GPU = 2.5% efficient across e.g. 1536 cores.

Nothing in the above (second) example has anything to do with the GFLOP value that the manufacturer gave its GPU.
Rei Umali (Joined: 23 Aug 13, Posts: 7, Credit: 163,008, RAC: 0)
Not that any of you seem to have cared, but reseating the vid card seems to have revived it. I need to do spring cleaning on my rig anyway... For what it's worth, y'all's arguing is giving me an idea of how everything works. If what y'all are claiming is true, has anyone been able to get a PS3 to work? I know some entities are using PS3s to crunch numbers for them.
OzzFan (Joined: 9 Apr 02, Posts: 15691, Credit: 84,761,841, RAC: 28)
If you wanna be pedantic about it:

> Even at 2.5% efficiency, GPUs would be better than CPUs. Because of their being faster, having a crap-load more cores to hack at the data simultaneously, way faster memory.

GPUs typically run at no more than 1GHz, whereas CPUs run anywhere from 1.6 to 4GHz+, but I know you didn't mean literal clock speed when saying GPUs are faster. GPUs may have more cores, but they're not directly comparable to CPU cores. They are actually highly specialized units that are very efficient at repetitive routines and functions.

> But if the GPUs would be even better, the 100% efficiency, only then would the credit increase 40 times. At the moment the op-code used to port the science application to run on the GPU is not efficient, it does not use all the GPUs their (almost unique) hardware to the fullest, thus is not as efficient as the CPU is, which is used to its fullest capability/capacity.

OK, obviously I have to clarify myself. When I said that GPUs are far more efficient, I meant that they are far more capable of doing the work when compared to CPUs, not a literal efficiency rating.

> If you had a CPU that had 1536 cores, it would be as good as any of the GPUs in the middle/higher end if all cores were used simultaneously. It's just that a CPU has 1, 2, 3, 4, maybe 6, maybe 8, and top maybe 12 cores.

Not even close. The cores of a CPU are designed to run a wide variety of code, whereas the cores of a GPU, as I said previously, are highly specialized math engines: imagine an FPU on crack. 1536 CPU cores would still not be able to match the performance of 1536 GPU cores, even at a 2.5% efficiency rating.

> Doesn't mean that the actual work they do is more efficient, rated on a 100% by 100% comparison, CPU vs GPU.

Actually, it does have to do with the manufacturer-given GPU GFLOP value. The manufacturer's GFLOP value is the maximum theoretical performance of the card, assuming a 100% efficiency rating. The fact that no card will ever come close to 100% efficiency, because of a multitude of other factors in card design, is evidence that many GPU applications don't actually reach that value. However, even at a relatively low efficiency rating of 2.5%, the fact that a GPGPU core is highly specialized still gives it the advantage.

Try not to be so literal next time. There really was no need to confuse the newbies asking for help in Q&A with pedantic details.
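The arithmetic behind this exchange is easy to check. A quick sketch, using the GTX 780 peak quoted in the log line earlier in the thread; the per-core CPU rate is an assumed illustrative value, not a measurement:

```python
# Back-of-the-envelope numbers for the efficiency argument above.
gpu_peak_gflops = 4698    # GTX 780 theoretical peak, from the quoted log line
gpu_efficiency = 0.025    # ~2.5% efficiency running SETI@home CUDA code
cpu_core_gflops = 25.0    # assumed sustained rate for one CPU core (illustrative)

gpu_effective = gpu_peak_gflops * gpu_efficiency   # realized GPU throughput
credit_boost = 1 / gpu_efficiency                  # factor if the app were 100% efficient

print(f"effective GPU rate: {gpu_effective:.2f} GFLOPS")          # 117.45
print(f"vs one CPU core:    {gpu_effective / cpu_core_gflops:.1f}x faster")
print(f"boost at 100% efficiency: {credit_boost:.0f}x")           # matches the "40X" figure
```

Even at 2.5% of peak, the card delivers on the order of a hundred GFLOPS, which is why it still outpaces a single general-purpose core.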
OzzFan (Joined: 9 Apr 02, Posts: 15691, Credit: 84,761,841, RAC: 28)
> If what y'all are claiming is true, has anyone been able to get a PS3 to work? I know some entities are using PS3s to crunch numbers for them.

Yes, that has been tried. The caveat is that you must load a compatible OS onto the PS3 to crunch, and Sony disabled the ability to run alternative OSes with one of their later firmware updates. So unless you have a PS3 that never received those more recent updates, it can't be used. Before that firmware change, the PS3 was a very powerful cruncher; people would set up entire farms of PS3s and get some pretty impressive RAC, though that was on Folding@home. The SETI@home/BOINC client wasn't mature enough to get the same kind of performance.
soft^spirit (Joined: 18 May 99, Posts: 6497, Credit: 34,134,168, RAC: 0)
SETI@home will give your graphics card and CPU a pretty good workout. They should be able to handle it, assuming your power supply is up to the job. You WILL be burning more electricity than you are accustomed to, but you can also have some fun doing it. And yes, keeping things clean, especially on air-cooled rigs, is a great idea. 70°C should not be an issue; 80°C is a good time to clean things up... and at 90°C, PANIC.

Note: this is my personal rule of thumb and does not represent the views of SETI staff, nor Nvidia, nor any manufacturer. Advice is guaranteed to be worth what you paid for it.

Janice
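That rule of thumb is simple enough to write down. A toy sketch; the thresholds are the poster's personal ones (in °C), not official guidance from any manufacturer, and the function name is made up:

```python
# Toy encoding of the temperature rule of thumb above.
# Thresholds are one user's personal guideline, not official specs.
def temp_advice(temp_c: float) -> str:
    if temp_c >= 90:
        return "panic"           # stop crunching and investigate now
    if temp_c >= 80:
        return "clean the rig"   # dust out fans, heatsinks and filters
    return "fine"                # normal range under full crunching load

print(temp_advice(71))  # -> fine (the GPU temperature from the first post)
```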
Rei Umali (Joined: 23 Aug 13, Posts: 7, Credit: 163,008, RAC: 0)
My processor is liquid cooled with the H100i. The vid card is air cooled. No idea why I had to reseat the vid card, other than dust accumulating. The power supply should be enough; I believe it's 650W.
©2024 University of California
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.