Message boards :
Number crunching :
GPU Wars 2016: Pascal vs Polaris
KLiK Send message Joined: 31 Mar 14 Posts: 1304 Credit: 22,994,597 RAC: 60 |
[quote]It is not the 1050 fan making noise, it is the main fan of my HP Pavilion 500-152ea Desktop. For some reason, the GTX 1050 is using more power than the GTX 750 OC it had before.[/quote] Maybe your card has a smaller fan, so that might be the reason?! My 1050 Ti has a 90mm fan, so it's quiet enough... ;) non-profit org. Play4Life in Zagreb, Croatia, EU |
tullio Send message Joined: 9 Apr 04 Posts: 8797 Credit: 2,930,782 RAC: 1 |
Temperature of my GPU is always 50 °C or lower, and its fan is at 30%. It is not the GPU fan making noise, but the rear panel fan. Tullio |
jason_gee Send message Joined: 24 Nov 06 Posts: 7489 Credit: 91,093,184 RAC: 0 |
True that. Since I installed the 1050ti, I can hear the Corsair PSU and Noctua CPU fan.... also mice crawling in the walls :/ "Living by the wisdom of computer science doesn't sound so bad after all. And unlike most advice, it's backed up by proofs." -- Algorithms to live by: The computer science of human decisions. |
tullio Send message Joined: 9 Apr 04 Posts: 8797 Credit: 2,930,782 RAC: 1 |
My old SUN workstation has an open front panel like a car radiator and never overheats. I hear no fans. Most modern PCs have a closed front panel and only side windows, and they overheat. Tullio |
tullio Send message Joined: 9 Apr 04 Posts: 8797 Credit: 2,930,782 RAC: 1 |
I took off the side panel of the PC and things seem to be better. Maybe it was overheating since I installed the GTX 1050 which seems to use more power than the GTX 750 OC it replaced. Tullio |
AMDave Send message Joined: 9 Mar 01 Posts: 234 Credit: 11,671,730 RAC: 0 |
♦  AMD Vega handles Doom like a champ
|
Grant (SSSF) Send message Joined: 19 Aug 99 Posts: 13736 Credit: 208,696,464 RAC: 304 |
[quote]♦  AMD Vega handles Doom like a champ[/quote]
Grant Darwin NT |
Dr Grey Send message Joined: 27 May 99 Posts: 154 Credit: 104,147,344 RAC: 21 |
Hopefully that will put some price pressure on the 1080 Ti launch. |
AMDave Send message Joined: 9 Mar 01 Posts: 234 Credit: 11,671,730 RAC: 0 |
|
Al Send message Joined: 3 Apr 99 Posts: 1682 Credit: 477,343,364 RAC: 482 |
Interesting, thanks for the link. Well, that makes me feel pretty good about my upcoming (eventual) farm additions, because they are all running at PCI-E 1.1 x16, so it looks like I can run pretty much any GPU and not have the bus getting seriously in the way. Now, I will be running the fastest CPU that the boards can take to feed them, but it isn't nearly as bleak as I had been fearing. Good news, thanks again! |
Grant (SSSF) Send message Joined: 19 Aug 99 Posts: 13736 Credit: 208,696,464 RAC: 304 |
[quote]Interesting, thanks for the link. Well, that makes me feel pretty good about my upcoming (eventual) farm additions, because they are all running at PCI-E 1.1 x16, so it looks like I can run pretty much any GPU and not have the bus getting seriously in the way. Now, I will be running the fastest CPU that the boards can take to feed them, but it isn't nearly as bleak as I had been fearing. Good news, thanks again![/quote] Keep in mind all of those benchmarks are for video games; there aren't any computing benchmarks there. PCIe v1 x16 certainly won't be an issue, but running high-performance cards in PCIe v2 x1 slots may be. My GTX 1070s running SoG with aggressive settings have Bus Interface peaks of up to 20% (generally it's around 14-17%) on a PCIe v2.0 x16 bus running at x8. Whether it's 20% of x16 speed or 20% of x8 speed, I don't know. CUDA50 often gave 0%, with the odd 1-2% blip. I suspect TBar's Linux CUDA special has a much higher bus load; the more work the GPU does, the more it has to communicate with the CPU.

So 20% of PCIe v2 speeds gives:
x1: 100MB/s
x4: 400MB/s
x8: 800MB/s
x16: 1.6GB/s

Even hoping for the best (my 20% bus load is at the x8 speed, i.e. 800MB/s), PCIe v2 x1 (500MB/s) would impact the throughput of my GTX 1070. Whether that would result in significantly longer crunch times and/or increased CPU usage I've no idea. The only way to find out is for someone to do the tests. Ideally with the most extreme load (a Pascal Titan X), or even just a GTX 1080: run it with the most aggressive settings possible in an x8 or x16 PCIe v2 or v3 slot and monitor the run times, bus load and CPU load. Then run it in a PCIe v2 x1 slot (or even PCIe v1 x1 if possible) with the same settings and compare run times, bus usage and CPU load for equivalent-runtime WUs.
My personal wild arse guess is that for mid-range cards the effect on processing times will be minimal, but for the highest of the high end it will make a significant increase in run times (although the absolute run times will still be impressive). The more powerful the hardware, and the more efficient (i.e. faster) the application, the greater the effect of any bottlenecks.

PCIe Throughput
Version | x1 | x4 | x8 | x16
1.0 | 250MB/s | 1GB/s | 2GB/s | 4GB/s
2.0 | 500MB/s | 2GB/s | 4GB/s | 8GB/s
3.0 | 984.6MB/s | 3.938GB/s | 7.877GB/s | 15.754GB/s
4.0 | 1.969GB/s | 7.877GB/s | 15.754GB/s | 31.508GB/s

Grant Darwin NT |
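The bandwidth arithmetic in the post above can be sketched in a few lines of Python. This is just an illustration, not a measurement tool: the per-lane figures are the rounded values from the table, and the 20% bus load and x8 link width are the example numbers from the post, not universal constants.

```python
# Per-lane usable PCIe throughput in MB/s (rounded, matching the table above).
PER_LANE_MBPS = {"1.0": 250, "2.0": 500, "3.0": 985, "4.0": 1969}

def link_bandwidth(version, lanes):
    """Total usable bandwidth of a PCIe link in MB/s."""
    return PER_LANE_MBPS[version] * lanes

# Example from the post: a GTX 1070 showing ~20% Bus Interface load
# on a PCIe v2 link running at x8.
observed_traffic = 0.20 * link_bandwidth("2.0", 8)   # 0.20 * 4000 = 800 MB/s

# What a PCIe v2 x1 slot can actually carry:
x1_limit = link_bandwidth("2.0", 1)                  # 500 MB/s

print(f"Observed traffic: ~{observed_traffic:.0f} MB/s")
print(f"PCIe v2 x1 limit:  {x1_limit} MB/s")
# 800 MB/s of traffic exceeds the 500 MB/s an x1 slot offers, so the
# bus would become the bottleneck, which is the post's conclusion.
```

The same function reproduces the "20% of PCIe v2 speeds" list: `0.20 * link_bandwidth("2.0", n)` for n = 1, 4, 8, 16 gives 100, 400, 800 and 1600 MB/s.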
©2024 University of California
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.