Raspberry Pi 3 vs GPU, what's best?

Anders Kihle

Joined: 27 Mar 16
Posts: 4
Credit: 162,948
RAC: 0
Norway
Message 1785966 - Posted: 8 May 2016, 22:39:52 UTC

Hello, I always want to help in research and put my computer power to use.
Now I want to upgrade to something better, like a cluster or a GPU rig.

But I don't know too much about this.

So, considering price and output per watt/cost, what's best to use:
Raspberry Pi 3s or GPUs?

Thanks for any answer :)
ID: 1785966
rob smith
Volunteer moderator
Volunteer tester
Joined: 7 Mar 03
Posts: 22202
Credit: 416,307,556
RAC: 380
United Kingdom
Message 1786008 - Posted: 9 May 2016, 2:31:31 UTC

A Raspberry Pi cluster wins on the "fun" side, but, given that a GTX 750 will complete a task in about half an hour compared to a day or more for a single Pi, it would have to be a large cluster to get anywhere near that sort of throughput.
Cost-wise a GTX 750 costs about 100 GBP, compared to 30 GBP for a Pi, and in energy terms the GTX 750 uses about 50 watts and each Pi about 4 watts (one source I saw gave a figure of 12 watts).
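
As a rough back-of-envelope sketch of that trade-off, added purely for illustration and using only the approximate figures above (it treats each device as finishing one task at a time, so it understates the Pi a little, since a Pi can crunch several tasks at once on its four cores):

[code]
# Illustrative comparison using the rough figures quoted above (not measurements).

def tasks_per_day(hours_per_task):
    """Throughput of one device, in tasks per day."""
    return 24.0 / hours_per_task

gpu = {"name": "GTX 750",      "hours_per_task": 0.5,  "price_gbp": 100, "watts": 50}
pi  = {"name": "Raspberry Pi", "hours_per_task": 24.0, "price_gbp": 30,  "watts": 4}

# How many Pis would it take to match one GTX 750's daily throughput?
n = tasks_per_day(gpu["hours_per_task"]) / tasks_per_day(pi["hours_per_task"])

print(f"Pis needed to match one {gpu['name']}: {n:.0f}")
print(f"Cluster cost:  ~{n * pi['price_gbp']:.0f} GBP vs {gpu['price_gbp']} GBP")
print(f"Cluster power: ~{n * pi['watts']:.0f} W vs {gpu['watts']} W")
[/code]

With those inputs it comes out at about 48 Pis, roughly 1,400 GBP and nearly 200 W to match one 100 GBP, 50 W card, which is why the GPU wins on everything except the fun factor.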
Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
ID: 1786008
HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1786051 - Posted: 9 May 2016, 5:58:26 UTC
Last modified: 9 May 2016, 6:00:26 UTC

I've started calculating watt-hours per task to determine which of my devices are the most efficient.
This can be calculated two different ways:
1. daily watt hours/daily number of tasks
or
2. ((watts * run time in minutes)/60)/number of concurrent tasks

For the Raspberry Pi 3 I found someone stating it took ~25 hours to complete tasks. They didn't mention how many tasks they were running at once but it is easy to calculate for each possibility.
Using method #2:
If running 1 task: ((4w * 1500min)/60)/1 = 100Wh per MB task
If running 2 tasks: ((4w * 1500min)/60)/2 = 50Wh per MB task
If running 3 tasks: ((4w * 1500min)/60)/3 = 33.3Wh per MB task
If running 4 tasks: ((4w * 1500min)/60)/4 = 25Wh per MB task

Comparing that to my GTX 750ti FTW running 2 tasks at once in ~25 min & drawing up to 45w (according to GPUz).
Running 2 tasks: ((45w * 25min)/60)/2 = 9.375Wh per MB task
Even if I use EVGA's TDP value for power usage:
Running 2 tasks: ((85w * 25min)/60)/2 = 17.7Wh per MB task

For comparison, some of my systems:
i5-4670K with a TDP of 84W running 4 MB tasks at once in ~1h each:
((84w * 60min)/60)/4 = 21Wh per MB task
Celeron J1900 with a TDP of 10W running 4 MB tasks at once in ~6h each:
((10w * 360min)/60)/4 = 15Wh per MB task

This tells me that even if the Raspberry Pi 3 is running 4 tasks at once in ~25 hours each, it is less efficient than an i5-4670K. It also tells me that my Celeron J1900 is more efficient than I thought it was.
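
Purely as an illustration (not part of the original post), method #2 above drops straight into a few lines of Python; the device figures below are the ones quoted in this thread:

[code]
# Watt-hours per task, method #2 from the post above:
# ((watts * run_time_minutes) / 60) / concurrent_tasks

def wh_per_task(watts, run_time_minutes, concurrent_tasks):
    """Energy used by one task, in watt-hours."""
    return (watts * run_time_minutes) / 60.0 / concurrent_tasks

devices = [
    # (label, watts, minutes per task, tasks running at once) -- figures quoted above
    ("Raspberry Pi 3 (4 tasks)",  4, 1500, 4),
    ("GTX 750 Ti FTW (GPU-Z)",   45,   25, 2),
    ("GTX 750 Ti FTW (TDP)",     85,   25, 2),
    ("i5-4670K",                 84,   60, 4),
    ("Celeron J1900",            10,  360, 4),
]

for label, watts, minutes, tasks in devices:
    print(f"{label:26s} {wh_per_task(watts, minutes, tasks):6.2f} Wh per MB task")
[/code]

Running it reproduces the 25 / 9.375 / 17.7 / 21 / 15 Wh figures above (to rounding).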
SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the BP6/VP6 User Group: http://tinyurl.com/8y46zvu
ID: 1786051
Sidewinder
Volunteer tester
Joined: 15 Nov 09
Posts: 100
Credit: 79,432,465
RAC: 0
United States
Message 1786069 - Posted: 9 May 2016, 7:01:01 UTC

Rob and HAL are pretty much spot on. I see similar numbers with my crunchers vs my Pis. I think the main issue is that the Pi is meant to be a low-cost, low-power generic computer, not really specialized for this type of work. I bet there are some ARM designs that are optimized for computation (probably why researchers would even consider building large ARM clusters), but they may not be cheap and/or publicly available.

There are some alternatives that do crunch faster than a Pi, but they cost more and use more power (examples, here and here). They also have their own issues with OSes/support/community. The Pi does have a good community and wide use.

One positive of the Pi, though, is cooling: they can be cooled passively for the most part, which could save you money (my house gets quite hot from my dedicated crunchers).
ID: 1786069
jason_gee
Volunteer developer
Volunteer tester
Joined: 24 Nov 06
Posts: 7489
Credit: 91,093,184
RAC: 0
Australia
Message 1786072 - Posted: 9 May 2016, 7:15:26 UTC - in response to Message 1786069.  

Part of it might be a function of the architecture. From what I understand of the ARM architecture (which is limited), the memory subsystem, IO and cache are better suited to running one multithreaded process than multiple single-threaded ones. Future applications will probably take better advantage of what's available, at least once some of the general Linux-side changes happening lately settle down a bit.
"Living by the wisdom of computer science doesn't sound so bad after all. And unlike most advice, it's backed up by proofs." -- Algorithms to live by: The computer science of human decisions.
ID: 1786072
Anders Kihle

Joined: 27 Mar 16
Posts: 4
Credit: 162,948
RAC: 0
Norway
Message 1786238 - Posted: 9 May 2016, 19:14:59 UTC

Thank you all for the nice answers.
PS: Sorry for my English!!!
ID: 1786238
HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1786263 - Posted: 9 May 2016, 20:16:26 UTC - in response to Message 1786238.  

Thank you all for the nice answers.
PS: Sorry for my English!!!

We try to help as best we can.
Your English is much better than my Norwegian.
SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the BP6/VP6 User Group: http://tinyurl.com/8y46zvu
ID: 1786263
Anders Kihle

Joined: 27 Mar 16
Posts: 4
Credit: 162,948
RAC: 0
Norway
Message 1786271 - Posted: 9 May 2016, 20:59:15 UTC - in response to Message 1786263.  

;) hehe
ID: 1786271
OTS
Volunteer tester

Joined: 6 Jan 08
Posts: 369
Credit: 20,533,537
RAC: 0
United States
Message 1786290 - Posted: 9 May 2016, 22:30:12 UTC - in response to Message 1786051.  

Comparing that to my GTX 750ti FTW running 2 tasks at once in ~25 min & drawing up to 45w (according to GPUz).


Interesting. It appears nvidia-smi is not very accurate, at least on watts consumed. nvidia-smi reports my 750ti is using only 28 watts, but after reading what yours is drawing I checked it with both the watts reading on the UPS and a Kill A Watt device. It is a little difficult to know for sure, because as soon as you kill the GPU processes they restart, but both the Kill A Watt and the UPS report mine is drawing about 40 watts for two tasks, which is in line with what you were saying.

I wonder why there is a discrepancy between these two devices and nvidia-smi, and whether the 95-99% utilization and 49C temperature reported by nvidia-smi have any validity.
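
One way to dig into that, sketched here purely for illustration (the query fields are standard nvidia-smi ones, but the sample count, interval and file name are just placeholders), is to log what nvidia-smi reports every second and line it up against the wall-meter reading over the same stretch:

[code]
# Log nvidia-smi's reported power draw, GPU utilization and temperature once
# per second, so the readings can be compared against a Kill A Watt / UPS
# display over the same interval. On a multi-GPU box this writes one line
# per GPU per sample.
import datetime
import subprocess
import time

FIELDS = "power.draw,utilization.gpu,temperature.gpu"

with open("gpu_log.csv", "w") as log:
    log.write("timestamp,power_w,gpu_util_pct,temp_c\n")
    for _ in range(300):  # roughly five minutes of samples
        reading = subprocess.check_output(
            ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
            text=True,
        ).strip()
        for line in reading.splitlines():
            log.write(f"{datetime.datetime.now().isoformat()},{line}\n")
        time.sleep(1)
[/code]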
ID: 1786290
HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1786311 - Posted: 10 May 2016, 0:04:27 UTC - in response to Message 1786290.  

Interesting. It appears nvidia-smi is not very accurate, at least on watts consumed. nvidia-smi reports my 750ti is using only 28 watts, but after reading what yours is drawing I checked it with both the watts reading on the UPS and a Kill A Watt device. It is a little difficult to know for sure, because as soon as you kill the GPU processes they restart, but both the Kill A Watt and the UPS report mine is drawing about 40 watts for two tasks, which is in line with what you were saying.

I wonder why there is a discrepancy between these two devices and nvidia-smi, and whether the 95-99% utilization and 49C temperature reported by nvidia-smi have any validity.

I have noticed the GPU power consumption does seem to vary based on the tasks that are being processed. I just checked my system. GPUz & HWinfo are both telling me my 750 is running about 33w right now.
Normally we figure about 80% of the TDP for power consumption while running SETI@home tasks. Perhaps consuming 25-45w is to be expected while running two tasks on a 750ti.
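
As a quick worked example of that rule of thumb (added for illustration, not from the original post): for a reference GTX 750 Ti with a 60W TDP it gives roughly 0.8 * 60W = 48W, and using the 85W TDP quoted earlier for the EVGA FTW card it gives about 0.8 * 85W = 68W, so a measured 25-45W while crunching two tasks sits comfortably below either estimate.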
SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the BP6/VP6 User Group: http://tinyurl.com/8y46zvu
ID: 1786311
OTS
Volunteer tester

Joined: 6 Jan 08
Posts: 369
Credit: 20,533,537
RAC: 0
United States
Message 1786322 - Posted: 10 May 2016, 1:41:19 UTC - in response to Message 1786311.  

I have noticed the GPU power consumption does seem to vary based on the tasks that are being processed. I just checked my system. GPUz & HWinfo are both telling me my 750 is running about 33w right now.
Normally we figure about 80% of the TDP for power consumption while running SETI@home tasks. Perhaps consuming 25-45w is to be expected while running two tasks on a 750ti.


That makes sense, but I was wondering why nvidia-smi was reporting values so much lower than the actual values measured with two separate hardware devices, and whether the temps and utilization figures reported by nvidia-smi were off as well. I have never seen nvidia-smi report anywhere close to 35 watts, let alone 40 watts.
ID: 1786322
HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1786325 - Posted: 10 May 2016, 1:52:11 UTC - in response to Message 1786322.  

That makes sense, but I was wondering why nvidia-smi was reporting values so much lower than the actual values measured with two separate hardware devices, and whether the temps and utilization figures reported by nvidia-smi were off as well. I have never seen nvidia-smi report anywhere close to 35 watts, let alone 40 watts.

Perhaps that tool is only reporting power usage from the PCIe bus or maybe the PCIe power connector?
SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the BP6/VP6 User Group: http://tinyurl.com/8y46zvu
ID: 1786325
OTS
Volunteer tester

Joined: 6 Jan 08
Posts: 369
Credit: 20,533,537
RAC: 0
United States
Message 1786327 - Posted: 10 May 2016, 2:11:48 UTC - in response to Message 1786325.  

Perhaps that tool is only reporting power usage from the PCIe bus or maybe the PCIe power connector?


Now that you mention it, I forgot about the fact that when the GPU is working it also adds load to the CPU, and nvidia-smi almost certainly does not show that load. Even if it doesn't report accurately, you can use it as a comparative tool, which is the way I do use it. I guess we have beaten this horse into the ground.
ID: 1786327
Mark Wyzenbeek
Joined: 28 Jun 99
Posts: 134
Credit: 6,203,079
RAC: 0
United States
Message 1786340 - Posted: 10 May 2016, 4:09:04 UTC

Could power supply efficiency be the difference, with the difference being lost as heat in the power supply?
The Universe is not only stranger than you imagine, it's stranger than you can imagine.

SETI@home classic workunits 1,405 CPU time 57,318 hours
ID: 1786340
HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1786383 - Posted: 10 May 2016, 9:56:50 UTC - in response to Message 1786327.  
Last modified: 10 May 2016, 10:25:54 UTC

Now that you mention it, I forgot about the fact that when the GPU is working it also adds load to the CPU, and nvidia-smi almost certainly does not show that load. Even if it doesn't report accurately, you can use it as a comparative tool, which is the way I do use it. I guess we have beaten this horse into the ground.

I found where nvidia-smi was hiding & the value it displays for power usage matches what I see in other monitoring software. I expect they are reading from the same source. My 750ti seems to want to run at 1345MHz. Perhaps that is related to the higher power usage I see?

Even if I have abnormally high power usage for a 750ti, it is still more efficient than several Raspberry Pis & more cost effective.
SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the BP6/VP6 User Group: http://tinyurl.com/8y46zvu
ID: 1786383
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13736
Credit: 208,696,464
RAC: 304
Australia
Message 1786385 - Posted: 10 May 2016, 10:40:23 UTC - in response to Message 1786383.  

According to GPU-Z my main GTX 750Ti is generally around 65% of its maximum TDP (60W), which works out to be around 40W.
Running 2 WUs at a time with approx 90% GPU load at 1280.3MHz.
Grant
Darwin NT
ID: 1786385
Richard Haselgrove
Volunteer tester
Joined: 4 Jul 99
Posts: 14650
Credit: 200,643,578
RAC: 874
United Kingdom
Message 1786389 - Posted: 10 May 2016, 11:10:26 UTC - in response to Message 1786385.  

According to GPU-Z my main GTX 750Ti is generally around 65% of its maximum TDP (60W), which works out to be around 40W.
Running 2 WUs at a time with approx 90% GPU load at 1280.3MHz.

That seems to match mine. I have one host running with a kill-a-watt style meter, with different projects running on each of the major components: GTX 970, GTX 750Ti, Intel HD 4600, CPU - so I can see the power differential for each device by suspending the corresponding project.

SETI runs on the 750Ti, two MB tasks. The wall power draw dropped by around 40W - 272W to 232W - when I suspended SETI. GPU-Z says it's drawing around 60%-65% TDP (fluctuating as I type), at a core clock of 1319.8 MHz. It's a "Golden Sample" card, running at factory settings - and on PCIe power only, no additional power cable required.
ID: 1786389
EEVblog
Joined: 20 Apr 16
Posts: 20
Credit: 4,351,842
RAC: 0
Australia
Message 1786400 - Posted: 10 May 2016, 12:16:30 UTC - in response to Message 1786008.  


and in energy terms the GTX 750 uses about 50 watts and each Pi about 4 watts (one source I saw gave a figure of 12 watts)


My RPi2 only takes in the order of 2.5W running all 4 cores at 100%
https://youtu.be/pcQaseUJeZI?t=22m24s
ID: 1786400
EEVblog
Joined: 20 Apr 16
Posts: 20
Credit: 4,351,842
RAC: 0
Australia
Message 1786402 - Posted: 10 May 2016, 12:21:42 UTC - in response to Message 1786069.  


There are some alternatives that do crunch faster than a Pi, but they cost more and use more power


I'm looking at trying an array of Orange Pi Ones:
http://www.aliexpress.com/store/product/Orange-Pi-One-ubuntu-linux-and-android-mini-PC-Beyond-and-Compatible-with-Raspberry-Pi-2/1553371_32603308880.html

Only $10 for a 4-core 1.2GHz ARM board compatible with the RPi2.
ID: 1786402
EEVblog
Joined: 20 Apr 16
Posts: 20
Credit: 4,351,842
RAC: 0
Australia
Message 1786404 - Posted: 10 May 2016, 12:24:30 UTC - in response to Message 1786051.  
Last modified: 10 May 2016, 12:27:30 UTC


For the Raspberry Pi 3 I found someone stating it took ~25 hours to complete tasks. They didn't mention how many tasks they were running at once but it is easy to calculate for each possibility.


Here is data from my RPi2 running 4 cores at 100%
180,000 seconds average per task, and as stated, 2.5W for 4 cores, so 0.625W per task.
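
For comparison, plugging those numbers into HAL9000's method #2 from earlier in the thread (a back-of-envelope check added for illustration, not from the original post): 180,000 seconds is 3,000 minutes, so ((2.5w * 3000min)/60)/4 = 31.25Wh per MB task, roughly three times the energy per task of the measured GTX 750 Ti figure above (9.375Wh), though at a far lower absolute power draw.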


ID: 1786404