Posts by evilspoons

1) Questions and Answers : Windows : SLI graphics card connector question. (Message 1326413)
Posted 10 Jan 2013 by Profile evilspoons
Post:
What is probably happening is that each GPU task is allocated a little bit of a CPU core (the GPUs need to be fed data by the CPU). When you have two GPU tasks running, they use enough of a CPU that BOINC decides to halt one CPU task altogether so that the GPU tasks can run properly.
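If you want to adjust that CPU reservation yourself, recent 7.x BOINC clients read an optional app_config.xml from the project's directory. A minimal sketch - the app name below is an assumption, so use whatever name your client actually reports for the application:

```xml
<!-- app_config.xml, placed in the project directory, e.g.
     .../projects/setiathome.berkeley.edu/
     The app name is an assumption; check your client's task list. -->
<app_config>
  <app>
    <name>setiathome_enhanced</name>
    <gpu_versions>
      <gpu_usage>1.0</gpu_usage>   <!-- one task per GPU -->
      <cpu_usage>0.2</cpu_usage>   <!-- reserve 20% of a core to feed it -->
    </gpu_versions>
  </app>
</app_config>
```

With a smaller <cpu_usage> figure, BOINC is less likely to idle a whole CPU task on behalf of the GPU tasks.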
2) Message boards : Number crunching : Going from GTX 470 to 500/600 series what model is best? (Message 1326379)
Posted 10 Jan 2013 by Profile evilspoons
Post:

Starting test: (x41zc)
21 December 2012 - 03:40:31 Start, devices: 3, device count: 3 (0.33)
--------------------------------------------------------------------------- Results:
Device: 0, device count: 3, average time / count: 358, average time on device: 119 Seconds (1 Minutes, 59 Seconds)
Device: 1, device count: 3, average time / count: 357, average time on device: 119 Seconds (1 Minutes, 59 Seconds)
Device: 2, device count: 3, average time / count: 306, average time on device: 102 Seconds (1 Minutes, 42 Seconds)


Very nice times. I'd be very interested to see times from a 680. If anyone can post times with the x41zc application I'd appreciate it.


I have a GTX 680 at home and I'm running x41zc, but I'm not sure how to run this benchmark. Anyone?

--

By the way, the difference between PCIe 3.0 and 2.0 is essentially trivial here. PCIe 3.0 doubles the bandwidth of PCIe 2.0 (a version 3.0 x16 slot has 16 GB/sec of bandwidth while a version 2.0 x16 slot has 8 GB/sec, meaning a 3.0 x8 slot matches a 2.0 x16 slot). This AnandTech review goes over various games on an AMD 7970 at different link bandwidths. The difference between 2 GB/sec and 16 GB/sec is about 8% at most (with the exception of Dirt 3's minimum frame rate, which is probably a situation where textures are being swapped into VRAM from system RAM; SETI work units shouldn't be swapping in and out of VRAM, if I understand correctly). The difference between 8 GB/sec and 16 GB/sec will be virtually unnoticeable.

Basically the whole point of 3.0 is not "more maximum bandwidth" but "same bandwidth using fewer lanes so you can have more devices on an average mainboard".
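The bandwidth arithmetic above, sketched out with the usual rounded per-lane figures:

```python
# Rounded nominal PCIe bandwidth per lane, in GB/s.
PER_LANE_GBPS = {"1.0": 0.25, "2.0": 0.5, "3.0": 1.0}

def slot_bandwidth(version: str, lanes: int) -> float:
    """Approximate total bandwidth of a PCIe slot in GB/s."""
    return PER_LANE_GBPS[version] * lanes

print(slot_bandwidth("2.0", 16))  # 8.0  -> a 2.0 x16 slot
print(slot_bandwidth("3.0", 8))   # 8.0  -> matched by a 3.0 x8 slot
print(slot_bandwidth("3.0", 16))  # 16.0 -> a 3.0 x16 slot doubles it
```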
3) Message boards : Number crunching : Wind tunnel cpu set up (Message 1326370)
Posted 10 Jan 2013 by Profile evilspoons
Post:
Why exactly he has those bars on the inlet and exhaust towers baffles me.


Aesthetics. From the article:

The aluminum braces with steel tubes that are on the top of the intake and exhaust sections are decorative.


Incidentally, I have one of those Logitech keyboard + touchpad units and it is excellent for controlling a home theatre PC when the IR remote won't suffice. Much easier to use a touchpad on the couch than a mouse! My only complaint is it's starting to get a bit creaky after six months, but that's cheap plastic peripherals for you.
4) Message boards : Number crunching : How do I run the stock CPU apps with optimized GPU apps? (Message 1326219)
Posted 9 Jan 2013 by Profile evilspoons
Post:
So I guess the story right now is that there AREN'T any optimized CPU apps available thanks to the Lunatics legal business?
5) Message boards : Number crunching : How do I run the stock CPU apps with optimized GPU apps? (Message 1326176)
Posted 9 Jan 2013 by Profile evilspoons
Post:
OK... fine by me. Where would I find them? Lunatics currently isn't available, and I don't know of any other CPU optimized apps for Windows.
6) Message boards : Number crunching : How do I run the stock CPU apps with optimized GPU apps? (Message 1326170)
Posted 9 Jan 2013 by Profile evilspoons
Post:
Stupid noob question here, but I'm attempting to install the latest CUDA optimized apps and it seems to me that when I create an app_info.xml with the various pieces of information that come with the x41zc package, I will no longer have any app entries for the various stock CPU applications.

Is there a solution for this problem, perhaps a "default" app_info.xml file that I could use? Or am I missing something entirely?
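For what it's worth, the general shape of an app_info.xml that declares both a stock CPU application and the x41zc CUDA application is below. Every file name and version number in it is a placeholder that has to match the actual executables sitting in the project directory, so treat it only as a skeleton:

```xml
<app_info>
  <app>
    <name>setiathome_enhanced</name>
  </app>
  <!-- Stock CPU application (file name and version are placeholders). -->
  <file_info>
    <name>setiathome_cpu.exe</name>
    <executable/>
  </file_info>
  <app_version>
    <app_name>setiathome_enhanced</app_name>
    <version_num>603</version_num>
    <file_ref>
      <file_name>setiathome_cpu.exe</file_name>
      <main_program/>
    </file_ref>
  </app_version>
  <!-- CUDA application from the x41zc package (placeholder name). -->
  <file_info>
    <name>setiathome_x41zc_cuda.exe</name>
    <executable/>
  </file_info>
  <app_version>
    <app_name>setiathome_enhanced</app_name>
    <version_num>610</version_num>
    <plan_class>cuda_fermi</plan_class>
    <coproc>
      <type>CUDA</type>
      <count>1</count>
    </coproc>
    <file_ref>
      <file_name>setiathome_x41zc_cuda.exe</file_name>
      <main_program/>
    </file_ref>
  </app_version>
</app_info>
```

Any DLLs or other support files shipped with the packages would need their own <file_info> and <file_ref> entries as well.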
7) Message boards : Number crunching : Can I use my GTX 680 with my old GTX 285? Optimized apps? (Message 1324606)
Posted 4 Jan 2013 by Profile evilspoons
Post:
Perfect, thanks for the tips on using the two cards together - I think I understand what needs to be done.


Any ideas on optimized apps for me? Like I said, the only ones I'm familiar with are Lunatics and they're not available at the moment (boo). The only other ones in the optimized apps thread seem to be for different OS/arch combinations.
8) Message boards : Number crunching : Can I use my GTX 680 with my old GTX 285? Optimized apps? (Message 1324144)
Posted 3 Jan 2013 by Profile evilspoons
Post:
Hi everyone,

I replaced my old GTX 285 in May with a GTX 680. The 285 has been sitting collecting dust since then and I was pondering putting it back in the computer as a PhysX coprocessor for gaming.

I was wondering if there are any issues with this setup for running BOINC (on Windows 7 x64). Will it recognize the two cards and run WUs on them automatically? Are there problems running the two different architectures side-by-side?

Finally, are there any easy-to-install optimized apps available for Seti that will help with:
1) CPU crunching on my i7 2600k,
2) GPU crunching with the GTX 680 alone,
and 3) GPU crunching in the event I can use the GTX 285 alongside the GTX 680?

My only experience with optimized apps thus far is the Lunatics stuff and, as everyone probably knows already, they're unavailable on Windows right now thanks to licensing issues.
9) Message boards : Number crunching : Smartphone Crunching (Reprise) (Message 1324131)
Posted 3 Jan 2013 by Profile evilspoons
Post:
Bear in mind that a computer's USB port only provides a maximum of 500mA. If you have a wall charger for the phone, it's probably capable of supplying more current...


Note that by specification, USB 3.0 ports can supply up to 900 mA when transferring data, and up to 1.5 A in "battery charging" mode.
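At USB's nominal 5 V, those current limits translate to power like so:

```python
USB_VOLTS = 5.0  # nominal bus voltage

def usb_watts(milliamps: float) -> float:
    """Maximum power draw in watts for a given current limit."""
    return USB_VOLTS * milliamps / 1000.0

print(usb_watts(500))   # 2.5 W - USB 2.0 port
print(usb_watts(900))   # 4.5 W - USB 3.0 while transferring data
print(usb_watts(1500))  # 7.5 W - battery charging mode
```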
10) Message boards : Number crunching : Smartphone crunching (Message 1194955)
Posted 13 Feb 2012 by Profile evilspoons
Post:
I think they might be giving us batteries with 10 times the capacity, but it doesn't matter when the new device they go in gobbles up 1500 times more power.


Yeah, that too.

For what it's worth, CPU power usage is actually dropping nowadays... but GPUs continue to push the envelope of what is reasonable, and screens on portable devices, while getting more efficient, aren't using any less power because they're larger and brighter.
11) Message boards : Number crunching : Smartphone crunching (Message 1194946)
Posted 13 Feb 2012 by Profile evilspoons
Post:
Even Windows XP really is Windows 3.0 (or 3.1) through Windows 95/98 with a new interface towards the user and including a lot of new functionalities.


Windows 95 through Me were related to each other, but the only true relationship they have to Windows 3.x is that they have some sort of DOS underneath (if you ignore a few Windows 3.x files kicking around for application compatibility).

Windows XP is based upon the Windows 2000 codebase, which is quite different from Windows 9x.

Windows Ultimate, for example, which I do in fact have myself, comes in both 32-bits and 64-bits versions.


"Windows Ultimate" is not a version of Windows. You're either talking about Windows 7 Ultimate or Windows Vista Ultimate, both of which are simply variations upon Windows 7 or Windows Vista with various features turned on that are unnecessary for the average user. It's mostly just a combination of the "home" stuff like Media Center and the "business" stuff like the ability to attach to a domain controller.


...

I also don't really see what this has to do with BOINC on smartphones. Back on topic:

Current nanotechnology will enable batteries to run 10 times longer than today's usual expectations within a few years. I hope people here read some tech news.


I do - batteries have ALWAYS been promised to have 10 times the capacity "in a few years". Strangely enough, these amazing batteries never show up and revolutionize everything... because they've always been improving slowly and steadily instead. Fashion and practicality dictate that our batteries get smaller and lighter too. If you made the MacBook Air weigh as much as a laptop from 1995, you'd get 50 hours on a charge!

What I'm saying is batteries will continue to be as small and light as you can get away with in order to make the device more portable as opposed to a better calculation device. Why carry a brick around in your pocket when you've got a 30 lb desktop PC sitting at home with an infinite power reserve to do your math for you?
12) Message boards : Number crunching : Smartphone crunching (Message 1193288)
Posted 10 Feb 2012 by Profile evilspoons
Post:
Wouldn't this be cute though...

Computing preferences - Processor usage
Suspend work while vehicle is on battery power?
Matters only for electric vehicles

or

Computer information
Coprocessors: Audi (ULP) GeForce TEGRA 6 (1536MB)

Stranger than fiction:)


That would be clever - utilizing the computer systems of an electric or plug-in hybrid car while it's sitting around doing nothing but charging at night.

You're probably already aware (and it was probably the inspiration for your post) but the current Audi A8's navigation system runs Google Earth on a Tegra 2 processor.
13) Message boards : Number crunching : Smartphone crunching (Message 1192601)
Posted 8 Feb 2012 by Profile evilspoons
Post:
the main problem with smart phones in general is they don't have the math coprocessors that a PC has. You'll notice that the smart phone brags about having 1 GHz speed but it lacks the coprocessors that would heat that sucker like a pocket sized oven.

Raise your hand if you remember how big your HSF was on your 1Ghz PC. Now try and imagine that being compressed into a smartphone. It can't be done. The HSF is as big as the phone.


You're forgetting about process and design improvements: feature size (32 nm on Sandy Bridge Core iX CPUs vs 180 nm on the Coppermine 1 GHz Pentium III), lower voltage from the smaller feature size, lower leakage current when shut off, gating of unused sections, dynamic frequency selection based on thermal envelope...

I'm reasonably sure that a 1.3 GHz Core i3-2357M ULV laptop CPU can significantly outperform that Coppermine Pentium III... while dissipating only 17 watts vs Coppermine's 40+ watts.

EDIT: I found some more information about the relative performance of the i3-2357M. With two 1.3 GHz cores, it scores 1659 on PassMark. A 1.27 GHz PIII scores 309 - divide the i3's score and TDP in half to approximate a single core, and the result is about 13x more PassMarks per watt than when the Pentium III was top of the line (and we all had CPUs about that speed in our PCs).
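Spelled out (taking the Pentium III's TDP as the ~40 watt figure mentioned above - an assumption, since the benchmark score alone doesn't tell us):

```python
# i3-2357M: 1659 PassMark, 17 W TDP, two cores -> halve both for one core.
i3_score_per_watt = (1659 / 2) / (17 / 2)   # same as 1659 / 17, ~97.6

# Pentium III 1.27 GHz: 309 PassMark; TDP assumed ~40 W (see above).
piii_score_per_watt = 309 / 40              # ~7.7

ratio = i3_score_per_watt / piii_score_per_watt
print(round(ratio))  # 13
```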

If you dial back the processing power a bit you end up with parts like AMD's new tablet APU with a 4.5 watt TDP that I'm sure are still competitive with that 1 GHz PIII.

It very certainly CAN be done.

However, you are right that the CPUs living in most people's cell phones, despite being clocked at 1 GHz or whatever, aren't comparable (in terms of MFLOPS or distributed-computing-style benchmarks) to desktop CPUs at the same clock speed, new or old. They are missing all sorts of things, like huge memory bandwidth, and they have a totally different design philosophy because they're intended for different tasks.

Because of this I think running Seti@Home on my 1 GHz Samsung Galaxy S would be a terrible idea - not very efficient, and it would probably very quickly exceed the device's thermal capabilities. I know the phone gets incredibly hot when a program gets stuck in a loop without force closing. This heat is terrible for the battery and probably everything else in the phone.
14) Message boards : Number crunching : Breathing life into an old computer (Message 1190194)
Posted 30 Jan 2012 by Profile evilspoons
Post:
In a situation like this you would not crunch on the CPU, it's just used to drive the GPU.

A P4 is quite capable of keeping anything up to a GTX550 fed.

Experiments are continuing.

T.A.


Well, this is more interesting. But keep in mind that the P4 will still either be running in that highly inefficient state to feed the GPU, or sitting idle, which is also highly inefficient - the low-power mode of a P4 probably uses 30 or 40 watts more than the low-power mode of a new cheapo CPU like a Pentium Dual-Core. And then there's the question of whether it actually enters that state for a reasonable amount of time, or whether SETI keeps it in "on" mode.

If it's not able to effectively race-to-idle and go into a power saving sleep mode, look into underclocking and undervolting it until it is just barely fast enough to keep the GPU happy.
15) Message boards : Number crunching : Need a new Box -- (Message 1190192)
Posted 30 Jan 2012 by Profile evilspoons
Post:

----

Also, anti-virus is essentially free now for the home Windows user. Microsoft Security Essentials works great, costs nothing, and stays the hell out of the way (unlike other products).

Phew!

Is that why that 'problem' is still not fixed after over a decade since Bill Gates' "Trusted Computing" attempts? And also why the latest Windows includes a "factory reset" button so you can (more easily and more frequently) reinstall a factory clean version of your favourite Windows?


Back in the more normal world, I use a system completely without "anti-virus" and with not even a firewall. The system is not 'invulnerable' but neither does it actively support viruses and malware...


Happy clean crunchin',
Martin


I've only had MSE pop up once on my desktop and it was simply a warning that a file might be a threat (it was a network analyser program). Some people may interpret this as "it doesn't work", but there have been many standardized antivirus tests that put it at the top of the ranks among its competition.

Based upon my experiences I too could run Windows 7 without antivirus, but I choose not to because the performance impact is negligible and I might just slip up one day and run something stupid.

"Trusted Computing" takes away freedom from the user of the computer. I really don't want my files to be signed with a key I don't have access to in order for them to remain unmodified. It also works just fine in environments that enforce the usage of TC. The home desktop PC will never become one of those environments because you're essentially applying DRM to your entire system. We know how well restrictive DRM goes over on movies and games.

The need for doing a complete "system reset" has gone away for me; I honestly don't know what other people are doing. I haven't formatted one of my computers and reinstalled the same operating system in probably six years.

The only reason you don't have viruses on Linux is there aren't enough people using it. Android is Linux, Android is popular, and Android is starting to get malware and viruses.
16) Message boards : Number crunching : Need a new Box -- (Message 1190155)
Posted 30 Jan 2012 by Profile evilspoons
Post:
Are there any benchmarks floating around for Bulldozer on Seti@Home that I've missed? As far as I've read, an "Eight Core" AMD Bulldozer unit is slower than a quad-core Intel i5/i7 at almost everything, and a "Six Core" is going to be worse.

The problem is the "cores" in a Bulldozer aren't full cores, they have shared resources between each pair of cores. A six-core Bulldozer is actually a "three module" CPU with two threads per module that can (sometimes) run at full potential.

This is a design we're going to see more of in the future to keep complexity of new products down to a sane level, but in my opinion, the price point just isn't right for what AMD is currently offering.

----

Also, anti-virus is essentially free now for the home Windows user. Microsoft Security Essentials works great, costs nothing, and stays the hell out of the way (unlike other products).
17) Message boards : Number crunching : Breathing life into an old computer (Message 1190141)
Posted 30 Jan 2012 by Profile evilspoons
Post:
Now I just have to speed up the Pentium 4.


Bin it, and spend the money you save on its electricity bill on another GPU for the other machine.


Here we go.

I vote for this. A Pentium 4 is going to take ~100 watts running its CPU at 100%, and it's going to run that CPU for about 130 hours to finish a WU. My Core i5 750 finishes these WUs in 20 hours and uses 65 watts, making it TEN TIMES more efficient per unit of energy used by the CPU.

How much does power cost in your area? You may just find that turning the P4 off for a month saves you enough money to buy a second GPU for your i7.
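The back-of-the-envelope math, with the $0.10/kWh electricity rate as a purely illustrative assumption:

```python
def kwh_per_wu(watts: float, hours: float) -> float:
    """Energy consumed to finish one work unit, in kWh."""
    return watts * hours / 1000.0

p4 = kwh_per_wu(100, 130)  # 13.0 kWh per WU
i5 = kwh_per_wu(65, 20)    # 1.3 kWh per WU
print(round(p4 / i5))      # 10 - the i5 is ten times more efficient

# A month of the P4 running flat out, at an example rate of $0.10/kWh:
monthly_kwh = 100 * 24 * 30 / 1000    # 72 kWh
print(round(monthly_kwh * 0.10, 2))   # 7.2 dollars
```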
18) Message boards : Number crunching : Crunching w/Internal GPU of Sandy Bridge!!? (Message 1100505)
Posted 25 Apr 2011 by Profile evilspoons
Post:
No nothing at the moment I'm aware of, however Sandybridge does have AVX extensions which could greatly increase crunching speed and/or power and it is being worked on. Last I recall as well once you place a GPU into a Sandybridge machine it disables the onboard graphics on the chip. A lot of people in the video encoding world are a bit upset by that at the moment.


Some/most/all of the Z68 chipset mainboards will include a LucidLogix Virtu chip & driver, allowing the operating system access to both the Sandy Bridge onboard video for Quick Sync video encoding and a 'proper' video card for gaming. In certain configurations, it will even turn off the external GPU when in 2D-only mode for power savings.

It is possible the SB GPU will also be available for OpenCL in a situation like this, but since we aren't even sure it works for its advertised features yet, talking about possible EXTRA features seems a bit pointless.
19) Message boards : Number crunching : We'll see one day a GTX495 ? (Message 1033285)
Posted 16 Sep 2010 by Profile evilspoons
Post:
(3xx never made it)


NVidia makes a GeForce 310, 315, 320, 330, and 340. They called Fermi the 4xx series to leave more breathing room for low-end part numbers in the 3xx range.

The 300s are not Fermi cards and are essentially OEM-only rebrandings of the GT215/216/218 chips that powered the Geforce 210/220/240. I don't know why the heck they bothered.

But... they did "make it".


As for the original subject, I believe I was reading (likely on Anandtech.com) that there won't be a dual-chip Fermi card without a die shrink due to power usage causing the card to exceed a reasonable thermal envelope and current load.
20) Message boards : Number crunching : Time-outs due to outage (Message 1033282)
Posted 16 Sep 2010 by Profile evilspoons
Post:
Fortunately, I've got 8 days before I run into the deadline issue on any of my pending uploads. I'd be donating at this point if I hadn't just put a down payment on a mortgage!


 
©2024 University of California
 
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.