> I've installed new S@H on my computer and I'm a little bit confused about
> measured speed:
> When I run setiBoinc on Linux 2.6.6, benchmark gave the following results:
> Measured floating point speed 984.01 million ops/sec
> Measured integer speed 2251.09 million ops/sec
> But the same test made by setiBoinc on WinXP gave:
> Measured floating point speed 2231.95 million ops/sec
> Measured integer speed 3980.85 million ops/sec
> Why do they differ so much? How can this affect credit granted for each WU?
One of the issues that is not discussed much anymore is the actual difference you see when the same code is compiled with different compilers. Historically, in my experience, Microsoft compilers have never turned out the fastest code.
The only way to be sure you are getting identical conversions, and therefore similar results, is to use a compiler that can cross-compile for both platforms, make the builds, and test them.
It is my understanding that Visual Studio is being used on the Windows side, but I have no idea what is being used on the Linux side; it is either GNU or possibly an Intel compiler (I would guess GNU because it is cheaper, which does not imply worse).