CUDA CRASH! Can GPU detection be disabled?

Questions and Answers : Unix/Linux : CUDA CRASH! Can GPU detection be disabled?
ML1
Volunteer moderator
Volunteer tester

Joined: 25 Nov 01
Posts: 20084
Credit: 7,508,002
RAC: 20
United Kingdom
Message 985886 - Posted: 1 Apr 2010, 17:36:08 UTC
Last modified: 1 Apr 2010, 17:36:36 UTC

On Mandriva Linux with the latest nVidia drivers and while running the KDE4 desktop, it looks like xorg crashes at the point where Boinc looks for the nVidia GPU...

Can Boinc GPU detection be disabled?


With Boinc 6.10.43 running: whilst at the login screen (KDM), there's a little screen corruption and the colours become psychedelically mangled, but everything seems to run OK. In a full-screen terminal there is some pixel corruption at the top of the screen, but again all runs well enough. However, with the KDE desktop running, the display and keyboard freeze immediately as soon as Boinc reaches the point where it logs what GPU is found. With the display frozen, the only way out is the Alt-SysRq keys, or a remote login via ssh to reboot or restart xorg.


Anyone have any clues before I start looking further?

Cheers,
Martin
See new freedom: Mageia Linux
Take a look for yourself: Linux Format
The Future is what We all make IT (GPLv3)
ID: 985886
Gundolf Jahn

Joined: 19 Sep 00
Posts: 3184
Credit: 446,358
RAC: 0
Germany
Message 985888 - Posted: 1 Apr 2010, 17:43:36 UTC - in response to Message 985886.  

Can Boinc GPU detection be disabled?

That would be <no_gpus>1</no_gpus> in cc_config.xml.
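For reference, a minimal cc_config.xml sketching just that option. It goes in the BOINC data directory; the client re-reads it on restart, or via "Read config file" in the Manager's Advanced menu.

```xml
<cc_config>
  <options>
    <no_gpus>1</no_gpus>
  </options>
</cc_config>
```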

Gruß,
Gundolf
Computer sind nicht alles im Leben. (Kleiner Scherz)

SETI@home classic workunits 3,758
SETI@home classic CPU time 66,520 hours
ID: 985888
ML1
Volunteer moderator
Volunteer tester

Joined: 25 Nov 01
Posts: 20084
Credit: 7,508,002
RAC: 20
United Kingdom
Message 985921 - Posted: 1 Apr 2010, 19:25:24 UTC - in response to Message 985888.  
Last modified: 1 Apr 2010, 19:26:15 UTC

Thanks for that. Now working but minus the GPU.

Interestingly:

[Einstein@Home] Application uses missing NVIDIA GPU


... and the "new" feature here is the 32-bit Einstein application, which is running on a 64-bit system... could this be a libcudart conflict between their 32-bit build and the system's 64-bit libraries?
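One way to check that theory: file reports the ELF class of a binary, and ldd lists its shared-library dependencies, with anything the loader cannot resolve shown as "not found". Demonstrated on ls below; substitute the path to the Einstein application binary under your BOINC projects directory.

```shell
# Show whether a binary is a 32-bit or 64-bit ELF executable:
file "$(command -v ls)"

# List its shared-library dependencies; anything the loader cannot
# resolve (e.g. a missing 32-bit libcudart) is reported as "not found":
ldd "$(command -v ls)"
```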


For anyone interested:

$ more cc_config.xml
<cc_config>
<options>
        <save_stats_days>365</save_stats_days>
        <no_gpus>1</no_gpus>
</options>
<log_flags>
        <http_debug>0</http_debug>
        <work_fetch_debug>0</work_fetch_debug>
        <debt_debug>0</debt_debug>
        <sched_op_debug>0</sched_op_debug>
        <cpu_sched_debug>0</cpu_sched_debug>
</log_flags>
</cc_config>



Happy crunchin',
Martin
See new freedom: Mageia Linux
Take a look for yourself: Linux Format
The Future is what We all make IT (GPLv3)
ID: 985921
Jord
Volunteer tester

Joined: 9 Jun 99
Posts: 15184
Credit: 4,362,181
RAC: 3
Netherlands
Message 985979 - Posted: 1 Apr 2010, 23:40:47 UTC - in response to Message 985921.  

Have you tried installing the 32-bit shared libraries? All of Einstein's applications are 32-bit, so to run them correctly on an all-64-bit system you have to install the ia32 (32-bit) shared library files.
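As a quick sanity check (a generic sketch, not Mandriva-specific): a 64-bit Linux system can only start 32-bit executables at all if the 32-bit dynamic loader is installed, so its presence is a cheap first test before hunting for individual libraries.

```shell
# The 32-bit dynamic loader must exist for any 32-bit ELF binary to run;
# the 64-bit loader lives separately at /lib64/ld-linux-x86-64.so.2.
if [ -e /lib/ld-linux.so.2 ]; then
    echo "32-bit loader present"
else
    echo "32-bit loader missing - install your distro's 32-bit compat libraries"
fi
```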
ID: 985979



©2024 University of California
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. Astropulse is funded in part by the NSF through grant AST-0307956.