Setting up Linux to crunch CUDA90 and above for Windows users


Previous · 1 . . . 115 · 116 · 117 · 118 · 119 · 120 · 121 . . . 162 · Next

Profile Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13164
Credit: 1,160,866,277
RAC: 1,873
United States
Message 2001976 - Posted: 10 Jul 2019, 18:39:54 UTC - in response to Message 2001975.  

Ha ha LOL. You've been dropping hints for over a month now about 7.16 being imminent. Still no sign of it. I guess DA is being more cautious than usual.
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 2001976
Richard Haselgrove Project Donor
Volunteer tester

Joined: 4 Jul 99
Posts: 14650
Credit: 200,643,578
RAC: 874
United Kingdom
Message 2001980 - Posted: 10 Jul 2019, 18:52:54 UTC - in response to Message 2001976.  

Ha ha LOL. You've been dropping hints for over a month now about 7.16 being imminent. Still no sign of it. I guess DA is being more cautious than usual.
They were more than hints - I was passing on information I'd heard in conference calls. But it seems that David has too many irons in the fire these days - interestingly, he hasn't turned up for this workshop, even though he's been a permanent fixture at every previous one.

Though to be fair, he was working on the Nebula post-processing pipeline yesterday.
ID: 2001980
Profile Tom M
Volunteer tester

Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 2001997 - Posted: 10 Jul 2019, 21:12:42 UTC - in response to Message 2001980.  


Though to be fair, he was working on the Nebula post-processing pipeline yesterday.


That means it is time for a pun. "The END is near!" aka: post-processing.....
Unsigned Integer
A proud member of the OFA (Old Farts Association).
ID: 2001997
Oddbjornik Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 15 May 99
Posts: 220
Credit: 349,610,548
RAC: 1,728
Norway
Message 2002010 - Posted: 10 Jul 2019, 22:05:17 UTC - in response to Message 2001997.  

Unsigned Integer
++;
ID: 2002010
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 2002022 - Posted: 10 Jul 2019, 23:43:02 UTC - in response to Message 2001967.  

I got rid of the standard BOINC installation on my Nano by digging out all the old references, permissions and symlinks so I could move the BOINC data folder to /home. A lot more work than just doing a purge though. At least I own BOINC now and can do with it what I want. Couldn't use the AIO obviously because of the platform differences.


. . OK, that sounds like a lot of work with many possible pitfalls. I am guessing it is easier to leave the repository version running on the Core 2 Duo, but I will certainly stick to the alternate versions on any future install.

Stephen
ID: 2002022
Profile Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13164
Credit: 1,160,866,277
RAC: 1,873
United States
Message 2002027 - Posted: 11 Jul 2019, 0:13:35 UTC - in response to Message 2002022.  

I got rid of the standard BOINC installation on my Nano by digging out all the old references, permissions and symlinks so I could move the BOINC data folder to /home. A lot more work than just doing a purge though. At least I own BOINC now and can do with it what I want. Couldn't use the AIO obviously because of the platform differences.


. . OK, that sounds like a lot of work with many possible pitfalls. I am guessing it is easier to leave the repository version running on the Core 2 Duo, but I will certainly stick to the alternate versions on any future install.

Stephen

Why don't you follow Richard's technique from a few posts earlier. You get to keep the distro repo location but just update it with the AIO applications and BOINC.
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 2002027
Profile Bernie Vine
Volunteer moderator
Volunteer tester
Joined: 26 May 99
Posts: 9954
Credit: 103,452,613
RAC: 328
United Kingdom
Message 2002124 - Posted: 11 Jul 2019, 17:54:15 UTC

So I decided to "nuke" the install and start again. This will be my final (honest) Linux box; well, I say "box", but it is currently just a MB, PSU and an SSD sitting on a box in a dark corner.

The MB is so old that I didn't think it would be up to the job.

https://setiathome.berkeley.edu/show_host_detail.php?hostid=8762732

However, since it performed well with an old GT 640 and a standard BOINC install, I decided to go the whole hog: install the AIO and get another 750 Ti.
ID: 2002124
Loren Datlof

Joined: 24 Jan 14
Posts: 73
Credit: 19,652,385
RAC: 0
United States
Message 2002162 - Posted: 11 Jul 2019, 22:06:01 UTC

Yesterday I downgraded a machine to the 390 driver and the CUDA90 special app. Unfortunately the downgrade didn't go as I expected; I had to reload the OS and I ended up with 100 errors.

So now I have one host with a GTX 750 running CUDA90 and another host with a GTX 750 Ti running CUDA101.

CUDA90 host: https://setiathome.berkeley.edu/show_host_detail.php?hostid=8762059

CUDA101 host: https://setiathome.berkeley.edu/show_host_detail.php?hostid=8667487

I am not sure if a 750 vs. a 750 Ti is a direct apples to apples comparison but I imagine it is close.
ID: 2002162
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 2002175 - Posted: 11 Jul 2019, 23:25:32 UTC - in response to Message 2002124.  

So I decided to "nuke" the install and start again. This will be my final (honest) Linux box; well, I say "box", but it is currently just a MB, PSU and an SSD sitting on a box in a dark corner.

The MB is so old that I didn't think it would be up to the job.

https://setiathome.berkeley.edu/show_host_detail.php?hostid=8762732

However, since it performed well with an old GT 640 and a standard BOINC install, I decided to go the whole hog: install the AIO and get another 750 Ti.


. . I am surprised you can source so many 750/750ti cards. Looking around out here there are none available when I search for them.

. . Have fun with that ... :)

Stephen

. . Remember - crunch responsibly, don't crunch over your head, think of your families .... :)

{That is paraphrased from signs here (by law) in poker machine/gambling areas of our clubs.}
ID: 2002175
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 2002176 - Posted: 11 Jul 2019, 23:34:51 UTC - in response to Message 2002162.  

Yesterday I downgraded a machine to the 390 driver and the CUDA90 special app. Unfortunately the downgrade didn't go as I expected; I had to reload the OS and I ended up with 100 errors.

So now I have one host with a GTX 750 running CUDA90 and another host with a GTX 750 Ti running CUDA101.

CUDA90 host: https://setiathome.berkeley.edu/show_host_detail.php?hostid=8762059

CUDA101 host: https://setiathome.berkeley.edu/show_host_detail.php?hostid=8667487

I am not sure if a 750 vs. a 750 Ti is a direct apples to apples comparison but I imagine it is close.


. . Generally yes, very close, but that is offset by the fact that one is running on a decent quad-core CPU while the other is running on a barely sufficient dual-core CPU which is also trying to crunch at the same time. That will have an impact.

Stephen

:(
ID: 2002176
TBar
Volunteer tester

Joined: 22 May 99
Posts: 5204
Credit: 840,779,836
RAC: 2,768
United States
Message 2002179 - Posted: 12 Jul 2019, 0:15:58 UTC

You should use the Benchmarking App for comparisons, and use devices that can actually test the CUDA version of the apps. This means you don't test a CUDA 9 app with a Turing GPU, since a Turing GPU can't run CUDA 9. The first time you run a CUDA 9 app on a Turing GPU, the driver "colludes" with the app to build new code that a Turing GPU can run, i.e. CUDA 10. So the CUDA 9 and 10 apps appear to be the same on Turing because, to the Turing GPU, they are the same: it's running CUDA 10 in both cases. A Pascal GPU, however, can run both CUDA 9 and CUDA 10, so a test with a Pascal or older card is valid. This test was with a GTX 1050 using the BenchMark App:

----------------------------------------------------------------
Starting benchmark run...
----------------------------------------------------------------
Listing wu-file(s) in /testWUs :
05dc06ac.20406.7843.14.41.72.wu
24my14ab.7469.476.12.39.186.wu
26ja07aa.30143.294304.16.43.191.wu
30dc06ah.26541.18886.12.46.6.wu
blc22_2bit_guppi_58405_80305_HIP85620_0011.8907.818.21.44.173.vlar.wu
blc25_2bit_guppi_58340_44419_HIP3373_0047.29884.818.20.29.97.vlar.wu
blc25_2bit_guppi_58406_27943_HIP20917_0106.14763.818.22.45.17.vlar.wu
blc32_2bit_guppi_58406_17412_HIP2473_0076.2911.818.23.46.76.vlar.wu
blc35_2bit_guppi_58406_02921_HIP116398_0037.24560.818.22.45.57.vlar.wu

Listing executable(s) in /APPS :
setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda101

Listing executable in /REF_APPS :
setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda90
----------------------------------------------------------------
Current WU: 05dc06ac.20406.7843.14.41.72.wu
----------------------------------------------------------------
Running default app with command :... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda90 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time: ....................... 258 seconds
----------------------------------------------------------------
Running app with command : .......... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda101 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time : ...................... 244 seconds
Speed compared to default : ......... 105 %
-----------------
Comparing results
Result      : Strongly similar,  Q= 100.0%
----------------------------------------------------------------
Done with 05dc06ac.20406.7843.14.41.72.wu
====================================================================
Current WU: 24my14ab.7469.476.12.39.186.wu
----------------------------------------------------------------
Running default app with command :... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda90 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time: ....................... 150 seconds
----------------------------------------------------------------
Running app with command : .......... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda101 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time : ...................... 140 seconds
Speed compared to default : ......... 107 %
-----------------
Comparing results
Result      : Strongly similar,  Q= 100.0%
----------------------------------------------------------------
Done with 24my14ab.7469.476.12.39.186.wu
====================================================================
Current WU: 26ja07aa.30143.294304.16.43.191.wu
----------------------------------------------------------------
Running default app with command :... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda90 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time: ....................... 106 seconds
----------------------------------------------------------------
Running app with command : .......... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda101 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time : ...................... 97 seconds
Speed compared to default : ......... 109 %
-----------------
Comparing results
Result      : Strongly similar,  Q= 100.0%
----------------------------------------------------------------
Done with 26ja07aa.30143.294304.16.43.191.wu
====================================================================
Current WU: 30dc06ah.26541.18886.12.46.6.wu
----------------------------------------------------------------
Running default app with command :... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda90 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time: ....................... 109 seconds
----------------------------------------------------------------
Running app with command : .......... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda101 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time : ...................... 100 seconds
Speed compared to default : ......... 109 %
-----------------
Comparing results
Result      : Strongly similar,  Q= 100.0%
----------------------------------------------------------------
Done with 30dc06ah.26541.18886.12.46.6.wu
====================================================================
Current WU: blc22_2bit_guppi_58405_80305_HIP85620_0011.8907.818.21.44.173.vlar.wu
----------------------------------------------------------------
Running default app with command :... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda90 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time: ....................... 230 seconds
----------------------------------------------------------------
Running app with command : .......... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda101 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time : ...................... 221 seconds
Speed compared to default : ......... 104 %
-----------------
Comparing results
Result      : Strongly similar,  Q= 100.0%
----------------------------------------------------------------
Done with blc22_2bit_guppi_58405_80305_HIP85620_0011.8907.818.21.44.173.vlar.wu
====================================================================
Current WU: blc25_2bit_guppi_58340_44419_HIP3373_0047.29884.818.20.29.97.vlar.wu
----------------------------------------------------------------
Running default app with command :... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda90 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time: ....................... 316 seconds
----------------------------------------------------------------
Running app with command : .......... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda101 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time : ...................... 305 seconds
Speed compared to default : ......... 103 %
-----------------
Comparing results
Result      : Strongly similar,  Q= 100.0%
----------------------------------------------------------------
Done with blc25_2bit_guppi_58340_44419_HIP3373_0047.29884.818.20.29.97.vlar.wu
====================================================================
Current WU: blc25_2bit_guppi_58406_27943_HIP20917_0106.14763.818.22.45.17.vlar.wu
----------------------------------------------------------------
Running default app with command :... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda90 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time: ....................... 245 seconds
----------------------------------------------------------------
Running app with command : .......... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda101 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time : ...................... 236 seconds
Speed compared to default : ......... 103 %
-----------------
Comparing results
Result      : Strongly similar,  Q= 100.0%
----------------------------------------------------------------
Done with blc25_2bit_guppi_58406_27943_HIP20917_0106.14763.818.22.45.17.vlar.wu
====================================================================
Current WU: blc32_2bit_guppi_58406_17412_HIP2473_0076.2911.818.23.46.76.vlar.wu
----------------------------------------------------------------
Running default app with command :... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda90 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time: ....................... 261 seconds
----------------------------------------------------------------
Running app with command : .......... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda101 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time : ...................... 251 seconds
Speed compared to default : ......... 103 %
-----------------
Comparing results
Result      : Strongly similar,  Q= 100.0%
----------------------------------------------------------------
Done with blc32_2bit_guppi_58406_17412_HIP2473_0076.2911.818.23.46.76.vlar.wu
====================================================================
Current WU: blc35_2bit_guppi_58406_02921_HIP116398_0037.24560.818.22.45.57.vlar.wu
----------------------------------------------------------------
Running default app with command :... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda90 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time: ....................... 287 seconds
----------------------------------------------------------------
Running app with command : .......... setiathome_x41p_V0.98b1_x86_64-pc-linux-gnu_cuda101 -nobs -device 0
gCudaDevProps.multiProcessorCount = 5
Elapsed Time : ...................... 278 seconds
Speed compared to default : ......... 103 %
-----------------
Comparing results
Result      : Strongly similar,  Q= 100.0%
----------------------------------------------------------------
Done with blc35_2bit_guppi_58406_02921_HIP116398_0037.24560.818.22.45.57.vlar.wu
====================================================================
Done with Benchmark run! Removing temporary files!
CUDA 10.1 wins every time on my GTX 1050; YMMV.
If you look in the AIO folder, there is a similar test with v0.97; as with v0.98, the CUDA 10 app wins every time.
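Incidentally, the "Speed compared to default" figures in that log are just the two elapsed times expressed as an integer percentage. A quick check in Python, with the elapsed times copied from the log above (the truncation rather than rounding is my assumption, since 258 s vs. 244 s prints as 105%, not 106%):

```python
# Recompute the "Speed compared to default" column from the benchmark
# log above: (cuda90 seconds, cuda101 seconds) per work unit.
elapsed = [
    (258, 244), (150, 140), (106, 97), (109, 100), (230, 221),
    (316, 305), (245, 236), (261, 251), (287, 278),
]

# Integer (floor) division reproduces the logged percentages exactly.
speedups = [default * 100 // new for default, new in elapsed]
print(speedups)  # [105, 107, 109, 109, 104, 103, 103, 103, 103]
```

So the CUDA 10.1 build is 3-9% faster on every work unit in this run.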
ID: 2002179
Profile Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13164
Credit: 1,160,866,277
RAC: 1,873
United States
Message 2002342 - Posted: 13 Jul 2019, 1:08:49 UTC

Looks like Ubuntu 18.04 LTS is going to get the latest Nvidia proprietary drivers in the main distro from now on. No need to install the PPA repository. The very latest 430-series drivers are available in the -proposed sources, which you toggle on in the Software & Updates application's Developer Options page.

I think the 418 drivers are already in the Main distro.

This article and video over at OMG Ubuntu talks about it.

https://www.omgubuntu.co.uk/2019/07/install-nvidia-driver-update-ubuntu-its
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 2002342
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 2002347 - Posted: 13 Jul 2019, 1:33:01 UTC - in response to Message 2002342.  

Looks like Ubuntu 18.04 LTS is going to get the latest Nvidia proprietary drivers in the main distro from now on. No need to install the PPA repository. The very latest 430-series drivers are available in the -proposed sources, which you toggle on in the Software & Updates application's Developer Options page.
I think the 418 drivers are already in the Main distro.
This article and video over at OMG Ubuntu talks about it.
https://www.omgubuntu.co.uk/2019/07/install-nvidia-driver-update-ubuntu-its


. . That is good news ... but on that subject, when I installed the PPA on this rig it asked me to run benchmarks to aid in driver development. That sounded fair, but it was a BIG mistake: Phoronix installed and ran, but afterwards I could not run SETI because Phoronix had used up the 3.8 GB of free space on this 16 GB flash drive. So I thought, don't panic, just remove Phoronix and do an autoclean/autoremove! Well, that only recovered 400 MB; a reboot took it up to 640 MB free, so with a little tweaking of BOINC limits I was able to get it running again. But bit by bit it dropped to below 400 MB again, so I had to clear out some junk from my odds-and-ends folder to keep going.

. . So, I have now decided to migrate BOINC/SETI (well, the whole shebang actually) onto a 120 GB SSD. This seems like a good time to update the OS (I still want to fly with Lubuntu because I like the look/feel so much more) and maybe move away from the repository version of BOINC to one of TBar's later versions. So if I set up the new drive with Lubuntu and install the new BOINC app, can I migrate the SETI account from the old repository BOINC on the flash drive to the new BOINC without losing anything?

Stephen

? ?
ID: 2002347
Profile Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13164
Credit: 1,160,866,277
RAC: 1,873
United States
Message 2002359 - Posted: 13 Jul 2019, 2:30:29 UTC - in response to Message 2002347.  

You must not have looked at what you were installing from the PPA. I have never had the Phoronix Test Suite installed by the PPA when just adding the repository.

Yes, there is a command on the PPA repository page to install the Phoronix Test Suite for benchmarking and testing purposes. But you have to specifically invoke that command to install the test suite. It does not state anywhere that you need to install the test suite to install the PPA.

Someone just asked this today elsewhere. Yes, just copy the account* files from the distro BOINC folder to the AIO folder. That way you won't have to join again.
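That copy step can be sketched in a few lines of Python. This is only an illustration: the function name is mine, the example paths are assumptions (on Debian/Ubuntu the repo BOINC data directory is usually /var/lib/boinc-client), and it relies on BOINC's convention of naming project account files account_<project-URL>.xml.

```python
# Sketch: carry a BOINC account over to a new (e.g. AIO) data folder
# by copying only the account_*.xml files, leaving everything else
# (client_state.xml, tasks, etc.) behind.
import glob
import os
import shutil

def copy_account_files(src_dir, dst_dir):
    """Copy account_*.xml from src_dir to dst_dir; return copied names."""
    copied = []
    for path in glob.glob(os.path.join(src_dir, "account_*.xml")):
        shutil.copy2(path, dst_dir)  # copy2 preserves timestamps
        copied.append(os.path.basename(path))
    return sorted(copied)

# Example with hypothetical paths:
# copy_account_files("/var/lib/boinc-client", "/home/user/BOINC")
```

Run it with the new BOINC client stopped, then start the client and it should attach to the same account without asking you to join again.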
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 2002359
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 2002392 - Posted: 13 Jul 2019, 14:27:29 UTC - in response to Message 2002359.  
Last modified: 13 Jul 2019, 14:36:31 UTC

You must not have looked at what you were installing from the PPA. I have never had the Phoronix Test Suite installed by the PPA when just adding the repository.

Yes, there is a command on the PPA repository page to install the Phoronix Test Suite for benchmarking and testing purposes. But you have to specifically invoke that command to install the test suite. It does not state anywhere that you need to install the test suite to install the PPA.

Someone just asked this today elsewhere. Yes, just copy the account* files from the distro BOINC folder to the AIO folder. That way you won't have to join again.


. . No, I didn't say it was 'necessary'! After I had installed the PPA it asked if I would participate and assist the developers by running some benchmarks before and after updating the video drivers. I did not anticipate something the size of the full Phoronix Test Suite :( {Silly me!} so I said yes, and the rest is history.

. . Is it really that simple? With the repo version the files are scattered over several folders, at least 2, in /root. But does just copying them (all) from those two folders into the new BOINC folder allow it to work with the new version of BOINC? I really do not want to lose the host ID and its history ...

Stephen

? ?
ID: 2002392
Profile Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13164
Credit: 1,160,866,277
RAC: 1,873
United States
Message 2002394 - Posted: 13 Jul 2019, 15:04:41 UTC - in response to Message 2002392.  

No, not all the files. You don't want the repo version of BOINC after all, since you are changing to the AIO version. Just copy the account files that identify you, and the configuration files.
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 2002394
Profile Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13164
Credit: 1,160,866,277
RAC: 1,873
United States
Message 2002395 - Posted: 13 Jul 2019, 15:13:02 UTC - in response to Message 2002392.  

. . No, I didn't say it was 'necessary'! After I had installed the PPA it asked if I would participate and assist the developers by running some benchmarks before and after updating the video drivers. I did not anticipate something the size of the full Phoronix Test Suite :( {Silly me!} so I said yes, and the rest is history.

I still don't know how you triggered that. It must be because of your version of the distro or something.

I've never had installing the repo trigger a question to install the Phoronix suite, and neither have I heard of it from anyone else. And that counts dozens of people. You must be special, Stephen :->)
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 2002395
TBar
Volunteer tester

Joined: 22 May 99
Posts: 5204
Credit: 840,779,836
RAC: 2,768
United States
Message 2002410 - Posted: 13 Jul 2019, 16:37:41 UTC - in response to Message 2002342.  
Last modified: 13 Jul 2019, 17:11:16 UTC

Looks like Ubuntu 18.04 LTS is going to get the latest Nvidia proprietary drivers in the main distro from now on. No need to install the PPA repository. The very latest 430-series drivers are available in the -proposed sources, which you toggle on in the Software & Updates application's Developer Options page.

I think the 418 drivers are already in the Main distro.
This article and video over at OMG Ubuntu talks about it.
https://www.omgubuntu.co.uk/2019/07/install-nvidia-driver-update-ubuntu-its
Not only that...
I gave this a try on my 18.04.2 system and it offered an update to kernel 5.0.0-21. The update to 5.0.0-21 is new to 19.04 too, so I'm assuming it is being passed to 18.04 because of the Developer Options change. So far it seems to be working OK:
Linux Ubuntu
Ubuntu 18.04.2 LTS [5.0.0-21-generic|libc 2.27 (Ubuntu GLIBC 2.27-3ubuntu1)]


I would give 430 a try, but, I got the Mining case yesterday and I'm deep into looking it over and pondering the possibilities...
I did test driver 410.104 with CUDA 9 & 10.0, and it was the same as with 10.1; CUDA 9 is better with driver 410.104, but 10.0 still beats it every time on my 1050.

BTW, I also got a new 64 GB USB 3.1 stick yesterday ... for about $7. I'm using a $27 120 GB SSD for a couple of systems. There isn't any reason to keep using a too-small drive and keep suffering problems.
ID: 2002410
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 2002443 - Posted: 13 Jul 2019, 22:24:33 UTC - in response to Message 2002395.  

. . No, I didn't say it was 'necessary'! After I had installed the PPA it asked if I would participate and assist the developers by running some benchmarks before and after updating the video drivers. I did not anticipate something the size of the full Phoronix Test Suite :( {Silly me!} so I said yes, and the rest is history.

I still don't know how you triggered that. Must be because of your version of the distro or something.

I've never had installing the repo trigger a question to install the phoronix suite. Neither have I ever heard of such from anyone else. And that counts dozens of people. You must be special Stephen :->)


. . Maybe it's the tattoo on my forehead saying "Sucker!" :)

Stephen

:(
ID: 2002443
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 2002446 - Posted: 13 Jul 2019, 22:36:04 UTC - in response to Message 2002410.  

Not only that...
I gave this a try on my 18.04.2 system and it offered an update to the Kernel 5.0.0-21. The update to 5.0.0-21 is new to 19.04 too, so, I'm assuming it is being passed to 18.04 because of the Developer Options change. So far it seems to be working OK,
Linux Ubuntu
Ubuntu 18.04.2 LTS [5.0.0-21-generic|libc 2.27 (Ubuntu GLIBC 2.27-3ubuntu1)]

I would give 430 a try, but, I got the Mining case yesterday and I'm deep into looking it over and pondering the possibilities...
I did test driver 410.104 with CUDA 9 & 10.0, and it was the same as with 10.1; CUDA 9 is better with driver 410.104, but 10.0 still beats it every time on my 1050.
BTW, I also got a new 64GB USB 3.1 stick yesterday....for about $7. I'm using a $27 120GB SSD for a couple of systems. There isn't any reason to keep using a too small drive and keep suffering problems.


. . That is where I am at. It had been doing fine until my Phoronix acceptance blunder, but clearly it is time to move up, especially with the prospect of updating to the later releases of Linux and BOINC.

Stephen

< bullet biting time >
ID: 2002446
©2024 University of California
 
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.