Don't know where it should go? Stick it here!

Message boards : Number crunching : Don't know where it should go? Stick it here!

Previous · 1 . . . 80 · 81 · 82 · 83 · 84 · 85 · 86 . . . 147 · Next

Profile Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13164
Credit: 1,160,866,277
RAC: 1,873
United States
Message 2004100 - Posted: 24 Jul 2019, 18:42:24 UTC - in response to Message 2004094.  

this says it all

<message>
finish file present too long</message>


It's a known problem with the BOINC client itself.

You will have to wait for DA to release the "imminent" (ha-ha-ha) BOINC 7.16.x client to get rid of the finish file present too long messages. That fix has been in the current master for months now.
I was certainly fed up with doing the work and getting no credit for the tasks because of these "no fault of my own" errors. I use a client that incorporates the fix since I compile my own client.
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 2004100
Profile Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13164
Credit: 1,160,866,277
RAC: 1,873
United States
Message 2004101 - Posted: 24 Jul 2019, 18:49:17 UTC - in response to Message 2004095.  

Hi Ian,

What file does that refer to?

Have a great day! :)

Siran

It is referring to the BOINC finish file, IOW the computed output. The old code didn't allow enough time for a busy system to upload the result and clear the file out, so when it checks and still sees the file present, it pukes the message and invalidates your work.

#3017
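To make the failure mode concrete: the science app drops a "finish file" when it has finished and is then expected to exit promptly; if the client still sees the app running after a short grace period, it errors the task. Here is a hypothetical Python sketch of that check - not the actual BOINC client code (which is C++), and the timeout constants below are illustrative only:

```python
# Illustrative constants, NOT the real BOINC values: the old client
# used a short grace period; the fix referenced above lengthens it.
OLD_GRACE_SECONDS = 10
NEW_GRACE_SECONDS = 300

def check_finish_file(finish_file_age, app_still_running,
                      grace=NEW_GRACE_SECONDS):
    """Sketch of the client-side check. The science app writes a
    'finish file' when done and is then expected to exit. If the app
    is still running once the grace period has passed, the client
    gives up and errors the task."""
    if not app_still_running:
        return "task complete"
    if finish_file_age > grace:
        return "finish file present too long"  # task is invalidated
    return "waiting"
```

With the short grace period a busy host can easily trip the error through no fault of its own; a longer one gives a loaded system time to let the app wind down.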
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 2004101
Profile Siran d'Vel'nahr
Volunteer tester
Joined: 23 May 99
Posts: 7379
Credit: 44,181,323
RAC: 238
United States
Message 2004104 - Posted: 24 Jul 2019, 19:15:24 UTC - in response to Message 2004101.  

Hi Ian,

What file does that refer to?

Have a great day! :)

Siran

It is referring to the BOINC finish file, IOW the computed output. The old code didn't allow enough time for a busy system to upload the result and clear the file out, so when it checks and still sees the file present, it pukes the message and invalidates your work.

#3017

Hi Keith,

Ah, so it's not through any fault of mine or the PC. :)

Thanks Keith and have a great day! :)

Siran
CAPT Siran d'Vel'nahr - L L & P _\\//
Winders 11 OS? "What a piece of junk!" - L. Skywalker
"Logic is the cement of our civilization with which we ascend from chaos using reason as our guide." - T'Plana-hath
ID: 2004104
rob smith Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer moderator
Volunteer tester

Joined: 7 Mar 03
Posts: 22189
Credit: 416,307,556
RAC: 380
United Kingdom
Message 2004123 - Posted: 24 Jul 2019, 21:30:03 UTC

Spare a thought....
A few weeks ago I completed the thermal management design for the next "big project": a multi-GPU, multi-CPU, multi-FPGA beast.
This was for 256 Quadro RTX 6000, 16 pairs of Cortex-10 8-core, 16 pairs of Xeon 16-core (Platinum version, real cores only, no hyper-threading), 8 quads of Xeon 8-core (same again), all with "more than adequate" RAM and a load of FPGAs to do some custom stuff. Sitting alongside this were the 2 pairs of Xeon 8-core (you've guessed it) just doing the monitoring of the PSUs, fridges, air-flow management and the like. This lot is chilled-air cooled (air on at about 7 C, off at about 30 C), with total dissipation about 85 kW in a closed-loop cooling arrangement with a capacity of >100 kW, so part of the cooling can be lost while maintaining net cooling until a repair can be undertaken.
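As a rough cross-check of those numbers (not from the post itself): with air on at 7 C and off at 30 C, the airflow needed to carry 85 kW follows from Q = m_dot x cp x dT, using textbook values for air:

```python
# Back-of-envelope airflow check for ~85 kW carried by air heated
# from 7 C to 30 C. cp and density are standard textbook values for
# air; the temperatures and heat load come from the post above.
Q_watts = 85_000          # total dissipation [W]
t_in, t_out = 7.0, 30.0   # air-on / air-off temperatures [C]
cp_air = 1005.0           # specific heat of air [J/(kg*K)]
rho_air = 1.2             # approximate air density [kg/m^3]

m_dot = Q_watts / (cp_air * (t_out - t_in))  # mass flow [kg/s]
v_dot = m_dot / rho_air                      # volume flow [m^3/s]

print(f"mass flow ~{m_dot:.2f} kg/s, volume flow ~{v_dot:.2f} m^3/s")
```

So the chillers have to move on the order of 3 cubic metres of air per second just to hold that temperature rise.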
The decision has been made that the Quadros will be RTX 8000s, and that there will be some detail changes in the FPGAs and the physical layout of the clustering of the various processors. I've now got to re-run the cooling-requirement calcs and airflow models to make sure cold air gets to where it is needed, and the hot air gets back to the chiller units without hitting anything it shouldn't on the way.
Needless to say the commissioning date hasn't been moved.....

The guys writing the OS (a "perverted" Linux which makes this lot look to the outside world like 1 CPU plus a very large number of co-processors - I think it appears to be 1023 NVIDIA GPUs!!!) have been having "fun", and think they've got it stable at last. Meanwhile the guys writing the applications are getting to dry-run testing on a small subset of 8 Quadros plus the supporting processors. As I understand it the Cortex chips are acting as some sort of PCIe server, while the first block of Xeons is split between data serving and data coagulation within the cells, and the other Xeons collate the data to and from the cells and service the storage interfaces.

(Now if only I could get that lot up onto SETI for a couple of days in mid/late August....)
Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
ID: 2004123
Profile Keith Myers Special Project $250 donor
Volunteer tester
Joined: 29 Apr 01
Posts: 13164
Credit: 1,160,866,277
RAC: 1,873
United States
Message 2004125 - Posted: 24 Jul 2019, 21:35:16 UTC

If only . . . . . . dreaming . . . Sounds like fun. LOL.
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 2004125
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13731
Credit: 208,696,464
RAC: 304
Australia
Message 2004132 - Posted: 24 Jul 2019, 22:24:15 UTC - in response to Message 2004101.  

It is referring to the BOINC finish file, IOW the computed output. The old code didn't allow enough time for a busy system to upload the result and clear the file out, so when it checks and still sees the file present, it pukes the message and invalidates your work. #3017

I've had it when re-booting a system; BOINC not cleaning things up fast enough when the system is shutting down, so these days I exit BOINC, wait 10 or more seconds & then re-boot. Haven't had a Finish file present too long error since then.
Though it would be much nicer to actually have the problem fixed, instead of having to work around it each & every time.
Grant
Darwin NT
ID: 2004132
Profile Siran d'Vel'nahr
Volunteer tester
Joined: 23 May 99
Posts: 7379
Credit: 44,181,323
RAC: 238
United States
Message 2004208 - Posted: 25 Jul 2019, 9:52:01 UTC - in response to Message 2004132.  

It is referring to the BOINC finish file, IOW the computed output. The old code didn't allow enough time for a busy system to upload the result and clear the file out, so when it checks and still sees the file present, it pukes the message and invalidates your work. #3017

I've had it when re-booting a system; BOINC not cleaning things up fast enough when the system is shutting down, so these days I exit BOINC, wait 10 or more seconds & then re-boot. Haven't had a Finish file present too long error since then.
Though it would be much nicer to actually have the problem fixed, instead of having to work around it each & every time.

Hi Grant,

I think I know why I got that error. I have BOINC set up so that when it detects that I have started World of Warcraft, it suspends crunching and then restarts after I leave WoW.

I agree that the fix needs to be in place to take care of it. :) My only other option is to suspend or shut down BOINC by hand before playing WoW and then start it back up by hand. :(

Have a great day! :)

Siran
CAPT Siran d'Vel'nahr - L L & P _\\//
Winders 11 OS? "What a piece of junk!" - L. Skywalker
"Logic is the cement of our civilization with which we ascend from chaos using reason as our guide." - T'Plana-hath
ID: 2004208
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 2004215 - Posted: 25 Jul 2019, 12:19:15 UTC - in response to Message 2001674.  

before I moved to my open rack setup. I was running [7] GPUs all internal like this:
and this wasn't on SETI, but I ran less powerful GPUs in a row of [8] very similarly like this:
If you're using something low power like 1060s, you could cram 8 of them in the same fashion and run SETI. both setups have the lid off for pictures, but I ran them fully closed.


. . So how did the steaks grill on top of them? Did they char up OK? :)

Stephen

:)
ID: 2004215
Profile Tom M
Volunteer tester

Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 2004216 - Posted: 25 Jul 2019, 13:28:55 UTC - in response to Message 2004215.  

before I moved to my open rack setup. I was running [7] GPUs all internal like this:
and this wasn't on SETI, but I ran less powerful GPUs in a row of [8] very similarly like this:
If you're using something low power like 1060s, you could cram 8 of them in the same fashion and run SETI. both setups have the lid off for pictures, but I ran them fully closed.


. . So how did the steaks grill on top of them? Did they char up OK? :)

Stephen

:)


Or just let them cook a little longer. They will still get tender :)

Tom
A proud member of the OFA (Old Farts Association).
ID: 2004216
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13731
Credit: 208,696,464
RAC: 304
Australia
Message 2004284 - Posted: 26 Jul 2019, 0:10:35 UTC
Last modified: 26 Jul 2019, 0:11:30 UTC

For those with an interest in CPUs - particularly TDP, frequency & overclocking - you really need to read this interview at AnandTech.
Talking TDP, Turbo and Overclocking: An Interview with Intel Fellow Guy Therien
Guy Therien is one of those long-time Intel 'lifer' engineers who seem to stick around for decades. He's been at the company since February 1993, moving up through the various platform engineering roles until he was made a corporate fellow in January 2018. He holds 25 US patents, almost exclusively in the fields of processor performance states, power management, thermal management, thread migration, and power budget allocation. Guy has been with Intel as it has adapted its use of TDP on processors, and one of his most recent projects has been identifying how TDP and turbo should be implemented and interpreted by the OEMs and the motherboard partners.

Grant
Darwin NT
ID: 2004284
Profile Tom M
Volunteer tester

Joined: 28 Nov 02
Posts: 5124
Credit: 276,046,078
RAC: 462
Message 2004309 - Posted: 26 Jul 2019, 3:23:25 UTC - in response to Message 2004284.  

For those with an interest in CPUs - particularly TDP, frequency & overclocking - you really need to read this interview at AnandTech.
Talking TDP, Turbo and Overclocking: An Interview with Intel Fellow Guy Therien
Guy Therien is one of those long-time Intel 'lifer' engineers who seem to stick around for decades. He's been at the company since February 1993, moving up through the various platform engineering roles until he was made a corporate fellow in January 2018. He holds 25 US patents, almost exclusively in the fields of processor performance states, power management, thermal management, thread migration, and power budget allocation. Guy has been with Intel as it has adapted its use of TDP on processors, and one of his most recent projects has been identifying how TDP and turbo should be implemented and interpreted by the OEMs and the motherboard partners.


+1

Reading now.
Tom
A proud member of the OFA (Old Farts Association).
ID: 2004309
Profile Bernie Vine
Volunteer moderator
Volunteer tester
Joined: 26 May 99
Posts: 9954
Credit: 103,452,613
RAC: 328
United Kingdom
Message 2004491 - Posted: 27 Jul 2019, 8:24:26 UTC
Last modified: 27 Jul 2019, 8:25:46 UTC

Today, as most days, one of the first things I did was to check BoincTasks, to find to my surprise that my 2nd Win machine (the one I put the new MB in earlier this year) was disconnected.

So switching on the screen I find that it has rebooted, and is waiting for me to sign in. This is odd because since the new MB and a total re-install of Windows it has not been a problem. I log in and, thinking it may have been a Windows update, go to look at the Win event log; before I can click on it, the screen goes blank. Nothing I can do will restore it. A feeling of impending doom!! I restart it by holding in the power button, it reboots, I log in and almost immediately black screen. I try to remote in, which works, as it asks for and accepts my PW, but all I get is a black screen. It is then I notice that it is showing in BoincTasks; however, whilst the two CPU tasks are progressing, the GPU task is static, not progressing. I then notice that there are 27 GPU tasks "waiting to run". It would seem my GTX 970 has died. Now I know that 5-year-old GPUs subject to 24-hour heavy use do fail. However this card was special to me.

Back when the 900 series first came out I was looking for a new GPU, so I asked on these forums how people rated the GTX 950 and GTX 960 in terms of value for money. At the time I was watching the pennies, so really a 950 was the best I could afford. That same day I got a PM from another setizen, who said he had a spare GTX 970 and asked would I be interested. Well yes I was. I PM'd him back asking how much he wanted. His reply totally surprised me: he didn't want anything; he had updated his GPUs to 980s and would be happy that another cruncher could put it to good use. Which I did. This was late 2014 and it has been either gaming or crunching 24/7 since.

He sadly left SETI after a discussion on Nitpicker convinced him SETI@Home was a total waste of time. Sad.

RIP GTX 970

So if you have read this far you will know I need a new GPU. Question is: a 970 for nostalgia, or a 1060 for an update?
ID: 2004491
Grant (SSSF)
Volunteer tester

Joined: 19 Aug 99
Posts: 13731
Credit: 208,696,464
RAC: 304
Australia
Message 2004493 - Posted: 27 Jul 2019, 8:48:06 UTC - in response to Message 2004491.  
Last modified: 27 Jul 2019, 8:58:36 UTC

So if you have read this far you will know I need a new GPU, question is , a 970 for nostalgia or a 1060 for an update?
What you really need to do is decide how much you're prepared to spend, and decide what's your absolute not-to-go-past upper limit.
Then check Shaggie's graphs, then check out the prices of what is available.

From memory you have a GTX 1660Ti? And from a price/performance efficiency point of view it's hard to beat.
If a GTX 1060 is considerably (and I mean a lot) cheaper than the GTX 1660Ti, it would certainly be a much better choice than any GTX 970 IMHO.
But if you've got a bit more money the RTX 2060 is an excellent worker, and with all the new cards recently announced & released you might be able to get a good price on one...
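Grant's selection heuristic boils down to: fix a hard budget, then rank what's affordable by throughput per unit price. A toy Python sketch, with entirely made-up prices and throughput figures (real ones would come from Shaggie's graphs and current listings):

```python
# Toy GPU-picking sketch for the heuristic above: set a hard budget,
# then rank the affordable cards by throughput per currency unit.
# All numbers below are invented for illustration only.
cards = [
    # (name, price, relative SETI throughput)
    ("GTX 970",     160, 1.0),
    ("GTX 1060",    150, 1.3),
    ("GTX 1660 Ti", 280, 2.2),
    ("RTX 2060",    330, 2.6),
]
budget = 300

affordable = [c for c in cards if c[1] <= budget]
best = max(affordable, key=lambda c: c[2] / c[1])
print(best[0])
```

With these invented numbers the mid-range card wins on value; plug in real prices and the answer can easily flip, which is exactly why checking current listings matters.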
Grant
Darwin NT
ID: 2004493
Profile tullio
Volunteer tester

Joined: 9 Apr 04
Posts: 8797
Credit: 2,930,782
RAC: 1
Italy
Message 2004494 - Posted: 27 Jul 2019, 8:50:00 UTC

I have a GTX 1050 with no additional power connector on a Windows 8.1 PC, and a GTX 1060 with an additional power connector on a Windows 10 PC. I have a Linux virtual machine with SuSE Tumbleweed on the 8.1 PC, but it does not see the GTX 1050, so no GPU tasks on that virtual machine.
Tullio
ID: 2004494
rob smith Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer moderator
Volunteer tester

Joined: 7 Mar 03
Posts: 22189
Credit: 416,307,556
RAC: 380
United Kingdom
Message 2004495 - Posted: 27 Jul 2019, 8:51:53 UTC

The 970 is still a fair GPU; the 1060 is possibly a bit better.
Nostalgia? I think it is up to the owner to decide, not for others to laugh at.
Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
ID: 2004495
Profile B. Ahmet KIRAN

Joined: 19 Oct 14
Posts: 77
Credit: 36,140,903
RAC: 140
Turkey
Message 2004500 - Posted: 27 Jul 2019, 11:36:38 UTC

I am having big trouble downloading files since last week... Could this be due to my earlier upgrade to Windows 10? I do have two other machines also newly upgraded to 10, but none of them shows this incredible trouble... Can anyone help?
ID: 2004500
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 2004504 - Posted: 27 Jul 2019, 13:06:51 UTC - in response to Message 2004491.  

RIP GTX 970

So if you have read this far you will know I need a new GPU, question is , a 970 for nostalgia or a 1060 for an update?

. . Hi Bernie,

. . Well I am quite fond of my GTX970s too, but my 1060-6GB cards match them in performance, use less power, run cooler and probably represent better value for money. If that is of any help to you.

Stephen

:)
ID: 2004504
Stephen "Heretic" Crowdfunding Project Donor*Special Project $75 donorSpecial Project $250 donor
Volunteer tester
Joined: 20 Sep 12
Posts: 5557
Credit: 192,787,363
RAC: 628
Australia
Message 2004505 - Posted: 27 Jul 2019, 13:12:46 UTC - in response to Message 2004493.  

What you really need to do is decide how much you're prepared to spend, and decide what's your absolute not to go past upper limit.
Then check Shaggie's graphs, then check out the prices of what is available.
From memory you have a GTX 1660Ti? And from a price/performance efficiency point of view it's hard to beat.
If a GTX 1060 is considerably (and I mean a lot) cheaper than the GTX 1660Ti, it would certainly be a much better choice than any GTX 970 IMHO.
But if you've got a bit more money the RTX 2060 is an excellent worker, and with all the new cards recently announced & released you might be able to get a good price on one...

. . This is probably not a lot of help but might give an indication of value out there: Kogan recently had a special on ASUS RTX 2070 cards for $399 AUD; sadly I was too late and missed the boat. A pair of those would have been ... wait for it ... AWESOME! Sorry, for some reason I just channelled HIMYM ...

Stephen

:(
ID: 2004505
Profile Bernie Vine
Volunteer moderator
Volunteer tester
Joined: 26 May 99
Posts: 9954
Credit: 103,452,613
RAC: 328
United Kingdom
Message 2004522 - Posted: 27 Jul 2019, 15:04:33 UTC
Last modified: 27 Jul 2019, 15:05:13 UTC

Well after a lot of web browsing (some of the prices!!!) I decided on a GTX 1660 (not a Ti, as those two letters push it out of my price range) ;-)

https://www.amazon.co.uk/gp/product/B07PNH3C4N/ref=ppx_yo_dt_b_asin_title_o00_s00?ie=UTF8&psc=1

I had to keep the size down, as I found out when using the temporary 750 Ti that long cards foul the SATA sockets!!

I decided as the MB is only 6 months old why not splash out on a new GPU as well.

I will be interested to see how it compares to the Ti in my other Win machine.
ID: 2004522
Ian&Steve C.
Joined: 28 Sep 99
Posts: 4267
Credit: 1,282,604,591
RAC: 6,640
United States
Message 2004525 - Posted: 27 Jul 2019, 15:36:34 UTC - in response to Message 2004522.  

I have a 1660 (non-Ti); it runs about as fast as a 1070 with lower power consumption.
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours

ID: 2004525

©2024 University of California
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.