Panic Mode On (108) Server Problems?

Message boards : Number crunching : Panic Mode On (108) Server Problems?
Brent Norman · Volunteer tester
Joined: 1 Dec 99 · Posts: 2786 · Credit: 685,657,289 · RAC: 835 · Canada
Message 1903534 - Posted: 29 Nov 2017, 15:56:47 UTC

Another attempt at splitting under way ... still all errors :((
ID: 1903534
kittyman · Volunteer tester
Joined: 9 Jul 00 · Posts: 51464 · Credit: 1,018,363,574 · RAC: 1,004 · United States
Message 1903550 - Posted: 29 Nov 2017, 16:48:08 UTC - in response to Message 1903548.  

No work, no power consumption. Computer shut down.
Posting from a 4 watt Android tablet.

No hurry getting it fixed. I save money for every hour it's down :-)

Why not wait until next year to bring more work?

And what would be the fun in that?
Meow.
"Freedom is just Chaos, with better lighting." Alan Dean Foster

ID: 1903550
Brent Norman · Volunteer tester
Joined: 1 Dec 99 · Posts: 2786 · Credit: 685,657,289 · RAC: 835 · Canada
Message 1903552 - Posted: 29 Nov 2017, 16:50:51 UTC - in response to Message 1903463.  

Apparently the Master Science database (that's the long-term storage of completed results, not the day-to-day BOINC server that we interact with every day) crashed after 241 days of continuous running. It's restarted, and appears to be running OK, but I guess they're keeping it lightly loaded overnight so they can run further tests in daylight.
I wonder if that is the actual cause of the splitter issue since the splitters failed 3 hours before maintenance when assimilation was still in progress for MB tasks.
ID: 1903552
kittyman · Volunteer tester
Joined: 9 Jul 00 · Posts: 51464 · Credit: 1,018,363,574 · RAC: 1,004 · United States
Message 1903555 - Posted: 29 Nov 2017, 16:51:57 UTC - in response to Message 1903551.  

No work, no power consumption. Computer shut down.
Posting from a 4 watt Android tablet.

No hurry getting it fixed. I save money for every hour it's down :-)

Why not wait until next year to bring more work?

And what would be the fun in that?
Meow.

The fun was in writing it, and getting reactions, Mark :-)

Well, not much crunching fun going on right now.......have to stay entertained somehow.
"Freedom is just Chaos, with better lighting." Alan Dean Foster

ID: 1903555
JaundicedEye
Joined: 14 Mar 12 · Posts: 5375 · Credit: 30,870,693 · RAC: 1 · United States
Message 1903573 - Posted: 29 Nov 2017, 18:14:57 UTC

I thought it was just me..........and things were going so well before the outage.........sigh.....

"Sour Grapes make a bitter Whine." <(0)>
ID: 1903573
kittyman · Volunteer tester
Joined: 9 Jul 00 · Posts: 51464 · Credit: 1,018,363,574 · RAC: 1,004 · United States
Message 1903574 - Posted: 29 Nov 2017, 18:16:35 UTC - in response to Message 1903573.  

I thought it was just me..........and things were going so well before the outage.........sigh.....

I woulda thought they might have had it sorted by now.
But not looking any better so far.
Meowsigh.
"Freedom is just Chaos, with better lighting." Alan Dean Foster

ID: 1903574
JaundicedEye
Joined: 14 Mar 12 · Posts: 5375 · Credit: 30,870,693 · RAC: 1 · United States
Message 1903576 - Posted: 29 Nov 2017, 18:19:06 UTC

Or as the VA says........Patients! Patients! Patients!

(sorry, not much else going on....)

"Sour Grapes make a bitter Whine." <(0)>
ID: 1903576
Grant (SSSF) · Volunteer tester
Joined: 19 Aug 99 · Posts: 13370 · Credit: 208,696,464 · RAC: 304 · Australia
Message 1903578 - Posted: 29 Nov 2017, 18:21:26 UTC

AP assimilators are still offline, and AP validation is still backing up, which has been happening since late on Nov 26th.
Possibly related to the present splitter issues, or just a coincidence?
Grant
Darwin NT
ID: 1903578
kittyman · Volunteer tester
Joined: 9 Jul 00 · Posts: 51464 · Credit: 1,018,363,574 · RAC: 1,004 · United States
Message 1903579 - Posted: 29 Nov 2017, 18:25:21 UTC - in response to Message 1903576.  

Or as the VA says........Patients! Patients! Patients!

(sorry, not much else going on....)

"And there is no joy in Setiland, all the splitters have struck out."
Meow.
"Freedom is just Chaos, with better lighting." Alan Dean Foster

ID: 1903579
betreger
Joined: 29 Jun 99 · Posts: 11147 · Credit: 29,581,041 · RAC: 66 · United States
Message 1903580 - Posted: 29 Nov 2017, 18:27:41 UTC
Last modified: 29 Nov 2017, 18:27:53 UTC

The well is dry here and Einstein benefits.
ID: 1903580
David@home · Volunteer tester
Joined: 16 Jan 03 · Posts: 750 · Credit: 5,040,916 · RAC: 28 · United Kingdom
Message 1903581 - Posted: 29 Nov 2017, 18:30:52 UTC - in response to Message 1903579.  
Last modified: 29 Nov 2017, 18:31:24 UTC

Is there any way to have the 100 work unit limit in SETI@home extended? E.g. is there a config parameter I can tweak? I ran out during the maintenance slot and have only had a handful since.
ID: 1903581
Richard Haselgrove · Volunteer tester
Joined: 4 Jul 99 · Posts: 14505 · Credit: 200,643,578 · RAC: 874 · United Kingdom
Message 1903584 - Posted: 29 Nov 2017, 18:34:05 UTC - in response to Message 1903581.  

Is there any way to have the 100 work unit limit in SETI@home extended? E.g. is there a config parameter I can tweak? I ran out during the maintenance slot and have only had a handful since.
Buy a second GPU and use it for another project. My GTX 970s are working for GPUGrid, but I managed to top up to 200 tasks on each machine after the outage, so the 750 Tis still have plenty left.
ID: 1903584
Phil Burden
Joined: 26 Oct 00 · Posts: 264 · Credit: 22,303,899 · RAC: 0 · United Kingdom
Message 1903586 - Posted: 29 Nov 2017, 18:35:38 UTC - in response to Message 1903581.  
Last modified: 29 Nov 2017, 18:37:33 UTC

Is there any way to have the 100 work unit limit in SETI@home extended? E.g. is there a config parameter I can tweak? I ran out during the maintenance slot and have only had a handful since.


A question that has been posed many times over the years, and the simple answer is no: SETI is set to dish out a maximum of 100 tasks per CPU and 100 tasks per GPU.
Plus, you can also do as Richard suggests and let your PC work on other deserving projects ;-)

P.
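
For what it's worth, that cap is a server-side BOINC scheduler setting, not anything in the client. A sketch of where it lives (values are illustrative; only the project admins can touch this file, so there is genuinely nothing for a volunteer to tweak):

```xml
<!-- Fragment of a BOINC project's server-side config.xml (illustrative).
     These limits are enforced by the scheduler; volunteers cannot override them. -->
<config>
  <!-- maximum tasks in progress per CPU on a host -->
  <max_wus_in_progress>100</max_wus_in_progress>
  <!-- maximum tasks in progress per GPU on a host -->
  <max_wus_in_progress_gpu>100</max_wus_in_progress_gpu>
</config>
```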
ID: 1903586
Keith Myers · Volunteer tester
Joined: 29 Apr 01 · Posts: 13157 · Credit: 1,160,866,277 · RAC: 1,873 · United States
Message 1903587 - Posted: 29 Nov 2017, 18:37:55 UTC - in response to Message 1903581.  

No, nothing can be configured on our end; the project servers control the hard task limit. The only thing we can do is "bunker" tasks for the Tuesday outage using one of the SETI "Reschedulers" available here in this forum; a search will turn up your choices. I normally bunker enough to get mostly through the outage, except for the Linux host. But with this breakdown yesterday and today, every host is out of work. That is why it is a good idea to have a "backup" project to process when SETI falls over.
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 1903587
Piotr
Joined: 24 May 17 · Posts: 18 · Credit: 20,069,282 · RAC: 41 · Poland
Message 1903589 - Posted: 29 Nov 2017, 18:56:47 UTC

Come on, my flat is starting to freeze and it's below zero C outside!
ID: 1903589
marsinph · Volunteer tester
Joined: 7 Apr 01 · Posts: 172 · Credit: 23,823,824 · RAC: 0 · Belgium
Message 1903591 - Posted: 29 Nov 2017, 19:02:06 UTC

Not a word of explanation from Berkeley or the staff!
Of course they cannot always publish details, but even a brief word from the staff would help.
I am also not forgetting the time difference between Belgium and Berkeley (9 hours), which means it is already 10:00 AM there.
Soon, perhaps, we will miss "The" signal all of us are searching for...
Because of the 100 WU limitation (understandable), I will be out of WUs on all my computers in a few hours, and I ask for 10 + 10 days of work.

So, to conclude: it would be nice if the staff posted a situation report.
Of course, no reaction to my post is needed.
Best regards from Belgium.
ID: 1903591
David@home · Volunteer tester
Joined: 16 Jan 03 · Posts: 750 · Credit: 5,040,916 · RAC: 28 · United Kingdom
Message 1903594 - Posted: 29 Nov 2017, 19:04:53 UTC

Thanks all for the feedback,

1) More GPUs: alas, I cannot do that with my PC; its prime use is photo editing, and Photoshop cannot cope with multiple GPU cards.
2) Backup project: I have set up POGS with zero resource share. This seems to cure POGS's bad behavior of slowly filling the cache and squeezing out other projects.
3) Rescheduler: I must look into that. I always run out of CPU work, and now with my new graphics card I run out of GPU work as well during the maintenance slot.

But I have an idea. I am sure Eric was asking for more crunchers recently, as SETI needed more CPU power. Why not ask the SETI team to raise this 100-task limit so people can run at 100% through the maintenance slot? It seems odd to have the limit so low; increasing it a bit should be enough.
ID: 1903594
Keith Myers · Volunteer tester
Joined: 29 Apr 01 · Posts: 13157 · Credit: 1,160,866,277 · RAC: 1,873 · United States
Message 1903597 - Posted: 29 Nov 2017, 19:10:12 UTC - in response to Message 1903594.  

That suggestion is made a lot too. I believe the original reason it wasn't done is that the database couldn't handle the large increase in size if more tasks were out in the field. And considering our woes today are database related, it's probably a good idea to make no changes.
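
Keith's concern scales linearly with the per-host limit; a back-of-envelope sketch (every number below is an illustrative assumption, not an actual SETI@home figure):

```python
# Rough estimate of how the in-progress result table grows with the
# per-host task limit. All inputs are illustrative assumptions.
ACTIVE_HOSTS = 150_000      # assumed number of active hosts
ROW_BYTES = 500             # assumed average size of one result row

def in_progress_rows(limit_per_host: int, hosts: int = ACTIVE_HOSTS) -> int:
    """Result rows the database must track at any one moment."""
    return hosts * limit_per_host

for limit in (100, 200, 400):
    rows = in_progress_rows(limit)
    print(f"limit {limit:>3}: {rows:>12,} rows ~ {rows * ROW_BYTES / 1e9:.1f} GB")
```

Doubling the limit doubles the rows the server must track, for every host, whether or not that host ever needed the bigger cache.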
Seti@Home classic workunits:20,676 CPU time:74,226 hours

A proud member of the OFA (Old Farts Association)
ID: 1903597
juan BFP · Volunteer tester
Joined: 16 Mar 07 · Posts: 9786 · Credit: 572,710,851 · RAC: 3,799 · Panama
Message 1903601 - Posted: 29 Nov 2017, 19:24:52 UTC
Last modified: 29 Nov 2017, 19:25:13 UTC

IIRC, the 100-task limit was put in place to protect the database back when GPUs were a lot less powerful than today's and took about 20-30 minutes or more to crunch a WU.

At that time you could download a lot of WUs (up to your cache size; most of us used up to 10 days), and that really strained the database.

So 100 WUs last more than a day on most hosts. But now, with top GPUs like the 1080 Ti running the Linux builds crunching a WU in 5-6 minutes (or even less), they don't last even 5-6 hours.

We asked then to raise the limit to at least 200 WUs per GPU, but our request was never heard.
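
Juan's numbers check out arithmetically; a quick sketch (per-task times are the rough figures quoted above, and the concurrency parameter is an assumption for hosts running two tasks at once):

```python
# How long a fixed-size task cache lasts as GPUs get faster.
# Per-task times are the rough figures from the post above; 'concurrent'
# is an assumption for hosts running more than one task at a time.
def cache_hours(tasks: int, minutes_per_task: float, concurrent: int = 1) -> float:
    """Hours of work held in the cache."""
    return tasks * minutes_per_task / concurrent / 60

print(cache_hours(100, 25))     # older GPU, ~25 min/task: ~41.7 hours
print(cache_hours(100, 5))      # fast GPU on Linux builds: ~8.3 hours
print(cache_hours(100, 5, 2))   # two tasks at once: ~4.2 hours
```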
ID: 1903601
Richard Haselgrove · Volunteer tester
Joined: 4 Jul 99 · Posts: 14505 · Credit: 200,643,578 · RAC: 874 · United Kingdom
Message 1903604 - Posted: 29 Nov 2017, 19:39:06 UTC

And considering our woes today are database related, probably a good idea to make no changes.
Different database, but the point is well made.

I am sure Eric was asking for more crunchers recently as SETI needed more CPU power.
Given that under normal circumstances, SETI is providing work for 158/168 hours in any given week, more crunchers is certainly the answer: allowing existing crunchers to hoard more work puts a lot of strain on the servers for very little added production.
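
Richard's 158/168 figure implies roughly 10 hours of planned outage per week; whether a 100-task cache covers that depends entirely on crunch speed. A sketch using the ~5 min/task figure quoted earlier in the thread (one task at a time is an assumption):

```python
# Tasks needed to ride out the weekly scheduled outage, assuming one
# task at a time and the ~5 min/task figure quoted earlier in the thread.
outage_hours = 168 - 158            # ~10 h of planned downtime per week
minutes_per_task = 5                # assumed fast-GPU crunch time
tasks_needed = outage_hours * 60 / minutes_per_task
print(tasks_needed)                 # 120.0 - just over the 100-task cap
```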
ID: 1903604


©2022 University of California
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.