Panic Mode On (34) Server Problems

Miep
Volunteer moderator
Joined: 23 Jul 99
Posts: 2412
Credit: 351,996
RAC: 0
Message 1011636 - Posted: 4 Jul 2010, 12:56:18 UTC - in response to Message 1011612.  

Well, looks like roughly a 12k/h deficit.
I find those spikes in up/downloads every 4 hours interesting.

Saying nobody can fill his cache is unfair to some 140,000 users with a RAC of 500 or less, for whom 20 WUs would be enough for at least a 4-day cache.
That said, the 27,894 users with a RAC above 500 (thanks Robert Ribbeck, Scarecrow et al.) are way below capacity and requirement.
So, an estimated 80% of active users got their piece of cake and will be happy - how about starting to give second helpings of the remaining cake?
At least that cake doesn't rot...
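
A quick sanity check of that 4-day figure, as a sketch - the ~100 credits per WU average and the reading of RAC as credits-per-day are my assumptions, not numbers from the project:

    # Back-of-the-envelope: how many days of work does a 20-WU limit
    # represent for a host with a given RAC? Assumes an average of
    # ~100 credits per multibeam WU (varies with angle range) and that
    # RAC approximates credits earned per day.
    AVG_CREDITS_PER_WU = 100.0  # assumed average, not an official figure
    WU_LIMIT = 20               # the per-host limit discussed in this thread

    def cache_days(rac):
        # Days of work the limit represents for a host earning `rac` credits/day.
        return WU_LIMIT * AVG_CREDITS_PER_WU / rac

    print(cache_days(500))  # 4.0 -> about a 4-day cache at RAC 500
    print(cache_days(125))  # 16.0 -> even longer for slower hosts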

Seeing as it's Sunday, and Monday is a holiday in the US, it looks highly unlikely there will be enough work to go around if the outage is performed as scheduled.

Wonder if the electricity companies see a dip in power consumption?

I hope patience does not run out as fast as WUs... We'll need a lot more over the coming weeks.
Carola
-------
I'm multilingual - I can misunderstand people in several languages!
PP

Joined: 3 Jul 99
Posts: 42
Credit: 10,012,664
RAC: 0
Sweden
Message 1011643 - Posted: 4 Jul 2010, 13:05:03 UTC - in response to Message 1011636.  

For those interested - my AP-only cruncher has 37 tasks in progress right now, so the limit doesn't seem to apply to AP.
http://setiathome.berkeley.edu/results.php?hostid=5097356&offset=0&show_names=0&state=1
Helli_retiered
Volunteer tester
Joined: 15 Dec 99
Posts: 707
Credit: 108,785,585
RAC: 0
Germany
Message 1011644 - Posted: 4 Jul 2010, 13:05:55 UTC

Yup, but these 27,000 users need - I assume - 80% of all available workunits.

Helli
Bill Walker
Joined: 4 Sep 99
Posts: 3868
Credit: 2,697,267
RAC: 0
Canada
Message 1011648 - Posted: 4 Jul 2010, 13:19:41 UTC - in response to Message 1011540.  

No problems with downloads here. I turn in a WU, and I get one back right away.


The same with me. That is fine while the download servers are turned on, but at this rate I expect to enter the next 3-day outage with about 4 hours of cache. And that's after increasing my cache setting from 3 days to 5.

Question: the effect of all this on us users is obvious, but what is the effect on the project? Compared to a (recent) typical week of unplanned outages, is S@H processing more work, less work, or about the same? And how does the processing speed compare to the flow of new work from the telescope?

Miep
Volunteer moderator
Joined: 23 Jul 99
Posts: 2412
Credit: 351,996
RAC: 0
Message 1011653 - Posted: 4 Jul 2010, 13:35:26 UTC - in response to Message 1011644.  
Last modified: 4 Jul 2010, 13:48:09 UTC

Yup, but these 27,000 users need - I assume - 80% of all available workunits.

Helli


From the RAC/users chart with the daily credits added up, they need about 60%.
EDIT: or perhaps 52%, if the numbers are calculated differently *sigh* - see the sketch below.
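
Roughly how I'd set that estimate up - the average-RAC figures below are hypothetical placeholders, not values read off the actual chart, so only the method is meant here; plugging in different bucket totals is exactly what shifts the answer between ~60% and ~52%:

    # Share of total daily credit needed by the >500-RAC group.
    # The average-RAC values are hypothetical placeholders; use the
    # chart's real bucket totals to reproduce the 60% (or 52%) figure.
    users_low, avg_rac_low = 140_000, 150     # hypothetical average RAC, <=500 group
    users_high, avg_rac_high = 27_894, 1_200  # hypothetical average RAC, >500 group

    low_total = users_low * avg_rac_low       # total daily credit, low group
    high_total = users_high * avg_rac_high    # total daily credit, high group
    share = high_total / (low_total + high_total)
    print(f"{share:.0%}")  # ~61% with these placeholder averages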
Carola
-------
I'm multilingual - I can misunderstand people in several languages!
kittyman
Volunteer tester
Posts: 51477
Credit: 1,018,363,574
RAC: 1,004
United States
Message 1011702 - Posted: 4 Jul 2010, 16:05:07 UTC - in response to Message 1011648.  



Question: the effect of all this on us users is obvious, but what is the effect on the project? Compared to a (recent) typical week of unplanned outages, is S@H processing more work, less work, or about the same? And how does the processing speed compare to the flow of new work from the telescope?


It will probably take a few weeks of this new cycle, and the subsequent 'fine tuning', to assess the impact on the SETI workflow - both from folks like me who are going to stick around but may not be able to get enough work to keep crunching 24/7, and from those who choose to leave for their own personal reasons.
"Time is simply the mechanism that keeps everything from happening all at once."
