Message boards : News : SETI@home hibernation
Grant (SSSF) Joined: 19 Aug 99 Posts: 13855 Credit: 208,696,464 RAC: 304
> Thanks Grant. I just picked up 3 WU's from Rosetta so I'll keep an eye on what happens. Much appreciated.
Keep an eye on your BOINC Manager Event log. As well as roughly 1.3GB RAM per Task being run (although the present ones are only using between 300-750MB), it also needs plenty of HDD space, and if it doesn't have enough you'll have messages in the Event log telling you how much more is needed. I've got 12 Tasks running, with just over 11GB of HDD space in use.
Grant
Darwin NT
Tom M Joined: 28 Nov 02 Posts: 5126 Credit: 276,046,078 RAC: 462
> Thanks Grant. I just picked up 3 WU's from Rosetta so I'll keep an eye on what happens. Much appreciated.
> Keep an eye on your BOINC Manager Event log. As well as roughly 1.3GB RAM per Task being run (although the present ones are only using between 300-750MB), it also needs plenty of HDD space, and if it doesn't have enough you'll have messages in the Event log telling you how much more is needed. I've got 12 Tasks running, with just over 11GB of HDD space in use.
I am beginning to suspect that the giant RAM requirement per task has been changed. Just after I got my RAM upgrade installed, darn it. :)
Tom
A proud member of the OFA (Old Farts Association).
Grant (SSSF) Joined: 19 Aug 99 Posts: 13855 Credit: 208,696,464 RAC: 304
> I am beginning to suspect that the giant RAM requirement per task has been changed.
It depends on the Task. Some need it, some don't. The current batch don't, the next ones may or may not.
Grant
Darwin NT
Keith Myers Joined: 29 Apr 01 Posts: 13164 Credit: 1,160,866,277 RAC: 1,873
> Thanks Grant. I just picked up 3 WU's from Rosetta so I'll keep an eye on what happens. Much appreciated.
> Keep an eye on your BOINC Manager Event log. As well as roughly 1.3GB RAM per Task being run (although the present ones are only using between 300-750MB), it also needs plenty of HDD space, and if it doesn't have enough you'll have messages in the Event log telling you how much more is needed. I've got 12 Tasks running, with just over 11GB of HDD space in use.
I agree, it seems that happened to me also. Added 16GB and was using over 29GB of RAM on CPU tasks, and then the tasks changed and now I am only using 50% of my 32GB of RAM. Oh well, it should be covered whenever the mix of work changes again.
Seti@Home classic workunits: 20,676 CPU time: 74,226 hours
A proud member of the OFA (Old Farts Association)
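The figures quoted in this exchange translate into a quick planning calculation: roughly 1.3GB of RAM per task, and, going by 12 tasks using just over 11GB, roughly 1GB of disk per task. The sketch below is only an illustration of that arithmetic; the per-task numbers vary from batch to batch, and the function name is made up for the example.

```python
# Rough capacity check, using the figures mentioned in this thread:
# ~1.3 GB of RAM per task and ~1 GB of disk per task (12 tasks -> ~11 GB).
# Actual Rosetta batches vary, so treat the result as a ballpark only.

def max_concurrent_tasks(free_ram_gb, free_disk_gb,
                         ram_per_task_gb=1.3, disk_per_task_gb=1.0):
    """How many tasks fit in the given RAM and disk budgets."""
    by_ram = int(free_ram_gb // ram_per_task_gb)
    by_disk = int(free_disk_gb // disk_per_task_gb)
    return min(by_ram, by_disk)

# Example: 16 GB of RAM left for BOINC and 20 GB of spare disk.
print(max_concurrent_tasks(16, 20))  # -> 12, limited by RAM in this case
```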
tullio Joined: 9 Apr 04 Posts: 8797 Credit: 2,930,782 RAC: 1
My son has sent me a photo of a page of "Le Scienze", the Italian edition of Scientific American, which covers SETI@home accurately. I imagine there is a similar article in "Scientific American", but I don't have access to it.
Tullio
Gary Charpentier Joined: 25 Dec 00 Posts: 31012 Credit: 53,134,872 RAC: 32
> All depends upon Nebula now to give us second sift work to do.
None for us from Nebula because of its nature. To do Nebula work you have to have a copy of the entire science database. What's that now, 24 multi-TB drives? Don't think that is going through a consumer internet connection. Let it go and be happy you got to see the server closet.
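To put the "not through a consumer internet connection" point in rough numbers: only the 24-drive count comes from the post above; the per-drive capacity and line speed below are assumptions made purely for illustration, and the result is a back-of-the-envelope estimate, not a real transfer plan.

```python
# Why the full science database won't travel over a home connection.
# 24 drives is from the post above; the per-drive size and line speed
# below are illustrative assumptions.
drives = 24
tb_per_drive = 8            # assumed capacity of each drive, in TB
line_mbit_per_s = 100       # assumed consumer connection speed

total_bits = drives * tb_per_drive * 1e12 * 8
seconds = total_bits / (line_mbit_per_s * 1e6)
print(f"{seconds / 86400:.0f} days of continuous transfer")  # roughly 178 days
```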
William R. Clune Joined: 14 May 99 Posts: 48 Credit: 691,775,094 RAC: 943
All I can say is UGH...
Hugo A. Durantini Luca Joined: 24 Sep 09 Posts: 14 Credit: 7,253,842 RAC: 10
Processing one of my last pending WUs and maybe falling a bit short of reaching 7,250,000 credits, but happy to see the project moving to a new phase, and of course I will be paying attention to any news. In the meanwhile, Folding@home and Rosetta@home seem like appropriate projects given the current times. Cheers, people!
Byron Leigh Hatch @ team Carl Sagan Joined: 5 Jul 99 Posts: 4548 Credit: 35,667,570 RAC: 4
Thank you SETI@home, for the last 21 years I have been crunching. Future generations of humans will continue the search :)
The probability of success is difficult to estimate, but if we never search, the chance of success is zero. - Cocconi-Morrison, Searching for Interstellar Communications, Nature, September 19, 1959
SETI@home is the scientific experiment that most excites the imagination of people worldwide.
The task of science is not to open a door to endless knowledge, but to set a barrier to endless ignorance. - Galileo Galilei by Bertolt Brecht
I admire and greatly respect the scientists at SETI@home - Berkeley, University of California, USA.
I'm a dreamer, but I'm not the only one - John Lennon
... Byron.
Al Joined: 3 Apr 99 Posts: 1682 Credit: 477,343,364 RAC: 482
Dang, haven't been paying close attn over the last couple months, and just took a look at my machines that I had set up as set and forget, as it seemed a bit cooler than normal in there, and noticed no tasks on any of them. I knew that didn't portend good things, so came here and found this thread. As others have said, I have a number of emotions right now, but overall glad that I had the chance to be a part of this for 21 years, it has been a damn good run if I say so myself.
> All depends upon Nebula now to give us second sift work to do.
> None for us from Nebula because of its nature. To do Nebula work you have to have a copy of the entire science database. What's that now, 24 multi-TB drives? Don't think that is going through a consumer internet connection. Let it go and be happy you got to see the server closet.
And yep, I am Very happy that I had the chance to see the server closet as well, it was a very memorable trip out there a couple years ago, thanks for having me. And thanks to the entire team for all the hard work and dedication over all these years, you have been amazing, and I am hoping to someday process more work here, once you've had a chance to analyze the huge pile that you currently have. Well, time to go back and shut down the systems for the last time. Sad stuff, but good memories...
Jeff Joined: 5 Jul 99 Posts: 2 Credit: 1,022,244 RAC: 3
Well, last of mine beamed up. Wanders off to see if there are projects for today's issues first.
Tex1954 Joined: 16 Mar 11 Posts: 12 Credit: 6,654,193 RAC: 17
> And why does "analyzing the back end" of the results require the shutting down of incoming data?
> Money. There isn't enough of it to analyse the data already gathered, and keep sending out more to add to what's already been done.
Send a letter to Bill Gates!!!!!! I'm positive he could fund you for a Nebula upgrade and keep SETI going for another 10 with his pocket change! 8-)
mohavewolfpup Joined: 20 Oct 18 Posts: 32 Credit: 3,666,574 RAC: 24
Is the server closet where the magic first started?
Historian for the Defunct Riviera Hotel and Casino, Former Classic Seti@home user for Team Art Bell. Greetings from the High Desert!
Terrh Joined: 1 Jul 00 Posts: 2 Credit: 599,515 RAC: 30
So is this the "farewell" thread? I may not have a high "credit" score but that's because much of my CPU time is from the days when computers were much slower, and recently because I've been doing folding instead of seti. But I loved this project, started when I was just a teenager. Same reason why I have a ridiculous username. Only thing I did before this was the distributed.net SHA challenges back in the mid 90's when I was just a kid. I always loved this project, I'm sad they've run out of work for us to do. More telescope time is needed, so we can keep listening... someone is out there, we just have to find them.
Tom M Joined: 28 Nov 02 Posts: 5126 Credit: 276,046,078 RAC: 462
> So is this the "farewell" thread?
It is one of several "farewell" threads.
Tom M
A proud member of the OFA (Old Farts Association).
ML1 Joined: 25 Nov 01 Posts: 21229 Credit: 7,508,002 RAC: 20
> Is the server closet where the magic first started?
This is where it all began: Before the "Big Reorg" - May 12, 2002
A few years later, s@h was hosted on: Another Look at the Servers - December 22, 2008
Since then, the project outgrew the data bandwidth available to the lab, and then also outgrew the server closet itself. The main servers are now physically in a datacentre somewhere 'down the hill'.
Anyone know of any more recent pictures online?
Keep searchin',
Martin
See new freedom: Mageia Linux
Take a look for yourself: Linux Format
The Future is what We all make IT (GPLv3)
rob smith Joined: 7 Mar 03 Posts: 22535 Credit: 416,307,556 RAC: 380
At the time of the move to the CoLo centre a lot of folks asked for pictures and we were very politely informed that it was highly unlikely that any would be allowed - I think Richard was one of the more recent Setizens to visit, and from his description of the visit security is pretty tight (I'm not sure if they did an "all body cavities" search, but.....)
Bob Smith
Member of Seti PIPPS (Pluto is a Planet Protest Society)
Somewhere in the (un)known Universe?
Ju4n1t0 Joined: 22 Nov 09 Posts: 1 Credit: 2,660,419 RAC: 0
:(
Richard Haselgrove Joined: 4 Jul 99 Posts: 14679 Credit: 200,643,578 RAC: 874
> At the time of the move to the CoLo centre a lot of folks asked for pictures and we were very politely informed that it was highly unlikely that any would be allowed - I think Richard was one of the more recent Setizens to visit, and from his description of the visit security is pretty tight (I'm not sure if they did an "all body cavities" search, but.....)
No, they didn't go that far. But I had to be introduced and signed up by somebody 'on the list' (Jeff Cobb was able to do that), wear a name badge, and be escorted. Eric let me take a camera in, on the strict understanding that I only took pictures of SETI's kit.
This is one which I've posted before: Eric working on an upgrade to muarae2 - a twin to the server which drives these web pages. We were adding a PCIe card holding 4 M.2 HDDs, an extra 2 TB of storage if I remember right. We pulled the server out of the rack this side of the open door: behind the door you can see the 'crash cart' plugged into an access point to provide remote control of the server we needed to power down before removal.
A better view of the 'crash cart' - I think Eric had decided he needed to connect directly by this point - remote access wasn't quite giving him what he needed.
Eric replacing a failed drive in one of the storage arrays.
The general 'mixed server' rack from which we removed muarae2 for the upgrade. Primarily a storage rack, including 'amigos' - the user-funded data storage server intended to hold Parkes data (next to bottom). The upper part of the rack holds Breakthrough Listen's storage arrays.
And for old times' sake - the original closet in the Space Science Lab, where it all started - pictured on the same day, 19 July 2019.
Ian&Steve C. Joined: 28 Sep 99 Posts: 4267 Credit: 1,282,604,591 RAC: 6,640
I know you said they had issues with that server. Do you know what issues specifically? It wasn't related to seeing all of the M.2 drives, was it? To use that PCIe card, the server must support PCIe bifurcation, and you have to set the slot it's in to x4x4x4x4 for it to work. I can't tell how old that server is; it looks like a Dell or HP 1U system of some sort.
Seti@Home classic workunits: 29,492 CPU time: 134,419 hours
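For readers wondering how such a card misbehaves: on a passive four-slot M.2 carrier, a slot left at plain x16 instead of x4x4x4x4 usually means only the first drive enumerates. Assuming the drives are NVMe and the host runs Linux (neither is confirmed anywhere in this thread), one quick check is simply to count the controllers the kernel sees; the snippet below is a generic illustration of that idea, not anything from the actual server work described above.

```python
# Minimal Linux-side sanity check, assuming NVMe M.2 drives on the carrier:
# without bifurcation set to x4x4x4x4, typically only the first drive shows
# up, so counting visible controllers separates config issues from dead drives.
import os

def visible_nvme_controllers():
    """Names of the NVMe controllers the Linux kernel currently sees."""
    path = "/sys/class/nvme"
    return sorted(os.listdir(path)) if os.path.isdir(path) else []

controllers = visible_nvme_controllers()
print(f"{len(controllers)} NVMe controller(s) visible: {controllers}")
# Expect four entries from the carrier card (plus any onboard drives);
# fewer often points at bifurcation settings rather than failed hardware.
```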