Message boards :
Number crunching :
Bottlenecks of SETI computations
Author | Message |
---|---|
Dan W Send message Joined: 25 Jun 09 Posts: 7 Credit: 73 RAC: 0 |
Hi all, I've just recently joined SETI@home, and I have a few questions regarding the number-crunching side of things and the long-term plans of SETI...

Since SETI is always collecting data, is the distributing project keeping up with all the data, or is it lagging behind in the hope of exponential CPU-power growth (multi-core etc.)? If the former, how big is the lag between when the data is initially collected and when users analyse it? Is data collected in June 2009 analysed by users in June 2009? If the latter, how fast is the lag growing? (Just rough approximations would be interesting.)

Ideally, how much more resolution would SETI like users to analyse at if CPU power were no barrier? Is there any chance of increasing the resolution at some point in the future? I don't see these questions answered on the main website, but it's always interesting to know the limits of computation, and how much more CPU power we really need for the project to reach saturation point with no more CPU bottlenecks. |
1mp0£173 Send message Joined: 3 Apr 99 Posts: 8423 Credit: 356,897 RAC: 0 |
> Hi all, ...

Until recently, SETI has been more or less keeping up with the incoming flow (with maybe a three- to six-month lag between recording and analysis). The flow of data isn't constant for a number of reasons, including problems (currently) with the data recorder, and maintenance at the telescope or the planetary radar. We're currently working on data recorded a while ago, from the archives.

It's also possible for the project to go back, do a more sensitive analysis, and recycle all that old data over again: there is a lot of older data that has been through the Multibeam search (for narrowband signals) but not Astropulse (wideband signals). It is also true that signals recorded last week may have spent decades (or centuries) getting to the telescope, so being a year or two behind isn't any big deal.

On the other side of the project, one goal is to demonstrate that BIG science can be done on a vanishingly small budget. SETI is often struggling under extra load caused by either "fast" work or outages. We're in one of those periods right now.

For many of us, it's no big deal. SETI (and BOINC) are supposed to harvest waste CPU cycles, and we have computers that are on anyway, so no worries. Others have built special-purpose machines just for SETI, and they find any interruption in the flow of work to be very upsetting.

So, welcome. You came at an interesting time. Have a seat, buckle in, put on your helmet and enjoy the ride. I am. |
Dan W Send message Joined: 25 Jun 09 Posts: 7 Credit: 73 RAC: 0 |
Cheers Ned for the informative reply! Assuming a 'keeping up with the data' scenario, do you know roughly how much extra CPU power would be needed to analyze the data at an ideal resolution? Would, say, 1000x let us achieve everything we would want? Or more or less than that, maybe? |
Josef W. Segur Send message Joined: 30 Oct 99 Posts: 4504 Credit: 1,414,761 RAC: 0 |
> Cheers Ned for the informative reply!

Leaving aside practicalities, the kinds of analysis we're doing on the data could be improved perhaps 4 to 10x given additional compute resources. Beyond that, it would make sense to consider different kinds of analysis rather than idealizing what we have.

Considering practicalities, the limiting factors are getting the data back from the telescope and getting it out to participants' computers.

The true ideal from my point of view? Hmm, the ALFA array on the Arecibo telescope sees about 1 part in 3 million of the sky at any instant, the array is only in use 25 to 30 percent of the time, S@H records less than 1% of the bandwidth of the ALFA system (which is only a small fraction of the electromagnetic spectrum anyhow), etc. A major increase in any of those limitations would be very welcome to me.

I don't seem to be able to really conceive an 'ideal' limit, though it would probably be possible to think along the lines of what could happen if every country in the world committed its 'defense' budget to SETI rather than buying weapons...
Joe |
HAL9000 Send message Joined: 11 Sep 99 Posts: 6534 Credit: 196,805,888 RAC: 57 |
> Cheers Ned for the informative reply!

Welcome to the wonders of making your computer look for aliens. I think how much CPU power SETI ideally needs is located here. The whole list, if you please. However, they do not have the budget for that, so we kindly let them use our idle computers to make a giant global supercomputer.

If you visit the Server Status page here, you can see the current status of the tape splitters. Looks like some Aug. '08 tapes are in there at the moment.

SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the [url=http://tinyurl.com/8y46zvu]BP6/VP6 User Group[/url] |
John McLeod VII Send message Joined: 15 Jul 99 Posts: 24806 Credit: 790,712 RAC: 0 |
> Cheers Ned for the informative reply!

And the entire sweep available to the Arecibo telescope is only a few percent of the total sky.
BOINC WIKI |
Josef W. Segur Send message Joined: 30 Oct 99 Posts: 4504 Credit: 1,414,761 RAC: 0 |
> ...

About 32%, but that's hard to see on the usual cylindrical projection sky map.
Joe |
1mp0£173 Send message Joined: 3 Apr 99 Posts: 8423 Credit: 356,897 RAC: 0 |
> Cheers Ned for the informative reply! ...

... or how about the money wasted by the typical marketing department at any of the Fortune 1000 corporations trying to convince their customers to be happy, instead of actually doing things to make their customers happy?

Sorry, but I've been trying to convince a major bank to give me the local number for their branch, and the best I've been able to get is one with a similar name 400 miles away. They'd make the gang at the Sirius Cybernetics Corporation proud. |
Raistmer Send message Joined: 16 Jun 01 Posts: 6325 Credit: 106,370,077 RAC: 121 |
> On the other side of the project, one goal is to demonstrate that BIG science can be done on a vanishingly small budget.

I hope the bureaucracy doesn't take note of this goal, or this approach (a small budget for big science) will be expected in other areas too ;) |
1mp0£173 Send message Joined: 3 Apr 99 Posts: 8423 Credit: 356,897 RAC: 0 |
> On the other side of the project, one goal is to demonstrate that BIG science can be done on a vanishingly small budget.

The alternative is worthwhile science that is not done because no one will fund it. Like so many things in life, this is a trade-off. We either have to accept less being done for more money, or we have to make "more for less" possible.

BOINC means that an individual can actually "fund" a project out of his own pocket. A small project can be run on servers that happen to be on hand, using an inexpensive DSL line. IIRC, HashClash was run by a graduate student, with no outside funding. |
Dan W Send message Joined: 25 Jun 09 Posts: 7 Credit: 73 RAC: 0 |
Wow, thanks all for the input!
So if I were to make the calculation 10 * 3,000,000 * 3 * 100 * 50 = 450,000,000,000... (The "50" is my guess at the figure for the "small fraction of the electromagnetic spectrum" - is it about right? Also, the "10" at the beginning is from your first paragraph; perhaps that needs increasing by an order of magnitude for 'as-good-as-we-can-get' analysis?)

So, assuming our telescope records all possible sky all the time, am I right in saying that gargantuan number (450 billion) gives us a very approximate idea of the ideal increase in CPU speed we would need to exhaustively process (and keep up with) all possible data? :) |
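For what it's worth, the product above can be checked in a couple of lines. This is just a sanity check of the arithmetic in the thread; the variable names are mine, and the 50x spectrum factor is the guess from the post above, not a project figure:

```python
# Rough multiplier on today's CPU power needed to analyse "everything",
# using the limiting factors quoted earlier in the thread.
analysis_depth = 10        # deeper analysis of the same data (upper end of 4-10x)
sky_coverage = 3_000_000   # ALFA sees ~1 part in 3 million of the sky
duty_cycle = 3             # the array is in use only ~25-30% of the time
bandwidth = 100            # S@H records less than 1% of ALFA's bandwidth
spectrum = 50              # guessed fraction of the wider EM spectrum (Dan's 50)

total = analysis_depth * sky_coverage * duty_cycle * bandwidth * spectrum
print(f"{total:,}")  # 450,000,000,000
```

So the 450 billion figure does follow from those factors, for whatever the individual guesses are worth.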
Ianab Send message Joined: 11 Jun 08 Posts: 732 Credit: 20,635,586 RAC: 5 |
I'm not sure of your exact numbers, but enough CPU power to scan the whole sky continuously would be a BIG number - more than all the processors in the world, I would guess.

Of course, you could scan the whole sky at a lower resolution, and that's done to look for BIG 'noisy' things like supernovae, etc. What SETI is looking for is low-level signals that could be coming from an intelligent source. That signal would be mixed up with all the natural background radio noise, hence the need for detailed analysis to find any signal that might be there.

Looking that closely at small portions of the sky is all that it's practical to do right now. Looking at the whole sky, you don't have enough sensitivity to find what we are looking for even if it was captured. |
Dan W Send message Joined: 25 Jun 09 Posts: 7 Credit: 73 RAC: 0 |
From this page, SETI analyzes between 1418.75 MHz and 1421.25 MHz. I wonder how low and how high it could go to cover all bases. There's also the question of whether higher or lower frequencies would be quicker or slower to inspect, all else being equal. |
1mp0£173 Send message Joined: 3 Apr 99 Posts: 8423 Credit: 356,897 RAC: 0 |
> From this page, SETI analyzes between 1418.75 MHz and 1421.25 MHz. I wonder how low and how high it could go to cover all bases.

This is called the "water hole" -- it is a quiet area of spectrum between the hydrogen line and the hydroxyl line. One story says this was chosen because everyone "gathers around the water hole", although I suspect the truth is that it's a quiet spot, and as good a place as any.

The radio spectrum goes from DC to light -- something below 10 kHz (0.01 MHz) to over 300 GHz (300,000 MHz). Lower frequencies carry less data, and should take less time to analyze.

Edit: It isn't practical to design a single receiving system that covers 0.01 MHz to 300,000 MHz. |
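The "less data" point follows from the sampling theorem: the amount of data you must record scales with the bandwidth captured, not with the centre frequency. A minimal sketch (illustrative numbers only, not S@H's actual recording format):

```python
# Nyquist: capturing a band of width B requires roughly B complex
# samples per second, so a narrower slice means less data to crunch.
def complex_samples_per_sec(bandwidth_mhz: float) -> float:
    return bandwidth_mhz * 1e6

print(complex_samples_per_sec(2.5))   # the 2.5 MHz water-hole band: 2.5 million/s
print(complex_samples_per_sec(0.01))  # a 10 kHz slice: 10 thousand/s
```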
Dan W Send message Joined: 25 Jun 09 Posts: 7 Credit: 73 RAC: 0 |
I see. So hypothetically (given unlimited telescope resources), would it make more sense to analyze in fixed linear steps or in logarithmic steps? So either:

0.01 MHz ... 2.51 ... 5.02 ... ... 299,995 ... 299,997.5 ... 300,000 MHz (the 2.5 comes from 1421.25 - 1418.75)

or, logarithmic style:

0.01 MHz ... 0.010017 ... 0.010035 ... ... 298,982 ... 299,490 ... 300,000 MHz (the factor of 1.0017 comes from 1421.25 / 1418.75)

From what you said, the former seems to make more sense. Anyway, if it's the former, then it's going to be around 120,000 times slower (the latter is more like 10,000x slower). Would that sound about right to exhaustively analyze the entire useful spectrum? That brings our giant number up to 5*10^16.

Well, that *is* big, but I'm fairly optimistic Moore's law will be sparked off again (if only because so many people would like to see fully ray-traced, globally-illuminated graphics in games, including proper physics with billions of particles). What might help reduce it somewhat further is if we could get telescopes on the far side of the moon. I'm guessing that the lack of atmosphere there may ease the computational algorithms...(?)

******POST EDITED FOR NUMBERS****** |
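Both step counts above can be verified directly; this is just the arithmetic from the post, with the band edges and 2.5 MHz chunk size taken from the thread:

```python
import math

f_low, f_high = 0.01, 300_000.0    # MHz: roughly 10 kHz up to 300 GHz
band = 1421.25 - 1418.75           # 2.5 MHz, the current S@H band

# Linear stepping: fixed 2.5 MHz chunks across the whole spectrum.
linear_steps = (f_high - f_low) / band
print(round(linear_steps))         # 120000

# Logarithmic stepping: each chunk is a fixed ratio wider than the last.
ratio = 1421.25 / 1418.75
log_steps = math.log(f_high / f_low) / math.log(ratio)
print(round(log_steps))            # roughly 9,800, i.e. the ~10,000x figure
```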
Richard Haselgrove Send message Joined: 4 Jul 99 Posts: 14679 Credit: 200,643,578 RAC: 874 |
> What might help reduce it somewhat further is if we could get telescopes on the far side of the moon. I'm guessing that the lack of atmosphere there may ease the computational algorithms...(?)

Wouldn't help the computational algorithms, but it would sure help to block the interference from all those cellphones and "I Love Lucy". |
ML1 Send message Joined: 25 Nov 01 Posts: 21235 Credit: 7,508,002 RAC: 20 |
> The radio spectrum goes from DC to light -- something below 10 kHz (0.01 MHz) to over 300 GHz (300,000 MHz).

What sort of sensitivities could be achieved down at the very low frequencies? Is there less interstellar background noise, so that a signal can be detected at a greater range? Or could we achieve a greater detection range due to easier/better electronics and analysis/processing?

Keep searchin',
Martin

See new freedom: Mageia Linux Take a look for yourself: Linux Format The Future is what We all make IT (GPLv3) |
DJStarfox Send message Joined: 23 May 01 Posts: 1066 Credit: 1,226,053 RAC: 2 |
> What might help reduce it somewhat further is if we could get telescopes on the far side of the moon. I'm guessing that the lack of atmosphere there may ease the computational algorithms...(?)

True, but you'd just have to build it to be gamma-ray-burst and solar-flare resistant. :) |
Richard Haselgrove Send message Joined: 4 Jul 99 Posts: 14679 Credit: 200,643,578 RAC: 874 |
> What might help reduce it somewhat further is if we could get telescopes on the far side of the moon. ...

What happens if ET aims a gamma ray burst at it - do you get the detection, or harden against it? :-) |
1mp0£173 Send message Joined: 3 Apr 99 Posts: 8423 Credit: 356,897 RAC: 0 |
> The radio spectrum goes from DC to light -- something below 10 kHz (0.01 MHz) to over 300 GHz (300,000 MHz).

The main use of very low frequency transmissions has been communication with ballistic missile submarines -- the sub can simply trail a really long wire for the antenna (lower frequency, longer wavelength, bigger antenna), and VLF will penetrate water. The signalling speed is very, very slow: barely enough to tell a sub on patrol to come up to periscope depth and use SATCOM, which is quite fast.

P.S. The other advantage of lower frequencies is that they tend to follow the ground instead of just shooting off into space. A bit higher (3 to 30 MHz or so), signals tend to go up and, with any luck, be bent back by the ionosphere; that's how international broadcasting and amateur radio operators can cover the whole world. Above 30 MHz, signals tend to go in a straight line, unless they hit something opaque or highly absorbent on that frequency. |
©2024 University of California
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.