Decided to Bail Out on SETI for a Short While

Message boards : Number crunching : Decided to Bail Out on SETI for a Short While

kittyman (Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor)
Volunteer tester
Joined: 9 Jul 00
Posts: 51468
Credit: 1,018,363,574
RAC: 1,004
United States
Message 1791254 - Posted: 28 May 2016, 13:29:11 UTC - in response to Message 1791253.  
Last modified: 28 May 2016, 13:32:38 UTC

Let's hope that the scientists - that means astronomers, mathematicians, and sociologists with experience of distributed computing - are allowed to get on with the prototyping at their own speed and with their own goals in mind, then.

We don't want more releases like v8 to a timetable set by financiers and marketing, which turned out not to be a timetable after all (because the same finance/marketing machine held back the data until the financier's name-day).

Thank you for saying that, Richard.
I am too much of a bandwagon banner flying kinda guy to tell the truth sometimes.
I had some communication with Eric about the release of v8 before the apps were ready.
Kinda angry about it, actually. Damned angry about it, actually.
Eric was angry about it as well, not just me.
And, since I am one of the higher contributors to the project, he took my anger very seriously.

I have dedicated years of my life and countless dollars to this search, and to have that slap in the face was rather rude.
"Freedom is just Chaos, with better lighting." Alan Dean Foster

ID: 1791254
Richard Haselgrove Project Donor
Volunteer tester
Joined: 4 Jul 99
Posts: 14650
Credit: 200,643,578
RAC: 874
United Kingdom
Message 1791276 - Posted: 28 May 2016, 14:30:28 UTC - in response to Message 1791254.  

Let's all go on a nice relaxing cruise for the duration, shall we?
ID: 1791276
Profile Zalster Special Project $250 donor
Volunteer tester
Joined: 27 May 99
Posts: 5517
Credit: 528,817,460
RAC: 242
United States
Message 1791282 - Posted: 28 May 2016, 14:53:05 UTC - in response to Message 1791276.  

Until they can keep people from getting norovirus, I prefer to stay off cruise ships.

No matter how short the trip, lol...

But vacations do sound like a good idea, just land based.
ID: 1791282
Profile Raistmer
Volunteer developer
Volunteer tester
Joined: 16 Jun 01
Posts: 6325
Credit: 106,370,077
RAC: 121
Russia
Message 1791284 - Posted: 28 May 2016, 14:57:36 UTC - in response to Message 1791244.  
Last modified: 28 May 2016, 14:58:03 UTC


Einstein's Gravity Wave search uses much larger datapaks, and searches a much larger parameter space - the first confirmed gravity wave detection was at a distance of 1.3 billion light years, with all the de-dispersion range that implies (although the actual detection was done in-house, not via the BOINC application).

What de-dispersion? It's gravity, not electromagnetics.
Einstein@Home simply doesn't look at such frequencies, or for transient signals, at all.
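For context on the term being disputed here: de-dispersion applies to electromagnetic signals, where a broadband radio pulse is delayed more at lower frequencies by free electrons along the line of sight, and a search must trial many dispersion measures (DMs) to undo that smearing. A minimal sketch of the standard cold-plasma delay formula (the ~4.149 ms constant and the example band are textbook radio-astronomy values, not SETI@home code):

```python
def dispersive_delay_ms(dm: float, f_lo_ghz: float, f_hi_ghz: float) -> float:
    """Extra arrival delay (ms) of the low-frequency band edge vs the high one.

    dm is the dispersion measure in pc cm^-3; frequencies are in GHz.
    """
    return 4.149 * dm * (f_lo_ghz ** -2 - f_hi_ghz ** -2)

# Example: DM = 50 pc cm^-3 across a 1.2-1.5 GHz (roughly L-band) observation.
print(f"{dispersive_delay_ms(50.0, 1.2, 1.5):.1f} ms")  # ~51.9 ms of smearing
```

The larger the DM range a search covers, the more trial de-dispersions must be computed on the same data, which is one reason a deeper EM search costs more compute. Gravitational-wave searches face a different parameter space entirely, which is the point being made above.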
ID: 1791284
kittyman (Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor)
Volunteer tester
Joined: 9 Jul 00
Posts: 51468
Credit: 1,018,363,574
RAC: 1,004
United States
Message 1791285 - Posted: 28 May 2016, 15:01:49 UTC - in response to Message 1791284.  


Einstein's Gravity Wave search uses much larger datapaks, and searches a much larger parameter space - the first confirmed gravity wave detection was at a distance of 1.3 billion light years, with all the de-dispersion range that implies (although the actual detection was done in-house, not via the BOINC application).

What de-dispersion? It's gravity, not electromagnetics.
Einstein@Home simply doesn't look at such frequencies, or for transient signals, at all.

And Kittes@home just look at mices...........
My kitties just keep on looking at the stars.
"Freedom is just Chaos, with better lighting." Alan Dean Foster

ID: 1791285
Richard Haselgrove Project Donor
Volunteer tester
Joined: 4 Jul 99
Posts: 14650
Credit: 200,643,578
RAC: 874
United Kingdom
Message 1791287 - Posted: 28 May 2016, 15:06:18 UTC - in response to Message 1791284.  

So, can you explain better what a 6-month GBT task would look like (download data size, perhaps), what mathematics would be performed, and why it can't be broken down into smaller sub-sections? That's asked out of genuine curiosity.
ID: 1791287
Profile Raistmer
Volunteer developer
Volunteer tester
Joined: 16 Jun 01
Posts: 6325
Credit: 106,370,077
RAC: 121
Russia
Message 1791288 - Posted: 28 May 2016, 15:09:12 UTC - in response to Message 1791253.  

Let's hope that the scientists - that means astronomers, mathematicians, and sociologists with experience of distributed computing - are allowed to get on with the prototyping at their own speed and with their own goals in mind, then.

We don't want more releases like v8 to a timetable set by financiers and marketing, which turned out not to be a timetable after all (because the same finance/marketing machine held back the data until the financier's name-day).


The timetable was met perfectly, because by the time the data were released we had a working app on main and fully functional apps ready for release on beta.
Having the data without apps would have been much worse.
Nothing was actually done "in a hurry" for 12 April. Everything was done and tested well.
The single side effect is that a v9 will be needed to introduce the new processing types. But even that is actually a good thing: it's not wise to introduce two big changes simultaneously. So we have the new data sources first and will have the new processing algorithms later.
ID: 1791288
kittyman (Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor)
Volunteer tester
Joined: 9 Jul 00
Posts: 51468
Credit: 1,018,363,574
RAC: 1,004
United States
Message 1791291 - Posted: 28 May 2016, 15:14:01 UTC

Yikes...v9???

The kitties have not even recovered from v8 yet......................

LOL...bring it.

Meow.
"Freedom is just Chaos, with better lighting." Alan Dean Foster

ID: 1791291
Profile Raistmer
Volunteer developer
Volunteer tester
Joined: 16 Jun 01
Posts: 6325
Credit: 106,370,077
RAC: 121
Russia
Message 1791293 - Posted: 28 May 2016, 15:20:04 UTC - in response to Message 1791287.  

So, can you explain better what a 6-month GBT task would look like (download data size, perhaps), what mathematics would be performed, and why it can't be broken down into smaller sub-sections? That's asked out of genuine curiosity.

If I were the project scientist, perhaps. I will know the design decisions once they are formed. Common sense says that to improve on the Arecibo data, longer time periods can be used for signal accumulation, so longer initial arrays for the FFA: something like the current large FFA, or bigger. A targeted search in general allows a longer time domain for signal accumulation, which results in a better signal-to-noise ratio and hence increased sensitivity on the same hardware.
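The signal-accumulation point above can be illustrated numerically: folding (averaging) N noisy repetitions of a weak periodic pulse grows the signal linearly while the noise in the average falls as 1/sqrt(N), so sensitivity improves roughly as sqrt(N). A toy sketch with illustrative numbers, not actual SETI@home parameters:

```python
import numpy as np

PERIOD = 128        # samples per pulse period
AMPLITUDE = 0.2     # weak pulse, well below the noise floor (sigma = 1)

def folded_snr(n_periods: int) -> float:
    """Fold n_periods noisy copies of a single-bin pulse; return peak SNR."""
    rng = np.random.default_rng(n_periods)     # fixed seed for repeatability
    signal = np.zeros(PERIOD)
    signal[10] = AMPLITUDE                     # the pulse lives in one bin
    data = signal + rng.normal(0.0, 1.0, size=(n_periods, PERIOD))
    profile = data.mean(axis=0)                # coherent fold at the true period
    noise = profile[20:].std()                 # off-pulse bins estimate the noise
    return profile[10] / noise

for n in (16, 256, 4096):
    print(f"{n:5d} periods folded -> SNR ~ {folded_snr(n):.1f}")
# Expect the SNR to grow roughly as sqrt(n): 256x more data, ~16x sensitivity.
```

This is why longer accumulation (and hence longer tasks) buys sensitivity on the same hardware: the gain comes from integration time, not faster arithmetic.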
ID: 1791293
kittyman (Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor)
Volunteer tester
Joined: 9 Jul 00
Posts: 51468
Credit: 1,018,363,574
RAC: 1,004
United States
Message 1791295 - Posted: 28 May 2016, 15:25:14 UTC - in response to Message 1791293.  
Last modified: 28 May 2016, 15:27:57 UTC

So, can you explain better what a 6-month GBT task would look like (download data size, perhaps), what mathematics would be performed, and why it can't be broken down into smaller sub-sections? That's asked out of genuine curiosity.

If I were the project scientist, perhaps. I will know the design decisions once they are formed. Common sense says that to improve on the Arecibo data, longer time periods can be used for signal accumulation, so longer initial arrays for the FFA: something like the current large FFA, or bigger. A targeted search in general allows a longer time domain for signal accumulation, which results in a better signal-to-noise ratio and hence increased sensitivity on the same hardware.

If longer tasks increase our chance of finding our needle in the cosmic haystack, I am all for it.
I do not care if the apparent result is me crunching one WU per day.

If that one WU is more productive than tossing ten a minute out the window, so be it.

I have long ago said that I luv my credits, but that is NOT why I am here.

I am here for the freaking SCIENCE of it, and that is what attracted me to the project in the first place, and that is what keeps me here.

And the fact that things are progressing here, rather than the same old...
Makes me even more determined to participate in this amazing project.

Meow!
"Freedom is just Chaos, with better lighting." Alan Dean Foster

ID: 1791295
Profile jason_gee
Volunteer developer
Volunteer tester
Joined: 24 Nov 06
Posts: 7489
Credit: 91,093,184
RAC: 0
Australia
Message 1791305 - Posted: 28 May 2016, 15:53:47 UTC - in response to Message 1791287.  

So, can you explain better what a 6-month GBT task would look like (download data size, perhaps), what mathematics would be performed, and why it can't be broken down into smaller sub-sections? That's asked out of genuine curiosity.


Anything can be broken up into sections. The first thing I'd do if such an animal materialised would be to thread it. The only reason I haven't made Cuda MB properly threaded is that the tasks have been too 'small' to get good scaling (so we run multiples instead). That's quite likely to change amidst our various plans to tackle the Guppis from different development directions.

It's been a while since I looked at or even thought about the AP code, but you can assume that larger dedispersion on a given (same-sized) dataset increases the effective range. Before adding any observation length/number of points, or extra searches, there's an inverse square law in there somewhere (IIRC), so there is non-linear growth in processing demand.

In the case of increasing the chunk sizes, with the same observation lengths (to end up with more resolution), the number of coadds would, off the top of my head, multiply. At lengths as high as 2M-point transforms, the FFT-based convolution operations, each of roughly 4*n*log(n) complexity, become pretty costly compared to the regular 32k, and probably not many caches would handle that nicely (especially single-threaded). The typical communications-complexity cost for thrashing the cache (as power-of-2 transforms naturally do as they grow) increases by an order of magnitude for each cache level filled.

Short version: taking things to extremes, 10 hours becoming 10,000 hours (~416 days) doesn't seem out of the realm of possibility, with no increase in payload.
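The transform-cost part of that estimate is easy to put rough numbers on: an FFT-based convolution costs on the order of 4*n*log2(n) operations (the figure quoted above), so growing the transform from 32k to 2M points multiplies the per-transform cost by more than the 64x size ratio, before any of the cache-thrashing penalties kick in. A back-of-the-envelope sketch, an illustration rather than a measurement of the real applications:

```python
from math import log2

def conv_ops(n: int) -> float:
    """Rough operation count for one FFT-based convolution of length n."""
    return 4 * n * log2(n)

small = 32 * 1024          # the 'regular 32k' transform
large = 2 * 1024 * 1024    # the hypothetical 2M-point transform

size_ratio = large / small                  # 64x more points per transform
cost_ratio = conv_ops(large) / conv_ops(small)

print(f"size ratio: {size_ratio:.0f}x, cost ratio: {cost_ratio:.1f}x")
# prints: size ratio: 64x, cost ratio: 89.6x
```

On real hardware the gap widens further each time the working set spills out of another cache level, which is where the order-of-magnitude penalties mentioned above come from.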
"Living by the wisdom of computer science doesn't sound so bad after all. And unlike most advice, it's backed up by proofs." -- Algorithms to live by: The computer science of human decisions.
ID: 1791305
kittyman (Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor)
Volunteer tester
Joined: 9 Jul 00
Posts: 51468
Credit: 1,018,363,574
RAC: 1,004
United States
Message 1791309 - Posted: 28 May 2016, 16:00:12 UTC
Last modified: 28 May 2016, 16:01:23 UTC

Might I be so bold as to ask in public, what do you do for a living, Jason?

Reply by PM if you would rather not answer here, but your understanding and statements regarding mathematical things astound me. And so, do you just drive a truck to let things stew?

That would certainly explain your understanding of 'chunk sizes'.
"Freedom is just Chaos, with better lighting." Alan Dean Foster

ID: 1791309
Profile jason_gee
Volunteer developer
Volunteer tester
Joined: 24 Nov 06
Posts: 7489
Credit: 91,093,184
RAC: 0
Australia
Message 1791312 - Posted: 28 May 2016, 16:06:12 UTC - in response to Message 1791309.  

Might I be so bold as to ask in public, what do you do for a living, Jason?

Reply by PM if you would rather not answer here, but your understanding and statements regarding mathematical things astound me. And so, do you just drive a truck to let things stew?

That would certainly explain your understanding of 'chunk sizes'.


That's complicated, will PM in a bit, lol
"Living by the wisdom of computer science doesn't sound so bad after all. And unlike most advice, it's backed up by proofs." -- Algorithms to live by: The computer science of human decisions.
ID: 1791312
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1791350 - Posted: 28 May 2016, 17:33:13 UTC - in response to Message 1791295.  

And Kittes@home just look at mices...........
My kitties just keep on looking at the stars.

My fuzzy minions waste their time watching movies all day.
http://i.imgur.com/IvWLL9K.jpg

So, can you explain better what a 6-month GBT task would look like (download data size, perhaps), what mathematics would be performed, and why it can't be broken down into smaller sub-sections? That's asked out of genuine curiosity.

If I were the project scientist, perhaps. I will know the design decisions once they are formed. Common sense says that to improve on the Arecibo data, longer time periods can be used for signal accumulation, so longer initial arrays for the FFA: something like the current large FFA, or bigger. A targeted search in general allows a longer time domain for signal accumulation, which results in a better signal-to-noise ratio and hence increased sensitivity on the same hardware.

If longer tasks increase our chance of finding our needle in the cosmic haystack, I am all for it.
I do not care if the apparent result is me crunching one WU per day.

If that one WU is more productive than tossing ten a minute out the window, so be it.

I have long ago said that I luv my credits, but that is NOT why I am here.

I am here for the freaking SCIENCE of it, and that is what attracted me to the project in the first place, and that is what keeps me here.

And the fact that things are progressing here, rather than the same old...
Makes me even more determined to participate in this amazing project.

Meow!

I believe one of the issues the project tries to balance is allowing as many hosts to participate as possible.
Having only much larger, or more complex, workunits could raise the minimum machine specifications. It might even leave a significant portion of hosts unable to contribute. Offering both types of work would let a large number of hosts participate while the more complex data is analyzed by the hosts that are capable of it. However, that is more work on the back end.
SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the [url=http://tinyurl.com/8y46zvu]BP6/VP6 User Group[/url]
ID: 1791350
Al (Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor)
Joined: 3 Apr 99
Posts: 1682
Credit: 477,343,364
RAC: 482
United States
Message 1791370 - Posted: 28 May 2016, 17:56:16 UTC - in response to Message 1791350.  

HAL, question for you. Since our 'benefactor' made his (very large) contribution to this scientific endeavour, why haven't (at least as far as my limited knowledge of it all goes) any of those resources migrated toward where they can do the most good long term for everyone and the project as a whole: the back end (meaning the software that we all use to process all the work that the donation has provided)?

I'd think that with the kind of dollars I have heard mentioned previously, a percentage, let's say 3-5% of it, could be strictly dedicated to making the basic program the best it can be. That might include hiring some of the people who have so kindly volunteered their time to this project (if they are interested, of course), or otherwise hiring full-time, highly qualified programming staff (maybe poach a programmer or two from Nvidia and ATI? Bwhahaha!). This would enable us to efficiently run the latest work units that said contribution has provided us in such large quantities.

From that would flow much good science, as well as long-term stability (relative, as of course hardware and OSes change over time), along with the ability to maybe work with said vendors (ATI/Nvidia) in a serious way, so we might plan our path somewhat and not be blindsided by changes and have to scramble madly to get the product to work properly. Maybe this is all pie in the sky, but with the proper resources, directed the proper way, I think it should be doable?

ID: 1791370
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1791388 - Posted: 28 May 2016, 18:11:52 UTC - in response to Message 1791370.  

HAL, question for you. Since our 'benefactor' made his (very large) contribution to this scientific endeavour, why haven't (at least as far as my limited knowledge of it all goes) any of those resources migrated toward where they can do the most good long term for everyone and the project as a whole: the back end (meaning the software that we all use to process all the work that the donation has provided)?

I'd think that with the kind of dollars I have heard mentioned previously, a percentage, let's say 3-5% of it, could be strictly dedicated to making the basic program the best it can be. That might include hiring some of the people who have so kindly volunteered their time to this project (if they are interested, of course), or otherwise hiring full-time, highly qualified programming staff (maybe poach a programmer or two from Nvidia and ATI? Bwhahaha!). This would enable us to efficiently run the latest work units that said contribution has provided us in such large quantities.

From that would flow much good science, as well as long-term stability (relative, as of course hardware and OSes change over time), along with the ability to maybe work with said vendors (ATI/Nvidia) in a serious way, so we might plan our path somewhat and not be blindsided by changes and have to scramble madly to get the product to work properly. Maybe this is all pie in the sky, but with the proper resources, directed the proper way, I think it should be doable?


I haven't really been following everything related to this large funding contribution.

However, I know there is a lot more going on behind the scenes than we observe directly or get notified about.
The amount that may be directed to SETI@home specifically is likely VERY small. I'm not sure the SETI Institute itself even got that large of a chunk.
I think most of the money was said to be going to dedicated telescope time globally.
Also there may be limitations to how the money can be allocated.
SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the [url=http://tinyurl.com/8y46zvu]BP6/VP6 User Group[/url]
ID: 1791388
Richard Haselgrove Project Donor
Volunteer tester
Joined: 4 Jul 99
Posts: 14650
Credit: 200,643,578
RAC: 874
United Kingdom
Message 1791412 - Posted: 28 May 2016, 18:38:42 UTC - in response to Message 1791388.  

If you read between the lines of Matt's Technical News post a couple of weeks ago, I'd say that the benefactor's contribution to SETI@Home's general funding so far has been negative - and substantially so. Much extra work, but no extra bodies or brains to do it.

To my mind, that would violate the UK concept of Full Cost Recovery (FCR) - but that's a local funding standard, not a global one.
ID: 1791412
Profile HAL9000
Volunteer tester
Joined: 11 Sep 99
Posts: 6534
Credit: 196,805,888
RAC: 57
United States
Message 1791419 - Posted: 28 May 2016, 18:49:27 UTC - in response to Message 1791412.  

If you read between the lines of Matt's Technical News post a couple of weeks ago, I'd say that the benefactor's contribution to SETI@Home's general funding so far has been negative - and substantially so. Much extra work, but no extra bodies or brains to do it.

To my mind, that would violate the UK concept of Full Cost Recovery (FCR) - but that's a local funding standard, not a global one.

I read that mostly as "Breakthrough Listen is funding Matt's paycheck so we haven't lost him" at the time.
SETI@home classic workunits: 93,865 CPU time: 863,447 hours
Join the [url=http://tinyurl.com/8y46zvu]BP6/VP6 User Group[/url]
ID: 1791419
kittyman (Crowdfunding Project Donor, Special Project $75 donor, Special Project $250 donor)
Volunteer tester
Joined: 9 Jul 00
Posts: 51468
Credit: 1,018,363,574
RAC: 1,004
United States
Message 1791420 - Posted: 28 May 2016, 18:53:55 UTC

I got things to say/
I ain't done yet/.
"Freedom is just Chaos, with better lighting." Alan Dean Foster

ID: 1791420
©2024 University of California
 
SETI@home and Astropulse are funded by grants from the National Science Foundation, NASA, and donations from SETI@home volunteers. AstroPulse is funded in part by the NSF through grant AST-0307956.