Anonymous
9 Sep 2006 14:18:30 UTC
Topic 13449
I think it's the new LDAS cluster (LIGO Data Analysis System) there that now runs E@H as a backfill job, i.e. only when the nodes are doing nothing else. IIRC it has about 1200 Opteron cores.
It will be interesting to see if they come close to beating Bruce Allen one day ;-)
Doesn't the Nemo cluster have 1280 cores? It's gonna be a close race ;)
We are getting another 140 nodes for our cluster (280 cores). So we'll be up to 1560 cores in another month or so. You can track the usage of all the LSC computing clusters at this Ganglia site.
By the way, we are about to start using Nemo for doing some of the post-processing analysis from the Einstein@Home S4 run that finished early this summer. In general I expect that as the S5 run progresses, these LSC computing clusters will be more and more heavily used for other types of analysis, and their Einstein@Home 'bottom feeder' contributions will shrink.
Cheers,
Bruce
Just out of interest: do you have a rough estimate of how long the S4 post-processing will take?
Weeks to months. Just in the past week we have finished a first post-processing pass through the S4 results. But now we need to do some more tweaking and tuning, and then some follow-up studies. This is one of those activities where it seems as if the last 10% of the job takes 90% of the effort.
Cheers,
Bruce
Actually it's worse than that - I've been doing scientific monitoring shifts at LIGO Hanford for the past six days, from 22:00 to 08:00, so I now sleep for most of the day!
LIGO Laboratory at Caltech
I think the Caltech cluster is slightly more powerful than Nemo, but apparently busier with other jobs right now. It might become a close race.
BM