
Seti@Home Now Has Teams

Madoc writes "Was just over at Seti@Home's site, and saw that they've introduced teams now! There are 2 Slashdot teams; we should probably standardize on one: Slashdot.org and Team Slashdot." I vote for Team Slashdot. Go seek out new intelligence if that floats your boat more than cracking DES keys.
  • Actually, you can participate in the 23-mark Optimal Golomb Ruler search already, and you can even help find good starting points for distributed.net's future OGR searches (when that happens):

    http://members.aol.com/golomb20 [aol.com]
  • by Anonymous Coward
    After the government stopped funding SETI several years ago, the project almost died. Now a bunch of researchers have continued the work, but they don't have access to all that government cash. SETI@home is a way to help them get the computing power and cash from interested parties.
  • by Anonymous Coward
    There's also TkSeti, which requires Tcl/Tk 8.0. Look for it on Freshmeat.
  • by Anonymous Coward
    I think it's one of those quad Xeons running Windows NT Server. They always seem to be so blisteringly fast.

    Actually, you can run SETI@home on several machines provided you have the same login on each machine (see the SETI@home FAQ).
  • by Anonymous Coward
    The NSA may have all the horsepower in the world, but that's not the point.

    What we're saying is that the NSA doesn't necessarily have a lot of systems administrator snitches out there, and cracking into thousands of systems is a lot more work than just getting some zealous geeks to install some software on their own systems that will give them a "backdoor" to snoop around automagically, then encrypt the data for them alone to analyze at their leisure.
  • by Anonymous Coward
    Why is it so easy to find people's passwords and add them to your group?


    I wonder why team slashdot has all the top users that have their email visible?


    holy loophole batman

  • by Anonymous Coward
    I see that Michael Dolan has been properly removed from Team Slashdot. They need to fix this damn teams exploit... I mean, using the key in the userinfo for the password is just DUMB.
  • by Anonymous Coward
    Yes, it would be nice to finish one contest before moving on to the next, but look at the RC5-64 stats! It's been running for 577 days and we're only through 8.5% of the keyspace. Now I know we're cracking at speeds faster than ever before, but still, it's proven to be fairly strong crypto, and all we're doing now is wasting distributed CPU power which could be used for much more useful projects. I've personally switched over to SETI@Home and I've noticed a lot more press about it than the RC5 contest. The reason? It impacts a lot more people than encryption. How many people care about strong encryption? OK, now how many people have stared up at the stars wondering if we're alone in the universe?

  • by Anonymous Coward
    One of my (two) biggest complaints with the seti@home project thus far is that most aspects of it have been under-engineered. Not enough time has been put into developing any aspect of it, save possibly for the cruncher algorithms themselves. Distributed.net did a far better job of delivering clients for many platforms, assembling a good server-side package, proxying of many sorts, copious configurability, SMP and CPU-specific processing cores, etc.

    All that said, the SETI project does stand to yield something more useful, at least psychologically -- the time spent beating on RC5 has mainly (and successfully) demonstrated that DES-56 sucks, and that bigger keys are vastly harder to break. If there came another rapid DES-breaking project such as DES-II or DES-III, I'd happily switch my spare CPU cycles back to it for a day or two.

    Also, the source to seti@home isn't available, a problem which they have yet to rectify. If they desperately need to protect the algorithm for scientific integrity, they can move all that to a library and open the rest of the source so that we can fix the missing parts.
  • Wow, just imagine the implications... Joe Celery's computer receives the block, the one block that has a transmission from intelligent life forms. It's a hot day outside, and his "kewl 504A" is just a tiny bit hotter than when he left it running the Unreal timedemo all night to see if the chip was stable. Due to a random heat-related glitch, his computer mistakenly reports the block as not containing anything interesting, so it goes unchecked.

    Five years later, an alien demolition team wipes out the entire Milky Way to make room for an interstellar frontage road, a procedure that they had advertised (via radio beacon) for millennia. All life on Earth perishes because of an overclocked 300A, whereas if Joe Celery had not overclocked his chip, humanity would have made first contact with an alien race and Joe Celery's name would have gone down in the history books.

    Now wouldn't that just suck?
  • Wanna know how to do this? It's so easy it's insane.

    Let's say you want to add Michael Dolan (top individual user by miles) to your team. You get his email address (which is clearly displayed in the top 100 individual list) and change the logon in your Winbloze client to this email address. The user info in the client display will now tell you that you are Michael Dolan.

    Now look in the program directory for "user_data.txt" or some such. Open it up and look for the "key" value. This is Michael Dolan's password! Bingo, you now have his email and his "password" - add him to any team you like, or all of them if you want! Want more fun? Add the top big companies to a team, or Berkeley themselves! Ha ha!

    Distributed password in plain text format? How STUPID is that?
  • by Anonymous Coward on Sunday May 23, 1999 @04:26PM (#1881815)
    Two points:
    1. The statistics for CPU time for Windows 9x include the time it sits minimized in the System Tray doing nothing.

    2. To speed up processing time in Windows 9x dramatically (about 3x in my experience), turn on screen blanking in the screen saver properties.
  • by Anonymous Coward on Sunday May 23, 1999 @01:50PM (#1881816)
    Account maintenance should be finished soon.
    -Peter of the SETI@home team
  • by Anonymous Coward on Sunday May 23, 1999 @05:44PM (#1881817)
    Run it in screensaver mode and configure it to blank the screen. It goes 3 times faster when it's not bothering to draw the pretty pictures.
  • The Windows client has two settings: run only as a screensaver or when the pretty window is frontmost, or always run.

    I've left the window up, but not frontmost and it still is displaying graphics, so who knows how reliable that is. But I think when it's minimized it doesn't run, or is terribly slowed down.
  • You've been reading HHGTTG. Funny books (I thoroughly enjoyed them), but come on! Any race that could do such damage would just waltz on in and do it without bothering to fill out council forms. If the council got snarky, they'd wind up being scheduled for demolition as well.
  • Well, I worked for Cray (well, SGI now) in Chippewa Falls over the summer, and I saw a mailing label for a T3E part addressed with nothing more than a name and a zip code.

    Now, I'm not positive, but I think that zip code probably corresponds to a certain base (fort?) in Maryland.

    ----

  • I seriously doubt that NSA needs you or me to bust open a few keys. Consider:

    - A $60,000 machine built by the EFF beat out all the King's horses and all the King's men (otherwise known as distributed.net).

    - The NSA probably would have considered Deep Crack (the EFF's key buster) a keen and useful computer -- twenty years ago.

    So, unless you've got some really serious reason to think otherwise, I'd stop worrying about a few bits from SETI, take my medication and start looking for little green men like a good little member of the Collective. Besides, there are better things out there to worry about, like the war in Kosovo or a 1 cent increase in the price of a stamp.

    ----

  • Be interesting to find out. What I do know is that NSA has their own fab; no telling what kind of goodies come out of there. ;)
  • My common sense tells me that if someone wants to send back falsified results, they'll send back falsified results. It's been shown time and time again that OBSCURITY != SECURITY. Just because the SETI project is closed doesn't mean that their results are not being falsified at this very moment; it just takes a little more dedication to screw it up.

    The only really secure protocol/program/anything-else is one that's been peer-reviewed and shown to be secure, which means things like checksums, encryption and accountability. It's my opinion that the SETI program is in fact more vulnerable to cracker efforts because it's closed -- a vulnerability in the system, once found, will probably not be brought to light before the results are completely and horribly skewed. After all, if they don't give us the code to review, why tell them when there's a problem in it?
  • It's not impossible that we are alone in the Universe.

    It's also not impossible that all the air molecules in the room will jump against one wall, leaving you in a vacuum to suffocate. However, the odds are against it, and most of us firmly believe the odds are against there being only one intelligence in the universe.


    ...phil
  • They really need to come out with a Personal Proxy like what Distributed.Net has. That way you can get cool stats like this [maine.edu]!

    If they make one, third party stats scripts will come. I promise it.
  • Posted by GothstaiN:

    InET is participating in the SETI@Home project. We think that SETI is the most important activity of the last few months, because it isn't just people collaborating on an "invented" project... it's people collaborating on a very USEFUL project.

    Imagine the human race discovering an extraterrestrial civilization... With everyone's work, SETI is very close to that goal.
  • Posted by Synsthe:

    Maybe I'm missing something terribly important about them, but I don't see anything at all important about finding large prime numbers.

    Somebody shed a little light on this, perhaps?
  • Posted by Matt Bartley:

    My beige PowerMac G3 at 266 MHz with 32 megabytes of memory just took a week to complete its first work unit, and the CPU time counter racked up "only" around 70 hours. Something is badly wrong.

    In addition, when I tried to have it contact the server this morning, it managed to send its results back, but then refused time and time again to retrieve any new data. I switched it back to guessing RC5 keys, at 850 kkeys/sec.

    By comparison, my Linux box, a 300 MHz AMD K6-2 with 96 megabytes of memory, goes through work units in about 16 hours.

    I think I'm using the i386-glibc2.1 binary on there. Should I try the i686 binary instead? I don't understand which processor model (386, 486, Pentium, PPro) is right for the K6-2.

  • Posted by Matt Bartley:

    I had the same problem with my PPC. I found that if I disabled it as a screensaver, moved the app out of the system folder, and just started it normally when I was ready to let the system idle, it counted time normally.
    Other than when I fire up Quicken [intuit.com], I've let it run as the foreground (and only) application and turned off all screensavers and other power-down features. I haven't tried moving it out of the System folder though. I'll try that.
  • Posted by Kallahar2:

    Search for XSETI; it's a GTK GUI for Seti@Home.
  • So are you happier that your team "won", or that there's evidence of alien life?

    (and do you win $10000 in Alpha Centauri duckets?) ;)
  • I think that SETI@Home really wins the usefulness contest. --- However, perhaps we should first finish the one contest that we know we can finish! If all but one person leaves the DES contest, that one person will eventually get rich while the rest of us give our CPU fans a workout.

    -Ben
  • Too late! A NASA probe, launched several years ago, proved there was no intelligent life on Earth. (I think the article was in Nature.)
  • Penguin Power?
  • If the code is based on what they publicly released, it's a C++ hash-up. A total disaster that should have been rewritten from scratch. (I was very nearly going to do just that, but the volunteer programmers were advised not to. Then they took the code away from us and gave it to a professional company, who no doubt botched it up further.)

    If SETI@Home wanted decent, high-performance code, anyone reading this board could do that, but not with our hands tied.

  • Wire up a satellite dish, point it to the same bit of sky, and see if you get something comparable (minus resolution & gain)
  • I want my Linux! Aaargh!!

    I'm running NT5 SP5 on a Compaq PII machine. THREE DAYS PER SODDING WORK UNIT!

    If Linux gets it down to one, I might not have pulled all the hair out of my head by the end of the week!

  • Through intelligent lifeforms from outer space trying to hide themselves :)
  • Except it would look for signs of intelligent life in the packet, NOT try to decrypt it :).
  • Maybe I'm being paranoid, but did anyone notice in the processor stats that i386 (bottom entry, #33) had 0 blocks received, but 1 returned? Exactly how did that situation come about?
  • How can you say that it ran fine without distributed computing? The goal of the project is to hopefully find life out there, not just look for it. Now I'm not saying that SETI will ever necessarily find anything using present-day technology, but if we can analyze the data a lot more thoroughly with distributed computing, then it seems like SETI will run a lot better with it than without it.
  • by asmussen ( 2306 ) <(ten.xoc) (ta) (nessumsa)> on Sunday May 23, 1999 @12:41PM (#1881844)
    Sure, but I think that the rc5 project has already proven its point. Even if we do finally finish RC5-64, all we will have proven is that it is crackable, but that it takes even distributed computing a really long time with today's technology, and I think that point has already been adequately proven by distributed.net. If they do finally crack the RC5-64 challenge, I don't really think it will add to that point at all. I think it is already understood that it can be cracked, and that the average time it would take to do so is really just a fairly simple mathematical exercise complicated only by unknowns like the increase in processing power from year to year, and how many people participate as time goes on. Actually doing it at this point holds little more than the prize money, in my opinion. Don't get me wrong, I think this was a vital project when it began; I just think that it accomplished what it needed to do, that's all. Now if they ever get their OGR stuff going, I might be tempted to switch back to work on that for a while, but for now I'm sticking with SETI.
  • You want a bad score? My Cyrix 233MX took nearly 70 CPU hours to complete one run in Windows 98.
  • Well, all the Linux client does is spew out a bunch of text every now and then...
  • Slashdot is doing OK, but what about a Team Linux?
  • Let me guess, the plans for the frontage road were on display at the local planning department in Alpha Centauri for fifty years. Or were they actually in the cellar of a planning office, at the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying "Beware of the Leopard"?
  • I'd rather see more intelligent life on this planet before I spend my energy looking for it elsewhere. We have some short-sighted morons making emergency presidential orders because privacy and encryption might be used to aid pedophiles and terrorists. There are a few serious flaws in that logic, and it's a matter of principle. Until we can enlighten or remove the figureheads in office, I will waste my 60 watts of idle processing power to make a sad political statement.

    I would rather look for intelligent life elsewhere, but I think it is more urgent to look for it here first.
  • I'm not sure if the NSA has the gargantuan amounts of processing power that some people seem to think they do.

    However, given that the NSA has a track record of being ahead of the academic field of cryptography (i.e., they discovered linear(??) cryptanalysis many years before the academic world did)... it would not be entirely unreasonable to claim that they developed a machine similar to Deep Crack before the EFF did.

    Now, if I remember correctly, Deep Crack is optimized for DES, which in itself is optimized for hardware. I'm not sure how applicable this technology would be to other algorithms, but that's a side issue.

    While some of the paranoia about the NSA is certainly unwarranted (NSA != God), it's not unreasonable to believe that they are a few steps ahead of the rest of the world in cryptography.
  • by djw ( 3187 ) on Sunday May 23, 1999 @02:23PM (#1881851)
    ...but has anyone stopped to think that you can't tell what your processor is really computing when it's running seti@home?

    Suppose you're a government agency, and you get hold of some important encrypted data. No problem -- just dump the key into the seti@home processing queue. Instant free cycles from enthusiastic geeks all over the country, many of whom are privacy advocates who've been participating in various distributed cracking challenges over the years in attempts to protest your authoritarian policies. O, sweet irony.

    Dan Wineman [mailto]

  • Join the "AI's for ET's!" team. Who better to correspond with Extra-Terrestrial intelligence than native home-grown Artificial Intelligence?

    The concept just has a certain perfection to it. ;-)

    Join the team here [berkeley.edu]

    We joined the Seti-At-Home project two years ago, for what that's worth, but the project itself has only just begun...They have problems with server overload fairly often; please be patient as they figure out how to deal with these typical new-project problems.

  • It has been working on the same block for 197 hours. It is a P90 with 48M of RAM and Service Pack 4.

    However, my dual Pentium 400 with 320M of RAM running 2.2.5 has crunched out about 50 of them in the same time.

    Even my poor P60 with 24 megs of RAM can do one in about a day and a half. That is running 2.2.9.

    Ken Broadfoot


    Ken
  • Congratulations... Obscure Benchmarking unexpectedly wins the day.
  • Umm, why did this man's comment get bumped to -1... some moderator's personal vendetta? There doesn't seem to be a good reason... so CmdrTaco, or Rob Malda, when you grep the threads for your name, find the guy that did this and kick his ass.

  • Nitwit comments like what, exactly?

    I strongly agree that the source for this program should be made available. I won't be running it until I know I have the option (even if I don't exercise it) of knowing what the program is doing.

    It doesn't have to be fully free/open source, but at the very least it should be distributed as source.

    When you say:

    > A RECOMPILED BINARY ON YOUR MACHINE IS A
    > VARIABLE. They don't know what tweaks you put
    > in it. Therefore, they can't use your results.

    You're making assumptions about how the client-server model works. In fact you're making assumptions about the source. If *I* was building something like this, I'd make damn sure that there was some form of checking, so that *any* data that comes into the server site claiming to have something to say about my data chunk can be quickly spot-checked first, then subjected to more rigorous checking later if it turns out to be needed (a rough sketch of that spot-checking follows at the end of this comment).

    FWIW, I asked the project co-ordinators why they didn't distribute source. I received no reply.

    Besides the paranoia angle (NASA=NSA) which I'm not going to discount (because I can't see the code), I would distrust the code on "mere" quality grounds. If it's true that they're not releasing the source because they believe (as this anon coward does) that people will start feeding in erroneous data, then they don't know how to program to handle data, full stop.

    No stupid comment is enough to halt the mighty march to open source nirvana
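
    A rough sketch of the spot-check idea above, purely as an illustration (the SETI@home source isn't public, so every name here is an assumption): re-crunch a random fraction of returned results on trusted hardware and flag clients whose answers don't match.

        import random

        SPOT_CHECK_RATE = 0.05  # recheck ~5% of returned results (arbitrary rate)

        def accept_result(work_unit, client_result, recompute, flag_client):
            """Spot-check a returned result before trusting it.

            work_unit     -- the raw data chunk originally sent out
            client_result -- what the client claims to have found
            recompute     -- trusted server-side cruncher for the same chunk
            flag_client   -- called when the client's answer doesn't match
            """
            if random.random() < SPOT_CHECK_RATE:
                if recompute(work_unit) != client_result:
                    flag_client()
                    return False   # discard the bogus result
            return True            # accept; more rigorous re-checks can come later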
  • What? We all wanna dedicate our computers to finding "intelligent life" elsewhere in the universe? We haven't found any on this planet yet, why are we looking elsewhere? Just look at the most popular writers like Jon Katz and Matt Drudge for evidence that planet earth contains no intelligent life. . .

    ** Martin
  • ...and because it came from the NSA, it would be discarded as little more than unintelligible junk :).
  • I don't see the problem with this being closed software. If it was open, people could learn how to send back falsified results, screwing them all up... Read the docs... it tells ya a lot... [then use your common sense...]


    Stan "Myconid" Brinkerhoff
  • Depending on your processor, you really shouldn't need 50 hours.

    As well as running it on my Linux box (PII 333), which averages 11-something hours per block, I also run it on my PowerMac 8200/100. It's been going for over 135 hours and is on 94% of the first block. This thing eats CPU time. But hey, what the hell else would I use the Mac for? ;)
  • Bollocks. Who gives a monkey's about prime numbers? When asked the question "What's your computer doing?", what would you rather say: "It's looking for a number" or "It's searching for the possibility of intelligent life outside of our solar system"? I know which one I prefer. If you wanna be a boring maths junkie, fine. Leave the rest of us to play with something interesting.
  • He got moderated down because of that "I'm first" tag. First posters are a no-brainer for the moderators. And if it weren't for them, there might not even be moderators...
  • by Gary Franczyk ( 7387 ) on Sunday May 23, 1999 @11:11AM (#1881864)
    Which one of the two systems is more useful? Distributed.net or Seti? I think Seti wins between those two... Though, isn't there another one of these types of things that looks for extremely large prime numbers? That one is probably the most useful out of them all.
  • The client is not going to know if it found intelligent life. It is going to know that it cranked out some numbers which might be suspicious. The quantitative values would be duly reported and the interesting findings checked again using dedicated telescope time. As far as any announcement, I can't imagine that it would not be reported as at least a possibility. What happens after that I don't know. Carl Sagan's book, "CONTACT" (which I highly recommend), dealt with this issue in a reasonable fashion. The movie was pretty good, too.

    -Steve
  • Hmm... cost of 300,000 computers: 300,000 x $1,000 (minimum entry price for a PC?) = $300,000,000.
    Price of Deep Crack: $200,000.
    Number of Deep Cracks buyable with $300,000,000: $300,000,000 / $200,000 = 1,500.
    Hmm... granted, that's pure key-cracking (as opposed to general computing) power, but hey. How many million does the NSA spend anyway? And Deep Crack cracked DES in how many days?
  • Heh...HehHeh...BWAHAHAHAHAHAHAHAH!!!!...but why'd you tell?...just when it was getting fun...hrmph...

    Computers are neeto.

    --diva
  • I see the top member of Team slashdot is Michael Dolan with an average time per block of 9 hr 27 min. OK, he probably has an Alpha or some other fast CPU.

    But what is the story with Bert, in the number 5 slot, with an average time per block of 7 min 51 sec? What kind of system cranks through a block 70 times faster than an Alpha?

  • Team #Amiga! beats Slashdot! For how long, I don't know... ;) I have my 6 computers working on Team Amiga! (Damn, that Linux client is fast) -Brook
  • This doesn't cover the case where a lamer could fake that he didn't find anything (thus no double-check until much later) and thus get credit for processing lots of work units. Given that some people show a few minutes per work unit (bug?), one may think a lot about "security". If they had their source open, I would at least be able to fix their stupid bugs (like turning off all the visual stuff) and maybe even optimize it. My gut tells me that their asm (C?) routines are somewhat far from being optimized for Intel x86.



    AtW,
    http://www.investigatio.com [investigatio.com]
  • Same login, different machines, BUT different block units too. You can't process the same unit in "parallel" on X machines.

    AtW,
    http://www.investigatio.com [investigatio.com]
  • And you might want to set "minutes until blank" to 0; otherwise the screen saver activates, the graphics show up for "minutes until blank" minutes, and then the screen blanks.
  • Does anyone else find it odd that the extremely remote possibility of finding life out there impacts a lot more people than encryption does? Granted, if we find life, it will definitely be true. But for day-to-day stuff, encryption is both more relevant and more likely to impact people's lives.

  • Rather than posting your SETI@home stuff here, you should join the SETI Club [yahoo.com] @ Yahoo. We've got 370 members already and discussion about the SETI@home clients and heap of other SETI stuff is going on as you read this.

    We've also got a Team on SETI@home. You can find out info about it, along with tips on optimising your SETI@home client software on the Club Team homepage [zap.to].

    Enjoy,
    Kris.

    Win a Rio [cjb.net] (or join the SETI Club via same link)

  • by Carl Nasal ( 10625 ) on Sunday May 23, 1999 @05:10PM (#1881875) Homepage
    I have an AMD K6 233 MHz, a P166, and a 486. I've been running rc5des on my AMD K6 and was running SETI@Home on the P166 and 486. I noticed the P166 and 486 seemed really lagged when connecting to them (even though they were connected via 10BaseT Ethernet). When doing a top, SETI@Home was using the majority of the CPU (how surprising), but a LOT of RAM too. This was on both machines, which really surprised me. On my AMD K6, rc5des was using almost no RAM.

    Has anyone else noticed this? I'd like to know whether it's just me or not, because if it's not, I can wait for new clients and hope that the RAM usage is less.
  • I read somewhere on the page (the FAQ, maybe?) that they'll list you as a co-discoverer in any articles they publish.

    It's not $10000, but it's still cool.

    peter
  • I don't think you're comparing apples and oranges. The SETI project has the possibility of finding something truly important. The best that distributed.net can do is crack an encrypted test message.

    That distributed.net will eventually crack the code is a given. All they'll have proven is that it takes a long time to crack RC5, even with lots of computers. There was never any question that it was possible -- just how long it would take. And now we know that it takes a very long time.

    SETI, on the other hand, could discover alien intelligence.

    Sure, encryption is a very relevant topic. But is distributed.net?

    peter
  • by Mr Debug ( 11822 ) on Sunday May 23, 1999 @06:22PM (#1881878)
    I guess you are right - in theory we can't really tell what anyone is up to. For all I know it could be using all the CPU time to search for intelligent life in /etc/passwd and /home/secrets then the rest of it to wrap it up in very strong encryption :-)

    But the nice thing about Linux is that you can bolt the program down so tightly (separate user, chroot) that it cannot do any damage - it'll never find my pornography or any of my other dirty secrets ;-) (hmm, me reaches for man chroot anyway)

    Having said that, I think it's not really feasible for these guys to give out the source code, because it allows malicious people to write something that'll send fake packets back saying "okay - I've found nothing". This would be a grossly irresponsible thing to do, but I wouldn't rule out a cheat who would want to bump the team's "block count" up a little, or religious fanatics whose beliefs depend on there being nothing out there. Security through obscurity, perhaps, but I can't think of any other way of protecting against cheats.*

    Despite that, I'm still a little irked about it myself, as I'm forced to sit behind a non-transparent proxy and twiddle my thumbs with a cluster of ~16 decent machines that are just itching to join in the search for extraterrestrial life. If only I had the source I could have written that proxy bit myself already!

    *By the way it's probably only a matter of time before someone actually reverse engineers the program. Security through obscurity has always ended in tears.

  • I would rather look for intelligent life elsewhere, but I think it is more urgent to look for it here first.

    I think there is a greater chance of us finding intelligent life in outer space than there is in us finding it here! :)

  • The Win client does not run very "nice" at all, so to speak... When I had the client running and configured to be active constantly, I too noticed a serious responsiveness impact. It turned out the dumb thing ran at normal priority. Even switching it back by hand to the lowest priority still gave me the feeling it was clogging something. Besides, I had the DES client still running as well, which I could see being totally repressed by the SETI client. Needless to say, I dumped the SETI. I don't mind spending spare cycles, but I don't want it to interfere with my normal work. I haven't tested the SETI client on an SMP box... Maybe I'll do that this weekend. Has anybody else done this?
    ----------
    'We have no choice in what we are. Yet what are we,
    but the sum of our choices.' --Rob Grant
    ----------
  • The 7 minutes is clearly a bug in the client, or a configuration problem on the computer which makes it abort the processing right away. The guy figuring high up on the list with 7 minutes of processing time said he had just one single 200 MHz Pentium running NT. In other words, it should have taken 60 hours, not 7 minutes (I have essentially the same setup right here).
    Presumably the others in the stats with around the same times suffer from the same bug. The setiathome folks should fix the clients and rip out the bad results from the stats. I assume it will happen at some time in the future :-)
    TA
  • To avoid the Vogon effect you're describing, they should send each work unit out for processing twice and compare the results (a rough sketch of the idea follows below).
    TA
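
    A tiny sketch of that redundant-dispatch idea, with entirely made-up names (nothing here comes from the actual SETI@home server, which isn't public): hand each work unit to two different volunteers and only trust results that agree.

        import random

        def crunch_redundantly(work_unit, volunteers, send_to):
            """Send the same work unit to two independent volunteers and compare."""
            first, second = random.sample(volunteers, 2)  # two *different* clients
            a = send_to(first, work_unit)
            b = send_to(second, work_unit)
            if a == b:
                return a      # agreement: trust the result
            return None       # disagreement: re-crunch on trusted hardware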
  • It certainly doesn't multithread in the Unix version..
    Maybe it's the graphical part that is able to multithread (or maybe NT multithreads the graphical part for you), try to turn off the graphics (by setting the screensaver to blank the screen, in the control box). Others report that this cuts the processing time to half, if it doesn't for you but merely unloads a CPU you know what's going on..
    TA
  • Hold your horses, the database server is off-line and has been for many hours. Before that it had been acting funny for many more hours (not recognizing user names, so it was impossible to set up teams).
    Try again tomorrow.
    TA
  • Sure you can write your own proxy even though you don't have the source.
    A netstat shows that the client connects to sagan.ssl.berkeley.edu, and a 'strings' on the binary shows 'shserver.ssl.berkeley.edu' which turns out to be the same as 'sagan' right now.
    So just make something that can take the connects from setiathome on port 80 and forward them to shserver.ssl.berkeley.edu port 80 (and the other way). Put this 'something' (which also understands your local proxy system, of course) on a computer that looks like 'shserver.ssl.berkeley.edu' to the client; you can do that just by putting a fake entry in the /etc/hosts file (and if you have an /etc/nsswitch.conf, set it to check local files before DNS, of course).
    You can probably do it in Perl (a rough sketch follows after this comment).
    TA
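
    A minimal sketch of the forwarder described above, in Python rather than the suggested Perl. The upstream hostname and port are the ones named in the comment; everything else is an assumption, and a real version would also have to speak to the local HTTP proxy rather than connect out directly.

        # Minimal TCP forwarder: listen locally, shuttle bytes to the real SETI server.
        import socket
        import threading

        LISTEN_PORT = 80
        UPSTREAM = ("shserver.ssl.berkeley.edu", 80)

        def pipe(src, dst):
            """Copy bytes from src to dst until src closes."""
            while True:
                data = src.recv(4096)
                if not data:
                    break
                dst.sendall(data)
            try:
                dst.shutdown(socket.SHUT_WR)
            except OSError:
                pass

        def handle(client):
            upstream = socket.create_connection(UPSTREAM)
            # Shuttle data in both directions until either side closes.
            threading.Thread(target=pipe, args=(client, upstream), daemon=True).start()
            threading.Thread(target=pipe, args=(upstream, client), daemon=True).start()

        server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(("", LISTEN_PORT))
        server.listen(5)
        while True:
            conn, _ = server.accept()
            handle(conn)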
  • I've noticed that the server isn't 100% reliable either. So I run seti@home at nice 15 and distributed.net at nice 19 - that way, when SETI is not there, I can at least keep my CPUs at temperature :-)
  • by Steelehead ( 14790 ) on Sunday May 23, 1999 @11:45AM (#1881887) Homepage
    I second that.
    Team Slashdot, we find aliens and crash wussy webservers.
    ;^)
  • For anyone from or interested in the beautiful country of Sweden, you may want to consider joining Team Sweden. At the moment I'm the first one registered for the team, but there is a group of us that was already working as a "team" before they set up real teams. Check out the stats for Team Sweden here [berkeley.edu]. There are currently about 80 machines happily processing data.
    ---
  • It is supposed to use a lot of RAM; this is also stated on the SETI@Home website. It needs the large amounts of RAM in order to process the data. I wouldn't expect to see the RAM usage drop by very much in future clients, although it would be a welcome surprise.
    ---
  • I wouldn't put too much value in the current ranking of the teams. The work units are not being added up correctly. Take a look at Team Sweden to see what I mean. You can see the stats for Team Sweden here [berkeley.edu]. For the team it shows the total work units as 62, but for the top team member it shows the (correct) value of 80. I have already reported this bug using their bug report form, so it will be interesting to see how much the stats change once this is fixed.
    ---
  • You must be dreaming, because I don't see an easy way of getting someone else's password since the password is mailed to the email address of the owner. So, unless you can intercept other people's email, I don't see any way for you to get other people's passwords.
    ---
  • Well, to tell you the truth, it really doesn't surprise me. I've been convinced for a while now that this place was crawling with the script kiddie type, and I guess now I've been proven right. Kinda sad if you ask me.
    ---
  • You mean to tell me they have found at least one intelligence in the universe?
    ---
  • Teams were not available last week. They just added the support for teams, and there was a story about SETI@Home when they first launched btw.
    ---
  • I just had a nice little chat with Bert (the user with the 7-minute average). It turns out that it is caused by a bug in the software (running on Winblows 95 of course). He said that it reaches 1% then jumps to 100%, so he stopped using it about 3 days ago. He also said that he didn't know anything about being added to Team Slashdot.
    ---
  • Actually, I think they took down the teams on purpose to deal with the security problems. I for one sent a bug report about it.
    ---
  • Have you guys also fixed the bug that causes the results received not to add up to all the units for each team member? It appears that it just takes the number of units that each member of a team had at the time of joining the team!
    ---
  • Hmmm, make sure you are using the 1.1 client. I was using the old clients from the beta testing in the beginning and then realized that those units don't count. Depending on your processor, you really shouldn't need 50 hours. I'm averaging about 16 hours across the 75 machines I am using.
    ---
  • There is indeed... the Great Internet Mersenne Prime Search (GIMPS) uses spare cycles to find Mersenne primes (i.e., 2^p - 1 for prime p; if that's prime, it's a Mersenne prime). You can find the page here [mersenne.org] (a quick illustration of the test follows below).

    -gleam
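
    For illustration only (this is the standard Lucas-Lehmer test, not GIMPS's actual code): a Mersenne number 2^p - 1, for an odd prime p, is prime exactly when the Lucas-Lehmer sequence reaches zero.

        def is_mersenne_prime(p):
            """Lucas-Lehmer test: for an odd prime p, 2**p - 1 is prime
            iff s == 0 after p - 2 iterations of s -> s*s - 2 (mod 2**p - 1)."""
            if p == 2:
                return True              # 2**2 - 1 = 3 is prime
            m = (1 << p) - 1
            s = 4
            for _ in range(p - 2):
                s = (s * s - 2) % m
            return s == 0

        # Exponents of the first few Mersenne primes:
        print([p for p in (2, 3, 5, 7, 11, 13) if is_mersenne_prime(p)])  # [2, 3, 5, 7, 13]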
  • Actually, there are more 'useful' projects coming up on distributed.net as well.
    The project is about finding 'Optimal Golomb Rulers' [hewgill.com]. More info can be found here [distributed.net] (a tiny example of what a Golomb ruler is follows below).
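
    Purely as an illustration of the definition (this is not distributed.net's code): a Golomb ruler is a set of integer marks in which every pairwise difference is distinct; an optimal ruler is the shortest one for its number of marks.

        from itertools import combinations

        def is_golomb_ruler(marks):
            """True if all pairwise differences between the marks are distinct."""
            diffs = [b - a for a, b in combinations(sorted(marks), 2)]
            return len(diffs) == len(set(diffs))

        print(is_golomb_ruler([0, 1, 4, 6]))   # True: the optimal 4-mark ruler, length 6
        print(is_golomb_ruler([0, 1, 2, 5]))   # False: the difference 1 appears twice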

  • by Zppr ( 22841 ) on Sunday May 23, 1999 @01:39PM (#1881913)
    My Pentium II 300 takes an average of 40 hours of CPU time to process a block running NT. I noticed that for i686-pc-linux, the average time is about 11 hours. All of the average Linux times are faster.

    Is the linux client faster? Or are linux users just running faster computers? Maybe it's all the graphics the Win/Mac versions draw that slow them down.
  • A couple of the people in the top 100 can do the SETI blocks in 7 minutes. Yikes.

    Is special hardware involved?

    Because the SETI client doesn't multithread in its current version, that would have to be one honkin' processor. Or a parallelizing compiler.

  • Sigh. I wish they'd implement a way to edit your setiathome user info. In fact, I think they should have done this before adding the teams - as soon as I give them my email address I'm stuck with my old settings and there's nothing I can do about it.

    Gripes aside, I'm still running the client because I think the project is so important.
  • I've put in about 50 hours of processing time or so under the Linux version so far, receiving 7 blocks up to this point. I have yet to see any results posted to SETI@Home. I do have an outfile.txt (running around 1.5k so far). After the database came back up, I even tried logging back in using setiathome -login. Does the outfile have to reach a 'critical mass' before being sent?

    Vrallis
  • I have been running two PCs:
    Dual PII-350 (winNT sv P4, 512M SDRAM): 10-14 hours max
    Single PII-350 (2.0.36, 128M SDRAM): 8-11 hours max

    I disabled the graphics in Windows and it makes no difference.

    Because of that, my boss ordered Red Hat 6.0!!

    Woohoo, we're going Linux PDC on the dual!! Thanks, SETI@home :)

    ---

