[CLUE-Tech] PERL Q
David Anselmi
anselmi at intradenver.net
Sat Jun 2 22:12:49 MDT 2001
Hmmm. I don't think zombie processes are a problem. They should get
cleaned out eventually. Or, instead of exiting, you could have the parent
wait and then exit. A waiting process (or many) shouldn't be much
overhead. Fork is what you want I think (and you don't need an exec if the
child is the same program as the parent).
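To illustrate, here is a minimal Perl sketch of that idea: fork, let the child do the work in the same program (no exec), and have the parent wait so the child is reaped and never lingers as a zombie. The print statements are placeholders for the real crawling work.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # Child: same program as the parent, so no exec() is needed --
    # just do the work here and exit.
    print "child $$ doing the work\n";
    exit 0;
} else {
    # Parent: a blocking waitpid() reaps the child, so it never
    # becomes a zombie.
    waitpid($pid, 0);
    print "parent reaped child $pid\n";
}
```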
Your $dummy=`perl loccrawler \"$new_url\" &` idea still forks (and probably
execs too, and maybe even forks and execs a shell first). So I think this
doesn't get you anything. How about a simple program that doesn't fork, but
gets called from a loop in the shell:
bash$ while true ; do perl loccrawler "$new_url" & done
(Note the `&` already terminates the command, so no `;` is needed before `done`.)
Of course you can do the loop in perl too. I think a loop is a better
approach than your cascade (infinite recursion). Perhaps a wizard can
comment on whether there's a difference in how quickly processes get
created. My guess is that the loop will get the most cpu time as the perl
processes block on I/O, so you'll hit your max processes limit before many
of them return. But that's just a guess.
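A Perl version of the loop might look like the sketch below. It caps the count at 5 rather than looping forever, and `do_crawl` is a hypothetical stand-in for the real loccrawler work; the parent records each child's pid and reaps them all at the end so none are left as zombies.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @kids;
for my $i (1 .. 5) {           # capped at 5 here instead of "while true"
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        do_crawl($i);          # child does the work, then exits
        exit 0;
    }
    push @kids, $pid;          # parent just records the pid and loops
}
waitpid($_, 0) for @kids;      # reap every child so none become zombies

# Hypothetical placeholder for the real crawling work.
sub do_crawl {
    my ($n) = @_;
    print "child $$ handling job $n\n";
}
```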
Dave
Grant Johnson wrote:
> Tim Russell wrote:
> >
> > What about fork? That's the way you do it in C.
> >
> > You call fork() and look at the return value - the parent gets one
> > value, and the child gets another. Both run at the same time.
> >
> > I think a "while (true) fork()" would do the trick for you...
> >
> > Tim
>
> Unfortunately, fork() makes it a child process. These are very much
> "fire and forget." I want the first process to be able to stop running,
> and leave everything else going without causing zombies. I may look at
> fork() followed by exec().