[CLUE-Tech] Looking for GUI download manager

Jed S. Baer thag at frii.com
Thu Oct 14 15:09:55 MDT 2004


On Thu, 14 Oct 2004 13:31:08 -0600
"George Gammel" <ggammel1 at yahoo.com> wrote:

> I don't think I could use the command option
> to put all the URLs in a text file.  It looks like all the results
> would be saved to one output file.  The list of downloads has several
> different formats (some text, some HTML, some Perl script, and one
> zipped XL file). If they were all in one file, it would be very hard to
> process this data later on.

Well, you're just lucky that wget is smarter than that. For example, if
you construct a file with lines such as:

<a href="http://foo.com/some/path/to/whatiwant.pl">anything</a>
<a href="http://bar.net/another/path/thefile.zip">anything</a>
<a href="http://mumble.org/stuph/bubbles.html">anything</a>

Then wget will save files named "whatiwant.pl", "thefile.zip", and
"bubbles.html".

Depending on how you invoke it, wget will or won't create directories
corresponding to the full URL path in which to store the retrieved files.
So, for the first line above, you could end up with:

  ./foo.com/some/path/to/whatiwant.pl

but, as you've probably discovered already, command line switches give you
control over that.
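
A few of the relevant switches (these are in wget's man page; check the
exact spellings for your version):

  wget -nd -F -i links.html               # --no-directories: everything lands in the current dir
  wget -x -F -i links.html                # --force-directories: always create the full path
  wget -nH --cut-dirs=3 -F -i links.html  # drop the host dir and the first 3 path components
  wget -P downloads -F -i links.html      # --directory-prefix: save under ./downloads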

However, one bash script with multiple wget commands will also work fine.
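Something along these lines, reusing the URLs from above (a quick sketch,
untested; -nd keeps everything in the current directory):

  #!/bin/bash
  # Fetch each file into the current directory, one wget call per URL.
  wget -nd http://foo.com/some/path/to/whatiwant.pl
  wget -nd http://bar.net/another/path/thefile.zip
  wget -nd http://mumble.org/stuph/bubbles.html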

jed
-- 
http://s88369986.onlinehome.us/freedomsight/

... it is poor civic hygiene to install technologies that could someday
facilitate a police state. -- Bruce Schneier


