[CLUE-Talk] file-splitting

Ed Hill ed at eh3.com
Wed Dec 26 14:30:35 MST 2001


On Tue, 2001-12-25 at 23:15, David Willson wrote:
> I've been on this one for well over a day now. I'm ready to spit. I have 
> a file that is 20 GB. One file. It's a SQL Server database. I want to 
> break that file into chunks of 1 GB or less, so that I can then compress 
> the chunks into 640 MB pieces and burn those pieces to CDs.
> 
> I can't seem to find anything that will pass the 1 GB barrier when 
> file-splitting. 'tar' will go multi-volume and set-length, but it craps 
> out after processing 1 GB.
> 
> This is the command I'm using at this time:
> tar cv --multi-volume --tape-length=500000 --file=blannon.tar Lannon_Data.MDF
> It goes to 500 MB on the first file, asks for a "second tape", which is 
> great, but when the second file gets to 357,800 KB, it quietly stops 
> processing, as if it had finished the job.
> 
> Just so you know, I am using GNU tar in Win2k cmd prompt.
> 
> Any help would be greatly appreciated.


This sounds *very* much like a Win2K problem.  I've used GNU tar on both
Solaris and Linux boxen to fill and rotate tapes on DDS jukeboxes (12 GB
native DDS-3 tapes with 6 tapes/cartridge, so 72 GB total) and had *NO*
problems with GNU tar.
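
For what it's worth, here's a minimal sketch of how I'd do the same
multi-volume run on the Linux side, writing to disk files instead of a
tape device (the volume names and size are illustrative; note that
--tape-length counts 1024-byte units):

  # Create the archive in 640000 KB (roughly 655 MB) volumes:
  tar --create --multi-volume --tape-length=640000 \
      --file=blannon.tar Lannon_Data.MDF

  # When the first volume fills, tar prompts "Prepare volume #2 ...".
  # Typing "n blannon-2.tar" at that prompt starts the next volume in
  # a fresh file instead of overwriting the first one.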

Could you either boot Linux on that box (perhaps using a CD-R-based
distro like the Linuxcare rescue CD) or just move the hard drive to a
Linux box?
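
And once the drive is visible from Linux, you may not need tar at all:
compress first, then split the compressed stream into CD-sized pieces.
A rough sketch (the part-name prefix is just illustrative):

  # Compress first, then carve the stream into 640 MB pieces, so each
  # piece is already as small as it's going to get:
  gzip -c Lannon_Data.MDF | split -b 640m - Lannon_Data.MDF.gz.

  # Checksums let you verify each piece after it's burned:
  md5sum Lannon_Data.MDF.gz.* > Lannon_Data.MDF.md5

  # To restore, concatenate the pieces in order and decompress:
  cat Lannon_Data.MDF.gz.* | gunzip > Lannon_Data.MDF

(That assumes a split/gzip built with large-file support, which any
recent Linux distro should have.)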

good luck,
Ed

-- 
Edward H. Hill III, PhD
Post-Doctoral Researcher    |  Email:    <ed at eh3.com>, <ehill at mines.edu>
Division of ESE             |  URL:      http://www.eh3.com
Colorado School of Mines    |  Phone:    303-273-3483
Golden, CO  80401           |  Fax:      303-273-3311

GnuPG public key:  http://www.eh3.com/eh3.gpg


