
shinji — Original poster — Mar 18, 2007
I am trying to zip a folder of 170,000 files, which are part of a large website.

Trying to add them to BetterZip ends up crashing BetterZip.

Trying to select them all and compress them through Finder makes Finder stop responding. I tried this a few different ways, and the second-to-last time I was able to get it to start, but it said it would take 4 hours.

What app should I use for this? Is zip even the best format? Are other compression algorithms faster? This is going to be extracted on a server.
 
OK, so I don't know the Mac side of things, but just using normal troubleshooting:

1) Can you zip these files in smaller groups, e.g. 10k per archive? (A rough command-line sketch follows below.)
2) Why not create a zip file with, say, 1k files and then add the rest in batches?
3) 170k files?! How fricken huge is that going to be?!
3a) Maybe the total size of the files exceeds memory.
3b) Monitor your system resources while doing this and see if you max out the memory; the processor should be irrelevant.
4) Even if you zip them all up, how are you going to move them to the server? Over a network sounds painful; maybe use multiple DVDs?

Sounds like you are overwhelming the system. Do it in a more controlled (though maybe slower) manner and you should get the desired result.
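To make points 1 and 2 concrete, here is a rough Terminal sketch (the folder path and the 1,000-file batch size are placeholders, not anything specific to this setup) that splits the full file list into chunks and zips each chunk into its own archive:

cd /path/to/website
find . -type f > filelist.txt
split -l 1000 filelist.txt chunk_
for f in chunk_*; do zip "$f.zip" -@ < "$f"; done

find builds the complete file list, split cuts it into 1,000-line pieces (chunk_aa, chunk_ab, ...), and each zip call reads one piece from standard input via -@, so no single archive operation ever has to handle all 170,000 entries at once.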
 

1) I could do that, I suppose...is there an automated way to create 170 archives, though? It seems like that would be more work.
2) Tried that in Finder; wasn't able to get it to work.
3) About 480 MB.
3a) I've got 4 GB of RAM, so that shouldn't be an issue.
3b) BetterZip never goes past 1 GB for some reason...it hangs at that point.
4) What I'd like is one large zip that I can just upload...I was hoping to start that tonight before going to sleep, but I guess that's not going to happen.

Is there some simple command-line way to just create an archive out of all the files in a folder? Or would that compression take even longer?
 
The Unix way

Not sure if you are comfortable with Terminal, but you can try making a tarball and then compressing that one file:

#tar -cvf folder.tar ./folder
#gzip folder.tar

On the server:

#gunzip folder.tar.gz
#tar -xvf folder.tar
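For what it's worth, most tar builds (including the one that ships with OS X) can gzip in the same step via the -z flag, which avoids writing the uncompressed .tar to disk first; a minimal variation on the commands above, same folder names assumed:

#tar -czvf folder.tar.gz ./folder

and on the server:

#tar -xzvf folder.tar.gz

Either route ends with the same extracted folder; the one-step form is just less typing and less temporary disk space.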
 
Looks like that's what you need; it's a GUI for the tar command I posted. Good luck.

Gui-Tar only works on 4096 records, so I ended up using your commands and it worked great.
 
Great!

I always use the tar/gzip combination to move files between different devices, especially the Apple TV. Glad that it worked for you.
 
Command-line tar and its associated tools have a long history of dealing with this sort of thing. Tar (short for tape archive) was more or less the backup program of the early Unix world... 170k files is a lot, but yeah, I've tarred entire user directories, entire drives, etc., and it handles it fine. :)
 
Guys, I have another problem.

The tar approach worked great and everything is on the server.

Now I need to upload an additional 4 files, after realizing a mistake I made. I tried doing this through Transmit and Yummy FTP, but the upload failed...I think (from looking at the log) it's because neither app can process a directory listing that large.

Is there a command-line solution to this, or another app you think I should try?
 
Try scp

To copy one or two files, the easiest command is scp. Enter your password when prompted.

#scp filename username@<servername>:<server directory>


e.g.:
The following commands will copy file1.txt and file2.txt to the frontrow user's home directory (~). Once the files are there, you can easily move them anywhere you want.

#scp folder/file1.txt frontrow@appletv.local:~
#scp folder/file2.txt frontrow@appletv.local:~

If that doesn't work, I can give you command-line ftp commands.
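As a small aside on standard scp behavior (the server name, remote path, and file names below are placeholders): scp accepts several source files in one invocation, and -r copies a whole directory, so all four files could go up in a single command:

#scp folder/file1.txt folder/file2.txt folder/file3.txt folder/file4.txt username@yourserver.com:/path/to/site/
#scp -r folder username@yourserver.com:/path/to/site/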
 
You could use FTP from the command line. Something like:

ftp user@domain
cd to/path
put local-file-name remote-file-name


FYI, I'm not 100% sure on the syntax; going from memory.
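For reference, a slightly fuller session with the stock command-line ftp client might look roughly like this (host, directory, and file names are placeholders); the binary command matters if any of the files aren't plain text:

ftp ftp.example.com
(enter your username and password at the prompts)
binary
cd path/to/site
put file1.txt
put file2.txt
bye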
 
Does it have to be .zip? Compressed disk images work quite well.
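If the disk-image route appeals, Disk Utility can build one from a folder, or you can use hdiutil from Terminal (the folder and image names here are placeholders; note the server would need to be able to mount a .dmg, i.e. another Mac):

#hdiutil create -srcfolder ./folder -format UDZO website.dmg

and later, to get the files back out:

#hdiutil attach website.dmg

UDZO is the zlib-compressed read-only image format, so the .dmg ends up compressed much like a zip would.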
 