Need to zip ~170,000 files

Discussion in 'Mac Apps and Mac App Store' started by shinji, Jun 17, 2009.

  1. shinji macrumors 65816

    shinji

    Joined:
    Mar 18, 2007
    #1
    I am trying to zip a folder of 170,000 files, which are part of a large website.

    Trying to add them to BetterZip ends up crashing BetterZip.

    Trying to select them all and then compressing them through Finder makes Finder stop responding. I tried this a few different ways, and the second-to-last time I was able to get it to start, but it said it would take 4 hours.

    What app should I use for this? Is zip even the best way? Are different compression algorithms faster for the compression? This is going to be extracted on a server.
     
  2. jbernie macrumors 6502a

    jbernie

    Joined:
    Nov 25, 2005
    Location:
    Denver, CO
    #2
    ok, so I don't know the Mac side of things, but just using normal troubleshooting.

    1) can you zip all these files into smaller groups ie 10k per file?
    2) why not create a zip file with say 1k files and then add files?
    3) 170k files!?!?!? how fricken huge is that going to be!?!?!?!?!
    3a) maybe the total size of the files exceeds memory
    3b) monitor your system resources when doing this and see if you max out the memory; the processor should be irrelevant.
    4) even if you zip them all up, how are you going to move them to the server? over a network sounds painful, maybe use multiple DVDs?

    Sounds like you are overwhelming the system, do it in a more controlled (though maybe slower) manner and you should get the desired result.
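
    Suggestion (1) can be sketched with a small shell loop: split the file list into chunks and archive each chunk on its own. This is just a sketch with made-up folder names (big_site, batches); for the real job you would chunk by 10,000 names instead of 2.

```shell
# Make a tiny sample folder standing in for the real site
mkdir -p big_site batches
for i in 1 2 3; do echo "page $i" > "big_site/page$i.html"; done

# Split the full file list into chunks of 2 names each
# (use something like -l 10000 for a real 170k-file job)
find big_site -type f | split -l 2 - batches/chunk_

# Archive each chunk separately; tar's -T flag reads member names from a file
for list in batches/chunk_*; do
  tar -czf "$list.tar.gz" -T "$list"
done
```

    Each invocation only ever holds one chunk's worth of state, which is the "more controlled" approach described above.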
     
  3. shinji thread starter macrumors 65816

    shinji

    Joined:
    Mar 18, 2007
    #3
    1) I could do that I suppose...is there an automated way to create 170 archives, though? It seems like that would be more work.
    2) Tried that in Finder, wasn't able to get it.
    3) About 480 MB.
    3a) I've got 4 gigs of RAM, so that shouldn't be an issue.
    3b) BetterZip never goes past 1 GB for some reason...it hangs at that point.
    4) What I'd like is one large zip that I can just upload...was hoping to do that tonight before I went to sleep to start it, but I guess that's not going to happen.

    Is there some simple command-line way to just create an archive out of all the files in a folder? Or would this compression take even longer?
     
  4. jaykk macrumors 6502a

    Joined:
    Jan 5, 2002
    Location:
    CA
    #4
    Unix way

    Not sure if you are comfortable with Terminal... you can try a tarball, then compress that one file:

    #tar -cvf folder.tar ./folder
    #gzip folder.tar

    on server,

    #gunzip folder.tar.gz
    #tar -xvf folder.tar
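
    The two steps on each side can also be combined, since tar's -z flag pipes through gzip itself. A small self-contained demo (demo_site is a made-up stand-in for the real folder):

```shell
# Build a small sample folder standing in for the real site
mkdir -p demo_site
echo "<html>hello</html>" > demo_site/index.html

# Create and compress in one step (-c create, -z gzip, -f archive name)
tar -czf demo_site.tar.gz demo_site

# On the server: decompress and extract in one step
mkdir -p extracted
tar -xzf demo_site.tar.gz -C extracted
```

    Either form produces the same .tar.gz; the one-step version just avoids writing the intermediate uncompressed .tar to disk.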
     
  5. shinji thread starter macrumors 65816

    shinji

    Joined:
    Mar 18, 2007
    #5
    Thanks!

    I'm trying GUI-Tar http://www.edenwaith.com/products/guitar/ right now...if this fails, I'll just use your command line tip.
     
  6. jaykk macrumors 6502a

    Joined:
    Jan 5, 2002
    Location:
    CA
    #6
    Looks like that's what you need, it's a GUI for the tar command I posted. Good luck.
     
  7. shinji thread starter macrumors 65816

    shinji

    Joined:
    Mar 18, 2007
    #7
    Gui-Tar only works on 4096 records, so I ended up using your code and it worked great.
     
  8. jaykk macrumors 6502a

    Joined:
    Jan 5, 2002
    Location:
    CA
    #8
    great

    I always use tar/gzip combination to move files between different devices, especially AppleTV. Glad that it worked for you.
     
  10. mkrishnan Moderator emeritus

    mkrishnan

    Joined:
    Jan 9, 2004
    Location:
    Grand Rapids, MI, USA
    #10
    Command line tar and associated files have long histories in dealing with this sort of thing. Tar (short for tape archive, I think) was sort of the backup program of the early Unix world... 170k files is a lot, but yeah, I've tarred entire user directories, entire drives, etc, and it handles it fine. :)
     
  11. shinji thread starter macrumors 65816

    shinji

    Joined:
    Mar 18, 2007
    #11
    Guys, I have another problem.

    The tar thing worked great and everything is on the server.

    Now I need to upload an additional 4 files, after I realized a mistake I made. I tried doing this through Transmit and YummyFTP, but it said the upload failed...I think (from looking at the log) because neither app can process a directory listing that large?

    Is there a command line solution to this, or another app you think I should try?
     
  12. jaykk macrumors 6502a

    Joined:
    Jan 5, 2002
    Location:
    CA
    #12
    try scp

    To copy one or two files, the easiest command is scp. Enter your password when prompted.

    #scp filename username@<servername>:<server directory>


    eg:
    The following commands will copy file1.txt and file2.txt to the frontrow user's home directory (~). Once you have the files there you can easily move them anywhere you want.

    #scp folder/file1.txt frontrow@appletv.local:~
    #scp folder/file2.txt frontrow@appletv.local:~

    If that doesn't work, I can give you command-line ftp commands.
     
  13. angelwatt Moderator emeritus

    angelwatt

    Joined:
    Aug 16, 2005
    Location:
    USA
    #13
    You could use FTP from the command line. Something like,

    ftp user@domain
    cd to/path
    put local-file-name remote-file-name


    FYI, I'm not 100% on the syntax. Going from memory.
     
  14. gnasher729 macrumors P6

    gnasher729

    Joined:
    Nov 25, 2005
    #14
    Does it have to be .zip? Compressed disk images work quite well.
     
  15. Consultant macrumors G5

    Consultant

    Joined:
    Jun 27, 2007
    #15
    Yeah. Make a disk image and then compress the disk image, or make a compressed image.
     
