Access Logs - FTP home!

Discussion in 'Web Design and Development' started by -hh, Jul 30, 2008.

  1. -hh macrumors 68020


    Jul 17, 2001
    NJ Highlands, Earth
    I've found that one of my ISPs holds my website's raw access logs for only 7 days before automatically deleting them.

    As such, I'm trying to develop a script that can be scheduled to run once a day, automating the process of logging into the domain and downloading the newest access log file(s).

    Currently, I download the access log files manually: I log in with my FTP application (Transmit), select the file(s) I don't have yet, and download them to the local directory where I'm storing them.
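    As an aside, those manual steps could in principle be scripted directly with curl, which speaks FTP; a minimal sketch, where the host, credentials, directory, and log-file name are all made-up placeholders:

```shell
#!/bin/sh
# Sketch only -- host, credentials, and log-file naming are assumptions;
# adjust them to match the ISP's actual layout.
HOST="ftp.example.com"
REMOTE_DIR="/logs"
LOCAL_DIR="$HOME/access-logs"
FILE="access_log.2008-07-29"   # e.g. one day's raw access log

mkdir -p "$LOCAL_DIR"
# Fetch the named file over FTP into the local archive directory.
curl --user "myuser:mypassword" \
     -o "$LOCAL_DIR/$FILE" \
     "ftp://$HOST$REMOTE_DIR/$FILE"
```

    The drawback versus Transmit's synchronize is that you have to know the filenames in advance, which is exactly the bookkeeping the synchronize approach avoids.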

    I figure that I can automate this process ... is OS X's Automator a good choice?

    I've already started to play with Automator, and I've found that it's pretty straightforward to invoke my FTP application (Transmit) and, by using its 'synchronize' feature, get it to download any new files to the intended local destination.

    What's kind of nice about this approach is that if the script fails to run for some reason, the next time it runs, it will pick up however many new access log files there are (not just the newest one), plus I don't have to worry about having exact filenames, etc...

    But I have run into a problem.

    When my Automator "script" runs, it faithfully starts Transmit and selects the correct domain/account for the FTP, but then pauses because it wants the account's password to be entered. Once the password is manually entered, the rest runs OK to completion.

    Obviously, requiring the password to be entered manually defeats the basic purpose of having a script that can run unattended.

    Question: what do I need to add or change so that the script can supply
    the account password itself (or already have it available), so the FTP
    application doesn't hang waiting for input?

    My general thoughts are that the password for Transmit might need to
    go into the Keychain, but I don't see an obvious way to do that.
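    For reference, OS X's built-in `security` command can both store a password in the login Keychain and read it back from a script with no prompt; a minimal sketch, where the service label, account name, and password are all made up:

```shell
#!/bin/sh
# One-time setup: store the FTP password in the login keychain.
# "-s" is an arbitrary service label, "-a" the account name (both hypothetical).
security add-generic-password -a myftpuser -s my-isp-ftp -w 's3cret'

# In the unattended script: read the password back without any prompt.
FTP_PASS=$(security find-generic-password -s my-isp-ftp -w)
echo "$FTP_PASS"
```

    Whether Transmit itself will pick the credential up from the Keychain is a separate question; this only shows how a shell script can retrieve a stored password unattended.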

    Suggestions? Comments? Alternative approaches?


    PS: After I get this part licked, I should have an Automator script that does exactly what I want whenever it runs. Then all I'll have to do is figure out how to schedule it to run once a day at some odd hour. I don't expect that to be a problem, but since I don't know how to do it yet, I figured I'd mention it too.
  2. angelwatt Moderator emeritus


    Aug 16, 2005
    I do something similar. If your host supports SSH connections, you can use rsync. Here's a good tutorial on setting up rsync to connect to your host and pull down files. I have it set up to download some log files I create on one of my pages.

    I don't have it scheduled yet, but I plan on using cron for that. Currently I just keep the command I need in a program called CLIX, which bookmarks your common commands.

    This is just an option you can check out. If you don't have SSH available, you can find tutorials on writing a bash script to FTP the files, then putting that script on cron.
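    A sketch of what that rsync command might look like, with the user, host, and directory names all assumed; for unattended runs you'd want key-based SSH authentication in place (e.g. your public key added to `~/.ssh/authorized_keys` on the host) so there's no password prompt:

```shell
#!/bin/sh
# Sketch only -- user, host, and directory names are assumptions.
# -a: archive mode (preserve times/permissions), -v: verbose, -z: compress.
# Only new or changed log files are transferred on each run.
rsync -avz -e ssh user@example.com:logs/ "$HOME/access-logs/"
```

    Because rsync only copies what's missing locally, a skipped day self-heals on the next run, the same property as the Transmit synchronize approach.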
