
jbsmithmac

macrumors regular
Original poster
Sep 11, 2011
I've been playing with Automator lately to create a backup of my website's files (opening Transmit, syncing, and creating a timestamped archive of all the files).

That said, I only need to keep X number of these archives, so I was trying to figure out how to create a program/script that would delete the oldest files in the specified "backup" directory in order to maintain X number of archives.

My first problem is that I haven't been successful in getting Automator to find the archives in the specified folder - none of them. When I use Find/Filter Finder Items it always returns no results.

I think this may have to do with the directory being on a NAS (Seagate GoFlex Home). I could put it on my local HD, but I don't want to since it takes up quite a lot of space.

Any ideas on how to do this, and then maintain X number of archives in that directory?

(A secondary question: does anyone know how to run a script from my local machine that will do a mysqldump of my MySQL DBs? Right now I'm just downloading this manually to correspond with my automated backup.)
 
Is it a complete necessity to make it all in Automator?
If not... then some thoughts:

Are you compressing the backups? (Or maybe you see it as cumbersome?)
If not but you want to, you could call a bash script that does it for you at the end, so you'd end up with a single filename to handle for an entire backup.
If not and you don't want to, you could still call a bash script to move the most recent files into folders with a date hierarchy (not sure if you do this already).

Whether or not you are compressing the backups (a single .tgz file for instance, vs. a folder for each date), you could then have another bash script called at the end to do the cleanup for you.
For instance, this one, run in the current directory, would delete the oldest file (ls -p appends a / to directory names so they can be filtered out):
Code:
rm -f "$(ls -1rtp | grep -v '/$' | head -n 1)"
Or this one would delete the oldest folder:
Code:
rm -rf "$(ls -1rtp | grep '/$' | head -n 1)"
Of course there are many ways to accomplish that with some bash scripting magic, but that's just what I can think of right now.
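If the actual goal is to keep only the newest X archives rather than to delete one file at a time, here is a minimal sketch; the backup directory and the count of 5 are just placeholders:
Code:
#!/bin/bash
# Keep only the newest KEEP archives in BACKUP_DIR (both values are assumptions).
BACKUP_DIR="/Volumes/GoFlexHome/backups"
KEEP=5

cd "$BACKUP_DIR" || exit 1
# List files newest-first, skip the first KEEP entries, delete everything older.
ls -1t | tail -n +$((KEEP + 1)) | while IFS= read -r old; do
    rm -f -- "$old"
done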

To the second question, there are also many ways of doing it (you could, again, have Automator call a bash script). I was thinking particularly of something using wget... but then I remembered MySQL comes with a tool called mysqldump - is that what you're using to dump the data? (Or something like phpMyAdmin perhaps?) If you have that tool installed, then it's trivial to execute a command and voilà.
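For reference, a minimal mysqldump invocation looks something like this (host, user, and database name are placeholders, and it assumes the database server accepts remote connections):
Code:
# Dump one database to a timestamped .sql file; you'll be prompted for the password.
mysqldump -h your-db-host -u your-user -p your_database > "your_database_$(date +%Y%m%d).sql"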

If you like the idea of using bash scripting then lemme know, I could lend a hand (although google is obviously faster).
 
My first problem is that I haven't been successful in getting Automator to find the archives in the specified folder - none of them. When I use Find/Filter Finder Items it always returns no results.

The Find Finder Items and Filter Finder Items actions use Spotlight metadata to produce their results, so the volume has to be indexed by Spotlight for them to return anything.

Check out the mdutil command in Terminal to turn indexing on or off and print indexing status.

Code:
Aldebaran:~ test$ mdutil 
Usage: mdutil -pEsa -i (on|off) volume ...
	Utility to manage Spotlight indexes.
	-p		Publish metadata.
	-i (on|off)	Turn indexing on or off.
	-E		Erase and rebuild index.
	-s		Print indexing status.
	-a		Apply command to all volumes.
	-v		Display verbose information.

Aldebaran:~ test$ mdutil -vs /Volumes/LeopardFirewire
/:
	Indexing enabled.
	Scan base time: 2013-05-04 05:56:55 +0200 (35487 seconds ago), reasoning: 'scan not required - avoided catchup scan'
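In your case, since the archives live on the NAS, the first thing to check is whether that volume is indexed at all. A quick check might look like this (the volume name is just a placeholder for however the GoFlex share shows up under /Volumes):
Code:
# Print indexing status of the NAS volume, then enable indexing if it's off.
mdutil -s /Volumes/GoFlexHome
sudo mdutil -i on /Volumes/GoFlexHome
Spotlight indexing of network volumes isn't always reliable, so if the volume won't index, the fallback is to skip Find/Filter Finder Items and do the file selection in a shell script instead.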
 
Is it a complete necessity to make it all in Automator?

It doesn't have to be Automator; it's just that I'm new to this and that is the easiest right now. I do have my Automator program working pretty well - with some AppleScript included.

Here's an outline.

1) open Transmit and synchronize the website to the local HD
2) create archive of what was just synchronized in a separate directory
3) evaluate said archive directory for files created older than 60 days and move them to the trash
4) move archive directory to external HD/Dropbox/etc. (this is my AppleScript)

So I think this takes care of the backup of the files... now on to the MySQL databases. Right now I'm manually going into phpMyAdmin on my host and exporting the SQL file for each DB. It's important to note that I don't have shell access to my server. So what I was wondering is whether I could run some script, bash or otherwise, locally on my Mac that would somehow create the MySQL DB export (SQL file)... perhaps that's not possible without shell access to the server.

Again - any thoughts are greatly appreciated.
 
I assume, then, that you are not yet "automating" the Transmit interaction via Automator?

To be honest I like AppleScript and Automator, but since I'm a developer I tend to use bash for simple things, or a lightweight Python/Perl/Ruby script for the rest if required.

Notice it's dead easy to copy a remote directory locally (which is what you're doing via Transmit) using something like scp (you'll be prompted for the password, or you can set up SSH keys).
Example:
Code:
scp -r user@server:/var/www/yourwebsite/ .
So (1) and (2) are covered.
Number (3) was mildly covered in my previous reply, but you could also, say, delete "all files older than 60 days" with something like
Code:
find /backup_path -type f -mtime +60 -delete
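If you want to see what would be removed before letting it delete anything, the same expression with -print instead of -delete is a safe dry run:
Code:
# Dry run: just list the files older than 60 days that the -delete variant would remove.
find /backup_path -type f -mtime +60 -print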

Now, if you don't have shell access to your server, it can still be done. I'm sure there is a fool-proof way (sometimes I overcomplicate things), but I'd simply check how phpMyAdmin executes the backup and see if I could issue that URL (with parameters) via wget. One could look at the phpMyAdmin sources, or one could use a sniffer like Wireshark (easier).

Basically, I've used wget countless times to:
1. Log into a website, saving the cookie.
2. Issue a "command" (call a URL with the right parameters, either GET or POST) using the cookie.
In theory this should work, and you'd need two wget invocations for it to start downloading your database dump locally.
In short, if the last part was confusing, I'm talking about using wget the way you'd use your browser (log in, click something to start the backup), but in an automated way of course.
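As a rough sketch of those two invocations (the URLs, form field names, and credentials are all placeholders, and a real phpMyAdmin login usually also requires a token parameter that you'd have to scrape from the login page first):
Code:
# 1) Log in and save the session cookie.
wget --save-cookies cookies.txt --keep-session-cookies \
     --post-data 'pma_username=me&pma_password=secret' \
     -O /dev/null https://example.com/phpmyadmin/index.php

# 2) Re-use the cookie to request the export URL and save the dump locally.
wget --load-cookies cookies.txt \
     -O mydb_backup.sql \
     'https://example.com/phpmyadmin/export.php?db=mydb'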

I'm always "recommending" so far to use bash scripting via invocation on an Automator "Workflow" (is that the name for it?), which I'm sure you can, but you could also create a cron job to get the script running every N hours, days, whatever.

I'm interested in your post, since I'd do something similar once (and if) I ever get my website running, except I don't really like MySQL much (more of a PostgreSQL guy nowadays), but I could install it on my local server together with phpMyAdmin and try out the wget thing I talked about, if you want and think it's something you'd like (bash scripting).

Also notice, when I say bash scripting I'm overstating things... your backup workflow is not so complicated that you'd have to become a master of bash scripting hahah... it just takes a few unlinked commands to do the whole thing.
 
Well, I've got it working, but not using any bash or shell scripting... I just didn't have the time to learn that. To solve the mysqldump (database backup) issue I wrote some PHP code to create the SQL files for each database stored on the webhost. I couldn't use the mysqldump command because my webhost has PHP's exec command disabled.

Here's what I did; it's a combo of Automator and AppleScript...

1) open specific URLs to create the database dump SQL files

2) use the Transmit action to sync the web root to my local machine (this means it only copies down files that have changed...not a full download)

3) open a specific URL to delete the database dump SQL files from the server - the files created in step 1

4) create an archive of all website files, including the new ones downloaded in step 2. Since step 2 downloads the whole web root, this also includes the database backup SQL files. Save the archive with a date-based name.

5) AppleScript - delete old backup archives if they exceed the specified time parameter based on their creation date

6) AppleScript - copy the latest archive (step 4) to the external/cloud storage

7) AppleScript - move the latest archive on the local HD to the trash and empty it (I have an SSD and don't want it filling up with archives).

This whole process takes about 5 minutes - for reference, the sites I'm backing up total roughly 500 MB.

NOTE - steps 5-7 are all in one AppleScript.
 