Backup strategies

Sunbeam25

Joined: 2007-03-24
Posts: 34
Posted: Mon, 2008-01-07 18:44

Backup is going to become an issue. Right now I backup my database structure (about 1MB), then I take a copy of the G2data folder and contents, about 1500 photos and growing fast.

It would be nice to have the option of only backing up images that have been added since the last backup. Then I can take incrementals with the occasional full backup.

Some of you peeps must have galleries with zillions of photos. How do you manage backup?

Thanks

Alan

 
floridave

Joined: 2003-12-22
Posts: 27300
Posted: Mon, 2008-01-07 19:54

If we started to support incremental backups, then the support, documentation and complexity of restoration would be an issue.
The size would not change, as you would still have to keep a copy of the items. What I do is keep 2 backups and delete the older ones to save disk space and keep administration simple.

FYI G2.3 does have a backup and restore feature.

Dave
_____________________________________________
Blog & G2 || floridave - Gallery Team

 
Sunbeam25

Joined: 2007-03-24
Posts: 34
Posted: Mon, 2008-01-07 20:49

I'm thinking more about bandwidth than size. I need to download the backups from the hosting site to a PC over broadband.

My suggestion would be to have a 'backed up' flag in the DB that gets set when files are added or renamed, then cleared as they are backed up. Ignore deleted files; restore will ignore them anyway, or they can be cleaned up after restore.

Full backup clears all the flags.

I wouldn't keep the incremental backups separate; I would immediately overlay each incremental onto the current backup at the backup site, then back that up.
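
To make that concrete, here's a rough sketch of the incremental pass (hypothetical table, column and path names, with SQLite standing in for the real Gallery database - this is not Gallery's actual schema):

import os
import shutil
import sqlite3

G2DATA = '/path/to/g2data'          # hypothetical source path
BACKUP_DIR = '/backup/g2data'       # hypothetical local backup copy

conn = sqlite3.connect('flags.db')  # stand-in for the real Gallery DB
cur = conn.cursor()

# Incremental pass: copy only items flagged since the last backup,
# clearing each flag once its file has been copied.
cur.execute("SELECT id, path FROM items WHERE needs_backup = 1")
for item_id, path in cur.fetchall():
    dest = os.path.join(BACKUP_DIR, os.path.relpath(path, G2DATA))
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    shutil.copy2(path, dest)
    cur.execute("UPDATE items SET needs_backup = 0 WHERE id = ?", (item_id,))

conn.commit()
conn.close()

# A full backup would copy everything and then clear every flag:
#   UPDATE items SET needs_backup = 0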

I agree that it's more complex than 'back up the whole lot', though. Looking forward to 2.3. 8o)

Alan

 
jshaver

Joined: 2008-01-13
Posts: 6
Posted: Sun, 2008-01-13 23:55

Have you looked into rsync? My understanding is that this is exactly what it is designed for: synchronizing large file sets over low-bandwidth connections.
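
For example, assuming you have SSH access to the host (hypothetical host name and paths), a pull down to the local PC could look like:

rsync -avz user@yourhost.example.com:/path/to/g2data/ /local/backup/g2data/

-a preserves the directory tree and timestamps, -v shows what's being transferred, -z compresses over the wire, and only new or changed files get transferred on each run.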

 
Sunbeam25

Joined: 2007-03-24
Posts: 34
Posted: Mon, 2008-01-14 10:39

I'll look into rsync, thanks.

Alan

 
Sunbeam25

Joined: 2007-03-24
Posts: 34
Posted: Tue, 2008-01-15 14:55

Rsync's no good for us because it requires a daemon at one end or the other (unless I've misunderstood - but it makes sense), and we can't do that.

We may opt for an FTP-style solution as we have FTP at both ends! We'll need something to compare the directories to decide what needs to be FTPd across, though.
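
Something along these lines is what I have in mind, as a rough sketch with Python's ftplib (hypothetical host, credentials and paths; it only handles a single flat directory, no recursion or deletions):

import os
from ftplib import FTP

REMOTE_DIR = '/path/to/g2data/albums'   # hypothetical remote path
LOCAL_DIR = '/backup/g2data/albums'     # hypothetical local mirror

ftp = FTP('yourhost.example.com')       # hypothetical host
ftp.login('user', 'password')           # hypothetical credentials
ftp.cwd(REMOTE_DIR)

remote_files = set(ftp.nlst())          # file names on the server
local_files = set(os.listdir(LOCAL_DIR))

# Fetch anything the server has that the local mirror doesn't
for name in sorted(remote_files - local_files):
    with open(os.path.join(LOCAL_DIR, name), 'wb') as fh:
        ftp.retrbinary('RETR ' + name, fh.write)

ftp.quit()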

 
jshaver

Joined: 2008-01-13
Posts: 6
Posted: Tue, 2008-01-15 20:17

Yes, rsync does generally require a daemon on both ends. I believe it can also be used over file sharing (SMB/NFS), but you likely don't have that either. I would think you could run a script on both boxes to create a file list (maybe with a simple hash function if you're worried about changes to existing files), then have one side download the other's list, compare them, and synchronize the files. It doesn't seem like a terribly difficult problem, although it would take some work.
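
As a rough sketch of that file list (hypothetical root path; run the same script on both boxes and diff the two outputs to see what needs transferring):

import hashlib
import os

ROOT = '/path/to/g2data'   # hypothetical; point at g2data on each box

# Print "hash  relative-path" for every file, so the two listings
# can be compared line by line.
for dirpath, dirnames, filenames in os.walk(ROOT):
    for name in sorted(filenames):
        full = os.path.join(dirpath, name)
        with open(full, 'rb') as fh:
            digest = hashlib.md5(fh.read()).hexdigest()
        print(digest, os.path.relpath(full, ROOT))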