X-Git-Url: http://git.rot13.org/?p=BackupPC.git;a=blobdiff_plain;f=README.ASA;h=65107d5cb0dccadcc8c3ae1964b46645ae07585c;hp=b49aa2adcf8ba6144b4e283a3cc3c0f8561afca4;hb=a53b5658667bedcea171e91aa4d78af8b7a3eb19;hpb=09666eaeca95859a28b563f36d0babcb3a4390ae

diff --git a/README.ASA b/README.ASA
index b49aa2a..65107d5 100644
--- a/README.ASA
+++ b/README.ASA
@@ -2,11 +2,53 @@ This document tries to describe ASA extensions for BackupPC 3.2.0
 Written by Dobrica Pavlinusic 2011-01-27

-Search and archive maintain data in PostgreSQL and full-text search. Since full-text search
-is single-writer, we need to serialize somehow requests for it's update.
+This is the second iteration of adding search over arbitrary filename substrings and archival
+to CD/DVD media, with tracking of copies and creation of additional md5sums on the media for
+easy verification of the burned media.
+
+ASA maintains its data in PostgreSQL and KinoSearch (for faster part-of-filename matching).
+Since the full-text index is single-writer, we need to somehow serialize requests for its update.
+
+The implementation is based on the archive host feature of BackupPC, using a _search_archive.pl
+configuration file located at /etc/BackupPC/pc/_search_archive.pl
+
+This provided us with serialization and hooks around it, but it lacked incremental tar creation,
+which is essential because we want to burn an ever-growing archive to CD/DVD media.
+
+Incremental tar creation is implemented using the new global configuration directive
+TarCreateIncremental.
+
+Using BackupPC hooks to integrate with the archive host also provides the following advantages:
+ - the web interface for the archive host contains our log messages
+ - all updates are invoked automatically at the end of each run (the system is always up to date)
+
+BackupPC can dump multiple machines in parallel, thus invoking our _search_archive host and an
+index update while an update from a different machine is still in progress. The archive host will
+reject such a request, but the next invocation for the same host will fix the problem automatically.
+
+To be sure that all pending archives are indexed, you can also run a cron job which invokes
+_search_archive on all pending increments:
+
+ /BackupPC_ASA_ArchiveStart _search_archive backuppc
+
+You can also force archival of particular pending backups from a single host by adding hostname(s),
+or hostname:num to archive an individual increment (see the example below).
+
+Alternatively, you can use the _search_archive web interface to invoke increment creation and indexing.
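+
+As a quick illustration of forcing archival for specific hosts (host1, host2 and the backup
+number 3 below are only placeholders; the /srv/BackupPC/bin path is the one from the config.pl
+example further down and may differ on your install), such a run might look like:
+
+ /srv/BackupPC/bin/BackupPC_ASA_ArchiveStart _search_archive backuppc host1 host2:3
+
+This should archive all pending increments of host1 and only increment number 3 of host2.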
+
+There are two global options which have to be set for all hosts:
+
+#
+# /etc/BackupPC/config.pl
+#
+
+# invoke archive of the dump - ASA extension; DumpPostCmd was too early
+$Conf{DumpPostFinishCmd} = '/srv/BackupPC/bin/BackupPC_ASA_ArchiveStart _search_archive backuppc $host';
+
+# dump only incremental changes in tars, not the whole content - ASA extension
+$Conf{TarCreateIncremental} = 1;

-This is implemented using archive host feature using _search_archive.pl configuration
-file in /etc/BackupPC/pc/_search_archive.pl

 You can manually trigger all pending backups using:

@@ -15,7 +57,6 @@ You can manually trigger all pending backups using:

 This will start archive host _search_archive which will run it's configuration:

-
 #
 # /etc/BackupPC/pc/_search_archive.pl
 #
@@ -30,9 +71,6 @@ $Conf{ArchiveDest} = '/data/BackupPC/_search_archive';
 $Conf{ArchiveComp} = 'gzip';
 $Conf{CompressLevel} = 9;

-# dump only incremental changes in tars not whole content - ASA extension
-# XXX this option must be global in /etc/BackupPC/config.pl
-$Conf{TarCreateIncremental} = 1;

 # archive media size (in bytes) 4.2Gb for DVD
 #$Conf{ArchiveMediaSize} = 4200 * 1024 * 1024; # DVD
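
For reference, here is a minimal sketch of the media size setting in _search_archive.pl. The DVD
value is the one from the commented-out sample line above; the CD-R value is only an illustrative
assumption and not part of the shipped sample configuration:

#
# /etc/BackupPC/pc/_search_archive.pl - media size sketch, enable one of the two lines
#
$Conf{ArchiveMediaSize} = 4200 * 1024 * 1024;   # ~4.2 GB DVD (value from the sample above)
#$Conf{ArchiveMediaSize} = 700 * 1024 * 1024;   # ~700 MB CD-R (illustrative assumption)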