X-Git-Url: http://git.rot13.org/?p=BackupPC.git;a=blobdiff_plain;f=README.ASA;h=a1f28a2d6a6da1d7b965d40fd885df16a2b665c0;hp=b49aa2adcf8ba6144b4e283a3cc3c0f8561afca4;hb=9fecbd8a0fc6460ca8837c6b38dc9c03ed21c05e;hpb=09666eaeca95859a28b563f36d0babcb3a4390ae

diff --git a/README.ASA b/README.ASA
index b49aa2a..a1f28a2 100644
--- a/README.ASA
+++ b/README.ASA
@@ -2,11 +2,53 @@ This document tries to describe ASA extensions for BackupPC 3.2.0
 Written by Dobrica Pavlinusic 2011-01-27
 
-Search and archive maintain data in PostgreSQL and full-text search. Since full-text search
-is single-writer, we need to serialize somehow requests for it's update.
+This is the second iteration of adding search over arbitrary filename substrings and archiving
+to CD/DVD media, with tracking of copies and creation of additional md5sums on the media for
+easy verification of burned media.
+
+ASA maintains its data in PostgreSQL and KinoSearch (for faster part-of-filename matching).
+Since full-text search is single-writer, we need some way to serialize requests for its update.
+
+The implementation is based on the BackupPC archive host feature, using the _search_archive.pl
+configuration file located at /etc/BackupPC/pc/_search_archive.pl
+
+This provides us with serialization and hooks around it, but lacked incremental tar creation,
+which is essential because we want to burn an ever-growing archive to CD/DVD media.
+
+This is implemented using the new global configuration directive TarCreateIncremental.
+
+Using BackupPC hooks and an archive host for integration also provides the following advantages:
+ - the web interface for the archive host contains our log messages
+ - all updates are invoked automatically at the end of each run (the system is always up to date)
+
+BackupPC can dump multiple machines in parallel, thus invoking our _search_archive host and an
+index update while an update from a different machine is still in progress. The archive host will
+reject the request, but the next invocation for the same host will fix the problem automatically.
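The reject-and-retry behaviour described above (single-writer index, concurrent archive requests rejected and fixed on the next run) can be sketched with a non-blocking lock file. This is a hypothetical illustration only, not BackupPC's actual implementation; the function names and lock path are invented:

```python
# Hypothetical sketch of single-writer serialization via flock(2);
# not taken from BackupPC. A second concurrent writer is rejected
# immediately (like the archive host rejecting a request) and is
# expected to retry on its next invocation.
import fcntl

def try_acquire(lock_path):
    """Return a locked file handle, or None if another writer holds the lock."""
    fh = open(lock_path, "w")
    try:
        fcntl.flock(fh, fcntl.LOCK_EX | fcntl.LOCK_NB)
        return fh
    except OSError:  # lock held by a concurrent index update
        fh.close()
        return None

def release(fh):
    """Drop the exclusive lock so the next pending update can run."""
    fcntl.flock(fh, fcntl.LOCK_UN)
    fh.close()
```

A rejected caller simply gives up; the next _search_archive invocation picks up the still-pending increments.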
+
+To be sure that all pending archives are indexed, you can also run a cron job which invokes
+_search_archive on all pending increments:
+
+	/BackupPC_ASA_ArchiveStart _search_archive backuppc
+
+You can also force archival of particular pending backups from a single host by adding hostname(s),
+or hostname:num to archive an individual increment.
+
+Alternatively, you can use the _search_archive web interface to invoke increment creation and indexing.
+
+
+There are two global options which have to be set for all hosts:
+
+#
+# /etc/BackupPC/config.pl
+#
+
+# invoke archive of dump - ASA extension (DumpPostCmd was too early)
+$Conf{DumpPostFinishCmd} = '/srv/BackupPC/bin/BackupPC_ASA_ArchiveStart _search_archive backuppc $host';
+
+# dump only incremental changes in tars, not whole content - ASA extension
+$Conf{TarCreateIncremental} = 1;
-This is implemented using archive host feature using _search_archive.pl configuration
-file in /etc/BackupPC/pc/_search_archive.pl
 
 You can manually trigger all pending backups using:
 
@@ -15,7 +57,6 @@ You can manually trigger all pending backups using:
 
 This will start archive host _search_archive which will run its configuration:
 
-
 #
 # /etc/BackupPC/pc/_search_archive.pl
 #
@@ -30,9 +71,6 @@ $Conf{ArchiveDest} = '/data/BackupPC/_search_archive';
 $Conf{ArchiveComp} = 'gzip';
 $Conf{CompressLevel} = 9;
 
-# dump only incremental changes in tars not whole content - ASA extension
-# XXX this option must be global in /etc/BackupPC/config.pl
-$Conf{TarCreateIncremental} = 1;
 
 # archive media size (in bytes) 4.2Gb for DVD
 #$Conf{ArchiveMediaSize} = 4200 * 1024 * 1024; # DVD
@@ -45,14 +83,15 @@ $Conf{ArchiveMediaSize} = 630 * 1024 * 1024; # CD
 
 # This is useful where the file size of the archive might exceed the
 # capacity of the removable media. For example specify 700 if you are using CDs.
 #$Conf{ArchiveSplit} = 650;
-$Conf{ArchiveSplit} = 42; # FIXME small testing chunks
+$Conf{ArchiveSplit} = 100; # FIXME small testing chunks
 
 # The amount of parity data to create for the archive using the par2 utility.
 # In some cases, corrupted archives can be recovered from parity data.
-$Conf{ArchivePar} = 0;
-$Conf{ParPath} = undef;
-
+$Conf{ArchivePar} = 30;
+$Conf{ParPath} = '/srv/par2cmdline-0.4-tbb-20100203-lin64/par2';
+# http://chuchusoft.com/par2_tbb/download.html
+# par2cmdline 0.4 with Intel Threading Building Blocks 2.2
 
 # use parallel gzip (speedup on multi-core machines)
 $Conf{GzipPath} = '/usr/bin/pigz';
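As a back-of-the-envelope check of the media settings above — assuming ArchiveSplit is a chunk size in megabytes and ArchivePar is a redundancy percentage, as in stock BackupPC — the number of split files and the approximate par2 overhead can be estimated like this (illustrative sketch only, not BackupPC code; function names are invented):

```python
# Illustrative arithmetic for the config directives above; names mirror
# $Conf{ArchiveSplit} and $Conf{ArchivePar} but this is not BackupPC code.
import math

def split_pieces(archive_mb, split_mb):
    """How many files an archive of archive_mb is split into."""
    return math.ceil(archive_mb / split_mb)

def parity_mb(archive_mb, par_percent):
    """Approximate size of par2 recovery data for the archive."""
    return archive_mb * par_percent / 100.0
```

For example, a full 630 MB CD-sized archive split into 100 MB chunks yields 7 pieces, and 30% parity adds roughly 189 MB of par2 data — worth keeping in mind when sizing media.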