Database Administrator's Guide

FairCom Database Forward Roll Guide

The forward dump utility, ctfdmp, recovers from a catastrophic failure by rolling forward from either a successful dynamic dump or a full backup made after a safe, clean, controlled shutdown of the system.

Note: If you rebuild or compact a data file that has transaction logging enabled, you cannot roll that file forward past the time of the rebuild/compact operation until a new backup has been completed, because compacting or rebuilding a file makes changes that are not recorded in the transaction logs. Without the !SKIP option, attempting to roll forward fails with “FWD: Roll-Forward Error Termination...12”. With !SKIP enabled, the forward roll proceeds but excludes the affected files, which are listed in CTSTATUS.FCS.

Preparing to Use the Forward Dump Utility

To prepare for using the forward dump utility, ctfdmp, follow these guidelines:

  1. Set the KEEP_LOGS configuration option to retain all log files. This setting causes log files no longer required for automatic recovery to be renamed instead of deleted: their extensions change from .FCS to .FCA, which changes the transaction log files from “active” to “inactive”. These “old” log files may be needed to roll forward.
  2. Make periodic, complete backups using a dynamic dump or offline backup. The following files must be included in a complete backup:
    • All data and index files.
    • The file FAIRCOM.FCS.
    • The S*.FCS files (automatically included in dynamic dump).

    Note: Once all necessary files have been backed up, the transaction log files (L*.FCS) may be deleted with one exception: DO NOT DELETE the most recent active transaction log file, which is the file of the form L<seven-digit number>.FCS with the highest-valued seven-digit number.

  3. Following a safe, complete backup, save all transaction log files created until the next complete backup. Active transaction log files have names of the form L<log number>.FCS, with the number incremented by 1 for each new active transaction log. As specified by the KEEP_LOGS configuration option, when the FairCom Server creates a new active log, it renames the active log being replaced from L<log number>.FCS to L<log number>.FCA, saving it as an inactive transaction log file.

    Normally, when archiving all the logs, you would reestablish your forward roll starting point on a regular basis by means of a new dynamic dump. This could be done on a weekly basis, keeping the previous week's dump and accumulation of logs as a further backup (a "grandfather" approach). It is easy to automate the archiving, since the server automatically renames inactive logs to *.FCA and, once renamed, never accesses them again, so they can be archived without causing problems for the server.
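
The log-housekeeping rules above can be sketched in Python. This is a minimal illustration, not a FairCom tool: the function names (classify_logs, deletable_after_backup) are hypothetical, and only the L<seven-digit number>.FCS (active) / .FCA (inactive) naming convention comes from the guidelines above.

```python
import re
from pathlib import Path

# Active logs end in .FCS, inactive (KEEP_LOGS-renamed) logs in .FCA.
LOG_RE = re.compile(r"^L(\d{7})\.(FCS|FCA)$")

def classify_logs(server_dir):
    """Return (active, inactive) transaction log paths, each sorted by log number."""
    active, inactive = [], []
    for p in Path(server_dir).iterdir():
        m = LOG_RE.match(p.name)
        if not m:
            continue
        (active if m.group(2) == "FCS" else inactive).append(p)
    key = lambda p: int(LOG_RE.match(p.name).group(1))
    return sorted(active, key=key), sorted(inactive, key=key)

def deletable_after_backup(server_dir):
    """Active logs that may be deleted once a complete backup exists:
    every L*.FCS EXCEPT the one with the highest seven-digit number,
    which must be kept (see the note in step 2)."""
    active, _ = classify_logs(server_dir)
    return active[:-1]  # always keep the most recent active log
```

A weekly archive job could move everything in the inactive list to dated storage before taking the new dynamic dump; do a dry run against your own server directory before deleting anything.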

Running the Forward Dump Utility for System Recovery

If the system has a catastrophic failure and preparations have been made as recommended above, the data can be recovered as follows:

  1. Restore the contents of the most recent backup, which can be a dynamic dump or a standard backup, provided it includes the files listed in step 2 above.

    Note: If the restore is from a dynamic dump, be sure to include the !FORWARD_ROLL keyword in the dump recovery script. This keyword causes creation of a transaction start file for the recovered logs. The transaction start file will be named S*.FCA. After the restore is complete, rename S*.FCA to S*.FCS.

  2. Load all transaction log files saved between the time of that backup and the time of the catastrophic failure, and rename all inactive transaction log files in this group (i.e., log files with the extension .FCA) to the active transaction log extension (i.e., .FCS).

    The following files should be present: the S*.FCS file created by ctrdmp, the data and index files restored from the dynamic dump, and all L*.FCS and L*.FCA (renamed to L*.FCS) files that have been archived in the default directory.

  3. Start the forward dump utility, ctfdmp, as you would any other program in your environment. See below for command-line arguments and other considerations. The ctfdmp utility can be used only when the FairCom Server is stopped, unless you follow the guidelines listed later in this section.

    The forward dump will proceed without any further instructions.

    Note: Only transaction-processed files will be updated beyond the state they held when the backup was made.
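
The renaming in steps 1 and 2 (S*.FCA and L*.FCA back to .FCS) can be sketched as a small Python helper. The name reactivate_logs is hypothetical; verify it against your own restore layout before running it.

```python
from pathlib import Path

def reactivate_logs(restore_dir):
    """Rename restored inactive files (S*.FCA, L*.FCA) to the active .FCS
    extension so ctfdmp will use them. Returns the renamed paths."""
    renamed = []
    for p in sorted(Path(restore_dir).iterdir()):
        if p.suffix == ".FCA" and p.name[:1] in ("L", "S"):
            target = p.with_suffix(".FCS")
            if target.exists():
                # Never silently overwrite an existing active log.
                raise FileExistsError(f"{target} already present; resolve before renaming")
            p.rename(target)
            renamed.append(target)
    return renamed
```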

Command-Line Arguments

ctfdmp accepts the command-line arguments shown below. The first two arguments are needed only if the application uses more than the default number of file control blocks (!#FCB) or a PAGE_SIZE larger than the default; if either of these two arguments is used, both must be specified, as illustrated below. !SKIP is optional; when it is specified, a file that is not accessible during the forward roll does not cause an error termination. Exercise extreme care with !SKIP, since the forward roll has no way of ensuring the integrity of data in files that are skipped.

CTFDMP [!#FCB <number of files>]
       [!PAGE_SIZE <bytes per buffer page>]
       [!SKIP]

Using ctfdmp while FairCom Server is Running

It is recommended that the ctfdmp utility be used only when the FairCom Server is stopped. The utility can be used while the FairCom Server is running only if:

  • the utility is run in a directory other than the directory in which the server stores its transaction logs

    and

  • the files being rolled forward are not in use by the server.

Note: If the dynamic dump data was encrypted with Advanced Encryption (for example, AES), then the ctsrvr.pvf password file must be present with the master key information to decrypt and play back this data.

Directories

This utility does not read the script or the ctsrvr.cfg file.

If LOCAL_DIRECTORY is used in the ctsrvr.cfg file, the supplied path is not part of the file names stored in the transaction logs, so ctfdmp expects to find the data and index files in the process working directory. For example, references to the file “foo.dat” in the transaction logs will not contain the relative path “data”, so all the data, index, and *.FCS files must be in the same directory.
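
As a sketch of the flat-directory requirement just described, the following hypothetical Python helper copies restored files into a single working directory, keeping only their bare file names. stage_for_ctfdmp is illustrative, not a FairCom utility, and assumes no two source files share a name.

```python
import shutil
from pathlib import Path

def stage_for_ctfdmp(sources, work_dir):
    """Copy data, index, and *.FCS files into one flat working directory,
    since the transaction logs record bare file names without any
    LOCAL_DIRECTORY path. `sources` is an iterable of file paths."""
    work = Path(work_dir)
    work.mkdir(parents=True, exist_ok=True)
    for src in map(Path, sources):
        dest = work / src.name  # flatten: keep only the file name
        if dest.exists():
            raise FileExistsError(f"name collision for {src.name}")
        shutil.copy2(src, dest)
    return work
```

Run ctfdmp with this staging directory as the process working directory so every file referenced in the logs is found by its bare name.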
