Of all the configuration files needed to run Bacula, the Director's is the most complicated, and the one that you will need to modify the most often as you add clients or modify the FileSets.
For a general discussion of configuration files and resources, including the data types recognized by Bacula, please see the Configuration chapter of this manual.
The Director resource types may be any of the following:
Job, JobDefs, Client, Storage, Catalog, Schedule, FileSet, Pool, Director, or Messages. We present them here in the most logical order for defining them:
The Director resource defines the attributes of the Directors running on the network. In the current implementation, there is only a single Director resource, but the final design will contain multiple Directors to maintain index and media database redundancy.
If you have specified a Director user and/or a Director group on your ./configure line with --with-dir-user and/or --with-dir-group, the Working Directory owner and group will be set to those values.
The PID directory specified must already exist and be readable and writable by the Bacula daemon referencing it.
Typically on Linux systems, you will set this to: /var/run. If you are not installing Bacula in the system directories, you can use the Working Directory as defined above. This directive is required.
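For a system installation this typically amounts to a single line in the Director resource, for example (adjust the path to your platform):

Pid Directory = "/var/run"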
Please note that the Volume format becomes much more complicated with multiple simultaneous jobs, consequently, restores can take much longer if Bacula must sort through interleaved volume blocks from multiple simultaneous jobs. This can be avoided by having each simultaneously running job write to a different volume or by using data spooling, which will first spool the data to disk simultaneously, then write each spool file to the volume in sequence.
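As a sketch of the spooling approach, the Spool Data directive can be enabled in each Job resource so that concurrently running jobs first spool to disk and then despool to the Volume one at a time (the job name here is hypothetical):

Job {
  Name = "client1-nightly"
  Spool Data = yes
  ...
}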
There may also still be some cases where directives such as Maximum Volume Jobs are not properly synchronized with multiple simultaneous jobs (subtle timing issues can arise), so careful testing is recommended.
At the current time, there is no configuration parameter to change this limit; a maximum of five simultaneous console connections is permitted.
DirAddresses = {
  ip = { addr = 1.2.3.4; port = 1205; }
  ipv4 = { addr = 1.2.3.4; port = http; }
  ipv6 = { addr = 1.2.3.4; port = 1205; }
  ip = {
    addr = 1.2.3.4
    port = 1205
  }
  ip = { addr = 1.2.3.4 }
  ip = { addr = 201:220:222::2 }
  ip = { addr = bluedot.thun.net }
}
where ip, ip4, ip6, addr, and port are all keywords. Note that the address can be specified as a dotted quad, in IPv6 colon notation, or as a symbolic name (only in the ip specification). The port can be specified as a number or as the mnemonic value from the /etc/services file. If a port is not specified, the default will be used. If an ip section is specified, the resolution can be made either by IPv4 or IPv6. If ip4 is specified, then only IPv4 resolutions will be permitted, and likewise with ip6.
Please note that if you use the DirAddresses directive, you must not use either a DirPort or a DirAddress directive in the same resource.
The following is an example of a valid Director resource definition:
Director {
  Name = HeadMan
  WorkingDirectory = "$HOME/bacula/bin/working"
  Password = UA_password
  PidDirectory = "$HOME/bacula/bin/working"
  QueryFile = "$HOME/bacula/bin/query.sql"
  Messages = Standard
}
The Job resource defines a Job (Backup, Restore, ...) that Bacula must perform. Each Job resource definition contains the name of a Client and a FileSet to backup, the Schedule for the Job, where the data are to be stored, and what media Pool can be used. In effect, each Job resource must specify What, Where, How, and When or FileSet, Storage, Backup/Restore/Level, and Schedule respectively. Note, the FileSet must be specified for a restore job for historical reasons, but it is no longer used.
Only a single type (Backup, Restore, ...) can be specified for any job. If you want to backup multiple FileSets on the same Client or multiple Clients, you must define a Job for each one.
Note, you define only a single Job to do the Full, Differential, and Incremental backups since the different backup levels are tied together by a unique Job name. Normally, you will have only one Job per Client, but if a client has a really huge number of files (more than several million), you might want to split it into two Jobs, each with a different FileSet covering only part of the total files.
When the job actually runs, the unique Job Name will consist of the name you specify here followed by the date and time the job was scheduled for execution. This directive is required.
For a Backup Job, the Level may be one of the following:
If any of the above conditions is not met, the Director will upgrade the Incremental to a Full save. Otherwise, the Incremental backup will be performed as requested.
The File daemon (Client) decides which files to backup for an Incremental backup by comparing the start time of the prior Job (Full, Differential, or Incremental) against the time each file was last "modified" (st_mtime) and the time its attributes were last "changed" (st_ctime). If the file was modified or its attributes changed on or after this start time, it will then be backed up.
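To see the two timestamps Bacula compares for a given file, something like the following can be run on a Unix client (GNU stat shown; the flags differ on other platforms, and the file is arbitrary):

stat -c 'mtime=%y  ctime=%z  %n' /etc/passwd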
Some virus scanning software may change st_ctime while doing the scan. For example, if the virus scanning program attempts to reset the access time (st_atime), which Bacula does not use, it will cause st_ctime to change and hence Bacula will back up the file during an Incremental or Differential backup. In the case of Sophos virus scanning, you can prevent it from resetting the access time (st_atime), and hence changing st_ctime, by using the --no-reset-atime option. For other software, please see their manual.
When Bacula does an Incremental backup, all modified files that are still on the system are backed up. However, any file that has been deleted since the last Full backup remains in the Bacula catalog, which means that if between a Full save and the time you do a restore, some files are deleted, those deleted files will also be restored. The deleted files will no longer appear in the catalog after doing another Full save. However, to remove deleted files from the catalog during an Incremental backup is quite a time consuming process and not currently implemented in Bacula.
In addition, if you move a directory rather than copy it, the files in it do not have their modification time (st_mtime) or their attribute change time (st_ctime) changed. As a consequence, those files will probably not be backed up by an Incremental or Differential backup which depend solely on these time stamps. If you move a directory, and wish it to be properly backed up, it is generally preferable to copy it, then delete the original.
If any of the above conditions is not met, the Director will upgrade the Differential to a Full save. Otherwise, the Differential backup will be performed as requested.
The File daemon (Client) decides which files to backup for a differential backup by comparing the start time of the prior Full backup Job against the time each file was last "modified" (st_mtime) and the time its attributes were last "changed" (st_ctime). If the file was modified or its attributes were changed on or after this start time, it will then be backed up. The start time used is displayed after the Since on the Job report. In rare cases, using the start time of the prior backup may cause some files to be backed up twice, but it ensures that no change is missed. As with the Incremental option, you should ensure that the clocks on your server and client are synchronized or as close as possible to avoid the possibility of a file being skipped. Note, on versions 1.33 or greater Bacula automatically makes the necessary adjustments to the time between the server and the client so that the times Bacula uses are synchronized.
When Bacula does a Differential backup, all modified files that are still on the system are backed up. However, any file that has been deleted since the last Full backup remains in the Bacula catalog, which means that if between a Full save and the time you do a restore, some files are deleted, those deleted files will also be restored. The deleted files will no longer appear in the catalog after doing another Full save. However, to remove deleted files from the catalog during a Differential backup is quite a time consuming process and not currently implemented in Bacula. It is, however, a planned future feature.
As noted above, if you move a directory rather than copy it, the files in it do not have their modification time (st_mtime) or their attribute change time (st_ctime) changed. As a consequence, those files will probably not be backed up by an Incremental or Differential backup which depend solely on these time stamps. If you move a directory, and wish it to be properly backed up, it is generally preferable to copy it, then delete the original. Alternatively, you can move the directory, then use the touch program to update the timestamps.
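One sketch of the touch approach, assuming a Unix client with find available (the path is illustrative), is:

# Illustrative only: refresh the timestamps of everything under the moved
# directory so the next Incremental/Differential backup picks the files up.
find /path/to/moved/directory -exec touch {} +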
Every once in a while, someone asks why we need Differential backups as long as Incremental backups pick up all changed files. There are possibly many answers to this question, but the one that is the most important for me is that a Differential backup effectively merges all the Incremental and Differential backups since the last Full backup into a single Differential backup. This has two effects: 1. It gives some redundancy since the old backups could be used if the merged backup cannot be read. 2. More importantly, it reduces the number of Volumes that are needed to do a restore, effectively eliminating the need to read all the volumes on which the preceding Incremental and Differential backups since the last Full were done.
For a Restore Job, no level needs to be specified.
For a Verify Job, the Level may be one of the following:
Please note! If you run two Verify Catalog jobs on the same client at the same time, the results will certainly be incorrect. This is because Verify Catalog modifies the Catalog database while running in order to track new files.
Please note! If you run two Verify VolumeToCatalog jobs on the same client at the same time, the results will certainly be incorrect. This is because the Verify VolumeToCatalog modifies the Catalog database while running.
This command can be very useful if you have disk problems because it will compare the current state of your disk against the last successful backup, which may be several jobs old.
Note, the current implementation (1.32c) does not identify files that have been deleted.
If you use the Restore command in the Console program to start a restore job, the bootstrap file will be created automatically from the files you select to be restored.
For additional details of the bootstrap file, please see the Restoring Files with the Bootstrap File chapter of this manual.
Using this feature permits you to constantly have a bootstrap file that can recover the current state of your system. Normally, the file specified should be on a mounted drive on another machine, so that if your hard disk is lost, you will immediately have a bootstrap record available. Alternatively, you should copy the bootstrap file to another machine after it is updated. Note, it is a good idea to write a separate bootstrap file for each Job backed up, including the job that backs up your catalog database.
If the bootstrap-file-specification begins with a vertical bar (|), Bacula will use the specification as the name of a program to which it will pipe the bootstrap record. It could for example be a shell script that emails you the bootstrap record.
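For example, a sketch that mails the bootstrap record after each backup (the mail command, option, and address are illustrative and should be adapted to your system):

Write Bootstrap = "|/usr/bin/mail -s \"Bacula bootstrap\" root@localhost"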
On versions 1.39.22 or greater, before opening the file or executing the specified command, Bacula performs character substitution as in the RunScript directive. To automatically manage your bootstrap files, you can use this in your JobDefs resources:
JobDefs {
  Write Bootstrap = "%c_%n.bsr"
  ...
}
For more details on using this file, please see the chapter entitled The Bootstrap File of this manual.
If the directive is set to no, the Storage daemon will prefer finding an unused drive, otherwise, each job started will append to the same Volume (assuming the Pool is the same for all jobs). Setting Prefer Mounted Volumes to no can be useful, particularly for sites with multiple drive autochangers, that prefer to maximize backup throughput at the expense of using additional drives and Volumes. As an optimization, when using multiple drives, you will probably want to start your jobs one after another at approximately five-second intervals. This will help ensure that each night the same drive (Volume) is selected for the same job; otherwise, when you do a restore, you may find the files spread over many more Volumes than necessary.
This directive is implemented in version 1.39.22 and later. The RunScript directive behaves like a resource in that it requires opening and closing braces around a number of directives that make up the body of the runscript.
The specified Command (see below for details) is run as an external program before or after the current Job. This is optional.
The following options may be specified in the body of the runscript:
Options | Value | Default | Information
------- | ----- | ------- | -----------
Runs On Success | Yes/No | Yes | Run command if JobStatus is successful
Runs On Failure | Yes/No | No | Run command if JobStatus isn't successful
Runs On Client | Yes/No | Yes | Run command on client
Runs When | Before, After or Always | Never | When to run the command
Abort Job On Error | Yes/No | Yes | Abort the job if the script returns a non-zero status
Command | | | Path to your script
Any output sent by the command to standard output will be included in the Bacula job report. The command string must be a valid program name or name of a shell script.
In addition, the command string is parsed and then fed to the OS, which means that the path will be searched to execute your specified command, but there is no shell interpretation. As a consequence, if you invoke complicated commands or want any shell features such as redirection or piping, you must call a shell script and do it inside that script.
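For instance, a minimal sketch of such a wrapper script (the path and the database dump command are illustrative):

#!/bin/sh
# dumpdb.sh -- redirection and other shell features must live in a script
# like this one, because Bacula runs the Command without a shell.
pg_dumpall -U postgres > /var/tmp/pgdump.sql 2> /var/tmp/pgdump.err

which would then be referenced as Command = "/usr/local/bin/dumpdb.sh" in the RunScript body.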
Before submitting the specified command to the operating system, Bacula performs character substitution of the following characters:
%% = %
%c = Client's name
%d = Director's name
%e = Job Exit Status
%i = JobId
%j = Unique Job id
%l = Job Level
%n = Job name
%s = Since time
%t = Job type (Backup, ...)
%v = Volume name
The Job Exit Status code %e can take on the following values: OK, Error, Fatal Error, Canceled, Differences, Unknown term code. Since some of these values contain spaces, if you use %e on a command line you will need to enclose it within some sort of quotes.
You can use the following shortcuts:
Keyword | RunsOnSuccess | RunsOnFailure | AbortJobOnError | Runs On Client | RunsWhen
------- | ------------- | ------------- | --------------- | -------------- | --------
Run Before Job | | | Yes | No | Before
Run After Job | Yes | No | | No | After
Run After Failed Job | No | Yes | | No | After
Client Run Before Job | | | Yes | Yes | Before
Client Run After Job | Yes | No | | Yes | After
Examples:
RunScript {
  RunsWhen = Before
  AbortJobOnError = No
  Command = "/etc/init.d/apache stop"
}

RunScript {
  RunsWhen = After
  RunsOnFailure = yes
  Command = "/etc/init.d/apache start"
}
Special Windows Considerations
In addition, for a Windows client on version 1.33 and above, please take note that you must ensure a correct path to your script. The script or program can be a .com, .exe or a .bat file. If you just put the program name in then Bacula will search using the same rules that cmd.exe uses (current directory, Bacula bin directory, and PATH). It will even try the different extensions in the same order as cmd.exe. The command can be anything that cmd.exe or command.com will recognize as an executable file.
However, if you have slashes in the program name then Bacula figures you are fully specifying the name, so you must also explicitly add the three character extension.
The command is run in a Win32 environment, so Unix like commands will not work unless you have installed and properly configured Cygwin in addition to and separately from Bacula.
The System %Path% will be searched for the command. (Under the environment variable dialog you have both System Environment and User Environment; we believe that only the System environment will be available to bacula-fd if it is running as a service.)
System environment variables can be referenced with %var% and used as either part of the command name or arguments.
So if you have a script in the Bacula bin directory then the following lines should work fine:
Client Run Before Job = systemstate
or
Client Run Before Job = systemstate.bat
or
Client Run Before Job = "systemstate"
or
Client Run Before Job = "systemstate.bat"
or
ClientRunBeforeJob = "\"C:/Program Files/Bacula/systemstate.bat\""
The outer set of quotes is removed when the configuration file is parsed. You need to escape the inner quotes so that they are there when the code that parses the command line for execution runs so it can tell what the program name is.
ClientRunBeforeJob = "\"C:/Program Files/Software Vendor/Executable\" /arg1 /arg2 \"foo bar\""
The special characters &<>()@^| will need to be quoted if they are part of a filename or argument.
If someone is logged in, a blank "command" window running the commands will be present during the execution of the command.
Some Suggestions from Phil Stracchino for running on Win32 machines with the native Win32 File daemon:
ClientRunBeforeJob = "c:/bacula/bin/systemstate.bat"
rather than DOS/Windows form:
ClientRunBeforeJob = "c:\bacula\bin\systemstate.bat"   INCORRECT
For Win32, please note that there are certain limitations:
ClientRunBeforeJob = "C:/Program Files/Bacula/bin/pre-exec.bat"
Lines like the above do not work because of limitations in cmd.exe, which is used to execute the command. Bacula prefixes the string you supply with cmd.exe /c. To test that your command works, you should type cmd /c "C:/Program Files/test.exe" at a cmd prompt and see what happens. Once the command is correct, insert a backslash (\) before each double quote ("), and then put quotes around the whole thing when putting it in the Director's .conf file. You either need to have only one set of quotes or else use the short name and don't put quotes around the command path.
Below is the output from cmd's help as it relates to the command line passed to the /c option.
If /C or /K is specified, then the remainder of the command line after the switch is processed as a command line, where the following logic is used to process quote (") characters:

1. If all of the following conditions are met, then quote characters on the command line are preserved:
   - no /S switch
   - exactly two quote characters
   - no special characters between the two quote characters, where special is one of: &<>()@^|
   - there are one or more whitespace characters between the two quote characters
   - the string between the two quote characters is the name of an executable file.

2. Otherwise, old behavior is to see if the first character is a quote character and if so, strip the leading character and remove the last quote character on the command line, preserving any text after the last quote character.
The following example of the use of the Client Run Before Job directive was submitted by a user. You could write a shell script to back up a DB2 database to a FIFO. The shell script is:
#!/bin/sh
# ===== backupdb.sh
DIR=/u01/mercuryd

mkfifo $DIR/dbpipe
db2 BACKUP DATABASE mercuryd TO $DIR/dbpipe WITHOUT PROMPTING &
sleep 1
Then add the following line to the Job resource in the bacula-dir.conf file:
Client Run Before Job = "su - mercuryd -c \"/u01/mercuryd/backupdb.sh '%t' '%l'\""
When the job is run, you will get messages from the output of the script stating that the backup has started. Even though the command being run is backgrounded with &, the job will block until the "db2 BACKUP DATABASE" command finishes, and thus the backup stalls.
To remedy this situation, the "db2 BACKUP DATABASE" line should be changed to the following:
db2 BACKUP DATABASE mercuryd TO $DIR/dbpipe WITHOUT PROMPTING > $DIR/backup.log 2>&1 < /dev/null &
It is important to redirect the input and outputs of a backgrounded command to /dev/null to prevent the script from blocking.
Run Before Job = "echo test"it's equivalent to :
RunScript { Command = "echo test" RunsOnClient = No RunsWhen = Before }
Lutz Kittler has pointed out that using the RunBeforeJob directive can be a simple way to modify your schedules during a holiday. For example, suppose that you normally do Full backups on Fridays, but Thursday and Friday are holidays. To avoid having to change tapes between Thursday and Friday when no one is in the office, you can create a RunBeforeJob that returns a non-zero status on Thursday and zero on all other days. That way, the Thursday job will not run, and on Friday the tape you inserted on Wednesday before leaving will be used.
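A minimal sketch of such a script for a Unix host (the path and the day-of-week check are illustrative):

#!/bin/sh
# skip-thursday.sh -- exit non-zero on Thursday so the scheduled job is
# not run, and zero on every other day.
if [ "$(date +%u)" -eq 4 ]; then    # %u: ISO day of week, 4 = Thursday
    exit 1
fi
exit 0

It would then be referenced with something like Run Before Job = "/usr/local/bin/skip-thursday.sh".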
An example of the use of this directive is given in the Tips Chapter of this manual.
See the Run After Failed Job directive if you want to run a script after the job has terminated with any non-normal status.
RunScript { Command = "echo test" RunsWhen = After RunsOnFailure = yes RunsOnClient = no RunsOnSuccess = yes # default, you can drop this line }
An example of the use of this directive is given in the Tips Chapter of this manual.
Note, please see the notes above in RunScript concerning Windows clients.
There are several points that must be taken into account when using this directive: first, a failed job is defined as one that has not terminated normally, which includes any running job of the same name (you need to ensure that two jobs of the same name do not run simultaneously); secondly, the Ignore FileSet Changes directive is not considered when checking for failed levels, which means that any FileSet change will trigger a rerun.
This specification can be useful for portables, laptops, or other machines that are not always connected to the network or switched on.
The part after the equal sign must be enclosed in double quotes, and can contain any string or set of options (overrides) that you can specify when entering the Run command from the console. For example storage=DDS-4 .... In addition, there are two special keywords that permit you to clone the current job. They are level=%l and since=%s. The %l in the level keyword permits entering the actual level of the current job and the %s in the since keyword permits putting the same time for comparison as used on the current job. Note, in the case of the since keyword, the %s must be enclosed in double quotes, and thus they must be preceded by a backslash since they are already inside quotes. For example:
run = "Nightly-backup level=%l since=\"%s\" storage=DDS-4"
A cloned job will not start additional clones, so it is not possible to recurse.
The priority only affects waiting jobs that are queued to run, not jobs that are already running. If one or more jobs of priority 2 are already running, and a new job is scheduled with priority 1, the currently running priority 2 jobs must complete before the priority 1 job is run.
The default priority is 10.
If you want to run concurrent jobs you should keep these points in mind:
If you have several jobs of different priority, it may not be best to start them at exactly the same time, because Bacula must examine them one at a time. If Bacula starts a lower priority job first, then it will run before your high priority jobs. If you experience this problem, you may avoid it by starting any higher priority jobs a few seconds before lower priority ones. This ensures that Bacula will examine the jobs in the correct order, and that your priority scheme will be respected.
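As a sketch, the Priority directive is set per Job; the lower the number, the sooner a queued job is dispatched (the job name here is hypothetical):

Job {
  Name = "CatalogBackup"
  Priority = 5    # runs before queued jobs left at the default Priority = 10
  ...
}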
It should be set to yes when writing to devices that require mount (for example DVD), so you are sure that the current part, containing this job's data, is written to the device, and that no data is left in the temporary file on the hard disk. However, on some media, like DVD+R and DVD-R, a lot of space (about 10Mb) is lost every time a part is written. So, if you run several jobs each after another, you could set this directive to no for all jobs, except the last one, to avoid wasting too much space, but to ensure that the data is written to the medium when all jobs are finished.
This directive is ignored with tape and FIFO devices.
The following is an example of a valid Job resource definition:
Job { Name = "Minou" Type = Backup Level = Incremental # default Client = Minou FileSet="Minou Full Set" Storage = DLTDrive Pool = Default Schedule = "MinouWeeklyCycle" Messages = Standard }
The JobDefs resource permits all the same directives that can appear in a Job resource. However, a JobDefs resource does not create a Job, rather it can be referenced within a Job to provide defaults for that Job. This permits you to concisely define several nearly identical Jobs, each one referencing a JobDefs resource which contains the defaults. Only the changes from the defaults need to be mentioned in each Job.
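For example, a sketch of a JobDefs resource supplying defaults and a Job that references it (the names are hypothetical, in the style of the examples above):

JobDefs {
  Name = "DefaultJob"
  Type = Backup
  Level = Incremental
  FileSet = "Full Set"
  Schedule = "WeeklyCycle"
  Storage = DLTDrive
  Pool = Default
  Messages = Standard
}

Job {
  Name = "Rufus"
  Client = Rufus
  JobDefs = "DefaultJob"
}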
The Schedule resource provides a means of automatically scheduling a Job as well as the ability to override the default Level, Pool, Storage and Messages resources. If a Schedule resource is not referenced in a Job, the Job can only be run manually. In general, you specify an action to be taken and when.
The Job-overrides permit overriding the Level, the Storage, the Messages, and the Pool specifications provided in the Job resource. In addition, the FullPool, the IncrementalPool, and the DifferentialPool specifications permit overriding the Pool specification according to what backup Job Level is in effect.
By the use of overrides, you may customize a particular Job. For example, you may specify a Messages override for your Incremental backups that outputs messages to a log file, but for your weekly or monthly Full backups, you may send the output by email by using a different Messages override.
Job-overrides are specified as: keyword=value where the keyword is Level, Storage, Messages, Pool, FullPool, DifferentialPool, or IncrementalPool, and the value is as defined on the respective directive formats for the Job resource. You may specify multiple Job-overrides on one Run directive by separating them with one or more spaces or by separating them with a trailing comma. For example:
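Run = Level=Full FullPool=Monthly Storage=DLTDrive Messages=Verbose 1st sun at 2:05

(The Monthly, DLTDrive, and Verbose names above are illustrative; they must refer to Pool, Storage, and Messages resources defined elsewhere in your configuration.)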
Date-time-specification determines when the Job is to be run. The specification is a repetition, and as a default Bacula is set to run a job at the beginning of the hour of every hour of every day of every week of every month of every year. This is not normally what you want, so you must specify or limit when you want the job to run. Any specification given is assumed to be repetitive in nature and will serve to override or limit the default repetition. This is done by specifying masks or times for the hour, day of the month, day of the week, week of the month, week of the year, and month when you want the job to run. By specifying one or more of the above, you can define a schedule to repeat at almost any frequency you want.
Basically, you must supply a month, day, hour, and minute the Job is to be run. Of these four items to be specified, day is special in that you may either specify a day of the month such as 1, 2, ... 31, or you may specify a day of the week such as Monday, Tuesday, ... Sunday. Finally, you may also specify a week qualifier to restrict the schedule to the first, second, third, fourth, or fifth week of the month.
For example, if you specify only a day of the week, such as Tuesday, the Job will be run every hour of every Tuesday of every Month. That is, the month and hour remain set to the defaults of every month and all hours.
Note, by default with no other specification, your job will run at the beginning of every hour. If you wish your job to run more than once in any given hour, you will need to specify multiple run specifications each with a different minute.
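For example, a sketch of a Schedule that runs twice each hour (the name and level are illustrative):

Schedule {
  Name = "TwicePerHour"
  Run = Level=Incremental hourly at 0:05
  Run = Level=Incremental hourly at 0:35
}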
The date/time to run the Job can be specified in the following way in pseudo-BNF:
<void-keyword>         = on
<at-keyword>           = at
<week-keyword>         = 1st | 2nd | 3rd | 4th | 5th | first | second | third | fourth | fifth
<wday-keyword>         = sun | mon | tue | wed | thu | fri | sat | sunday | monday | tuesday | wednesday | thursday | friday | saturday
<week-of-year-keyword> = w00 | w01 | ... w52 | w53
<month-keyword>        = jan | feb | mar | apr | may | jun | jul | aug | sep | oct | nov | dec | january | february | ... | december
<daily-keyword>        = daily
<weekly-keyword>       = weekly
<monthly-keyword>      = monthly
<hourly-keyword>       = hourly
<digit>                = 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 0
<number>               = <digit> | <digit><number>
<12hour>               = 0 | 1 | 2 | ... 12
<hour>                 = 0 | 1 | 2 | ... 23
<minute>               = 0 | 1 | 2 | ... 59
<day>                  = 1 | 2 | ... 31
<time>                 = <hour>:<minute> | <12hour>:<minute>am | <12hour>:<minute>pm
<time-spec>            = <at-keyword> <time> | <hourly-keyword>
<date-keyword>         = <void-keyword> <weekly-keyword>
<day-range>            = <day>-<day>
<month-range>          = <month-keyword>-<month-keyword>
<wday-range>           = <wday-keyword>-<wday-keyword>
<range>                = <day-range> | <month-range> | <wday-range>
<date>                 = <date-keyword> | <day> | <range>
<date-spec>            = <date> | <date-spec>
<day-spec>             = <day> | <wday-keyword> | <day-range> | <wday-range> | <daily-keyword>
<day-spec>             = <day> | <wday-keyword> | <day> | <wday-range> | <week-keyword> <wday-keyword> | <week-keyword> <wday-range>
<month-spec>           = <month-keyword> | <month-range> | <monthly-keyword>
<date-time-spec>       = <month-spec> <day-spec> <time-spec>
Note, the Week of Year specification wnn follows the ISO standard definition of the week of the year, where Week 1 is the week in which the first Thursday of the year occurs, or alternatively, the week which contains the 4th of January. Weeks are numbered w01 to w53. w00 for Bacula is the week that precedes the first ISO week (i.e. has the first few days of the year if any occur before Thursday). w00 is not defined by the ISO specification. A week starts with Monday and ends with Sunday.
An example schedule resource that is named WeeklyCycle and runs a job with level full each Sunday at 1:05am and an incremental job Monday through Saturday at 1:05am is:
Schedule { Name = "WeeklyCycle" Run = Level=Full sun at 1:05 Run = Level=Incremental mon-sat at 1:05 }
An example of a possible monthly cycle is as follows:
Schedule { Name = "MonthlyCycle" Run = Level=Full Pool=Monthly 1st sun at 1:05 Run = Level=Differential 2nd-5th sun at 1:05 Run = Level=Incremental Pool=Daily mon-sat at 1:05 }
The first of every month:
Schedule { Name = "First" Run = Level=Full on 1 at 1:05 Run = Level=Incremental on 2-31 at 1:05 }
Every 10 minutes:
Schedule { Name = "TenMinutes" Run = Level=Full hourly at 0:05 Run = Level=Full hourly at 0:15 Run = Level=Full hourly at 0:25 Run = Level=Full hourly at 0:35 Run = Level=Full hourly at 0:45 Run = Level=Full hourly at 0:55 }
Internally, Bacula keeps a schedule as a bit mask. Each schedule has six masks and a minute field. The masks are hour, day of the month (mday), month, day of the week (wday), week of the month (wom), and week of the year (woy). The schedule is initialized to have the bits of each of these masks set, which means that at the beginning of every hour, the job will run. When you specify a month for the first time, the mask will be cleared and the bit corresponding to your selected month will be set. If you specify a second month, the bit corresponding to it will also be added to the mask. Thus when Bacula checks the masks to see if the bits are set corresponding to the current time, your job will run only in the two months you have set. Likewise, if you set a time (hour), the hour mask will be cleared, and the hour you specify will be set in the bit mask and the minutes will be stored in the minute field.
For any schedule you have defined, you can see how these bits are set by doing a show schedules command in the Console program. Please note that the bit mask is zero based, and Sunday is the first day of the week (bit zero).