Integration – ITSM – Schedule data integration jobs

Schedule the data integration jobs

The PuzzleLogic data integration jobs are executed through the run_job.sh script. PuzzleLogic uses a scheduling program to schedule and automatically execute the data integration jobs at regular intervals. The cron scheduler is recommended for Linux operating systems.

Use the following syntax to execute a data integration job. See the table below for detailed information on the parameters used with the run_job.sh script.

<Pentaho Install Directory>/Scripts/run_job.sh -s|-source <data_source> -t|-type <job_type> [-c|-config <config_file_name>]
-l|-log <log_file_name> [-v|-level <log_level>] [-d|-startdate <timestamp>] [-e|-enddate <timestamp>]

For example, the following command executes the Change job for data retrieval from a BMC Remedy ITSM data source using the default configuration properties file config_ars.properties. Output is logged to the output.log file with Basic level of information.

/opt/Pentaho/Scripts/run_job.sh -s ars -t change -c config_ars.properties -l output.log -v Basic

NOTE: Because the log file will grow over time, PuzzleLogic recommends that you implement an operating system-level log rotation service to help manage the size of the log file. If possible, set the log level to a less verbose option such as Error.
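One way to implement such a rotation service on Linux is logrotate. The following is a minimal sketch, not a product default: the log directory (/opt/Pentaho/Logs, matching the example paths above), the retention count, and the rule file name /etc/logrotate.d/puzzlelogic are all assumptions for illustration.

```
# Hypothetical /etc/logrotate.d/puzzlelogic -- paths and retention
# values are assumptions, not PuzzleLogic defaults.
/opt/Pentaho/Logs/*.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
    # copytruncate keeps rotation safe while run_job.sh appends to the file
    copytruncate
}
```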

To schedule a data integration job in Cron:

  1. Open a terminal window and type crontab -e to create a crontab file (or edit an existing crontab file) that will contain a schedule of the jobs to run.
  2. Create an entry in crontab, specifying the times or intervals at which to run the run_job.sh script and the parameters to use when running the script.
    * * * * * <Pentaho Install Directory>/Scripts/run_job.sh -s|-source <data_source> -t|-type <job_type> [-c|-config <config_file_name>] -l|-log <log_file_name> [-v|-level <log_level>] [-d|-startdate <date_timestamp>] [-e|-enddate <date_timestamp>]
    

    The following example executes the Incident job at 15-minute intervals every day and logs output to the inc_output.log file with Basic level of information.

    */15 * * * * /opt/Pentaho/Scripts/run_job.sh -s ars -t incident -c config.properties -l inc_output.log -v Basic
  3. Save the updated crontab file.
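If a job run ever takes longer than the cron interval, two copies of run_job.sh can end up running at once. A common guard is to call the job through flock(1) from util-linux; the sketch below is an assumption for illustration, not part of the product — the wrapper name (run_job_locked.sh), the lock file path, and the echoed placeholder stand in for the real run_job.sh invocation.

```shell
#!/bin/sh
# Hypothetical wrapper (run_job_locked.sh): flock takes an exclusive lock
# on the lock file; -n makes it exit immediately, instead of queueing,
# if the previous cron invocation still holds the lock.
LOCK=/tmp/run_job.lock
flock -n "$LOCK" sh -c 'echo "lock acquired: run_job.sh would run here"'
```

The crontab entry would then point at the wrapper instead of calling run_job.sh directly.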
run_job.sh parameters:

-s | -source (required)
    Data source for integration.
    Valid values: ars (for BMC Remedy), sno (for ServiceNow)

-t | -type (required)
    Data integration job to execute.
    Valid values for BMC Remedy: change, class, ci, cir, contacts, incident, problem, supgroup
    Valid values for ServiceNow: change, class, ci, cir, incident, problem

-c | -config (optional)
    Name of the configuration properties file located in <Pentaho Install Directory>/Repository/PLViewer/Conf.
    Valid values: config_ars.properties for BMC Remedy (default), config_sno.properties for ServiceNow (default), or the name of another configuration properties file appropriate for the data source

-l | -log (required)
    Name of the log file located in <Pentaho Install Directory>/Logs.
    Valid values: name of a valid log file

-v | -level (optional)
    Level of logging to be used. The default value of Basic is used if this parameter is omitted.
    Valid values:
      Error: Show only errors.
      Nothing: Does not show any output.
      Minimal: Use only minimal logging.
      Basic: Default logging level.
      Detailed: Provide detailed logging output.
      Debug: Provide detailed output for debugging purposes.
      Rowlevel: Provide logging at row level. Can generate a large amount of data.

-d | -startdate (optional)
    Date timestamp used to filter which records to retrieve. The job retrieves records with a Modified Date value (in BMC Remedy) or Updated value (in ServiceNow) between the startdate and enddate values.
    NOTE: If startdate is specified, enddate is required.
    Use one of the following formats to specify the startdate:
      yyyy/MM/dd HH:mm:ss.SSS
      yyyy/MM/dd HH:mm:ss

-e | -enddate (optional)
    Date timestamp used to filter which records to retrieve. The job retrieves records with a Modified Date value (in BMC Remedy) or Updated value (in ServiceNow) between the startdate and enddate values.
    NOTE: If enddate is specified, startdate is required.
    Use one of the following formats to specify the enddate:
      yyyy/MM/dd HH:mm:ss.SSS
      yyyy/MM/dd HH:mm:ss
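For incremental scheduled runs, the -d and -e values can be generated at invocation time in the required yyyy/MM/dd HH:mm:ss format. The sketch below assumes GNU date (standard on Linux, where cron is recommended) and a 15-minute window matching the earlier example; the final command is echoed rather than executed, and the ars/incident values are taken from the examples above.

```shell
#!/bin/sh
# Build -d/-e timestamps in the yyyy/MM/dd HH:mm:ss format run_job.sh
# expects, covering the previous 15-minute window (GNU date syntax).
START=$(date -d '15 minutes ago' +"%Y/%m/%d %H:%M:%S")
END=$(date +"%Y/%m/%d %H:%M:%S")
# Echoed for illustration; in a real wrapper, drop the echo.
echo /opt/Pentaho/Scripts/run_job.sh -s ars -t incident \
    -l inc_output.log -d "$START" -e "$END"
```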

NEXT: Add a data source