Integrate Service Advisor with ServiceNow
This section provides general instructions for customizing the Service Advisor integration for your ServiceNow application. For more detailed information, see the PuzzleLogic Service Advisor Installation and Deployment Guide.
1. Import the PuzzleLogic-ServiceNow update set
- Log into a ServiceNow instance as a user with administrator privileges.
- From the Navigator panel, select System Update Sets > Retrieved Update Sets.
- On the Retrieved Update Sets page, select Import Update Set from XML (below Related Links).
- Upload the plviewer-snow-updateset-XXXXXX.xml file (located in the <Pentaho Install Directory>; for example, /opt/Pentaho/plviewer-snow-updateset-XXXXXX.xml):
- Click Choose File.
- Navigate to the file plviewer-snow-updateset-XXXXXX.xml and click Open.
- Click Upload.
- After the update set has been uploaded, select the new update set from the Retrieved Update Sets table.
- Click Preview Update Set to validate the package, and then click Commit Update Set.
If the PuzzleLogic-ServiceNow update set was uploaded successfully, PuzzleLogic SNOW Update Set X.X will be listed in the Retrieved Update Sets table with State=Committed.
2. Customize the plKind mapping file
PuzzleLogic Service Advisor uses the plKind_class_mapping.csv file to map the source configuration item (CI) class types from the ServiceNow ITSM to PuzzleLogic object types—RegistryObject (the base object type), application, service, host, or system.
The plKind_class_mapping.csv file ships with a set of default ServiceNow mapping values. You can modify the existing mappings or add new mappings to reflect the CI class types used by your organization.
NOTE: If a source CI class is not mapped to a valid PuzzleLogic object type, the CI class will be mapped to RegistryObject by default.
To modify the plKind_class_mapping.csv file:
- Open the plKind_class_mapping.csv in a text editor or Microsoft Excel.
NOTE: The plKind_class_mapping.csv file is located in the /Pentaho/Logs directory.
- To edit an existing entry:
- In the second column, modify the source Class type.
- In the third column, modify the PuzzleLogic object type that the source Class type will be mapped to. The PuzzleLogic object type must be one of the following values: RegistryObject, application, service, host, or system.
NOTE: Do not modify the snow value in the first column.
- To add a new Class type, create an entry with the following values:
- In the first column, enter snow.
- In the second column, enter a valid CI Class type.
- In the third column, enter the PuzzleLogic object type that the source Class type will be mapped to. The PuzzleLogic object type must be one of the following values: RegistryObject, application, service, host, or system.
- Save the file with your changes. If you are using Excel, save the file as type CSV (Comma delimited) (*.csv).
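For reference, a minimal edited mapping file might look like the sketch below. The ServiceNow CI class names (cmdb_ci_appl, cmdb_ci_service, cmdb_ci_server, cmdb_ci_computer) are illustrative examples only; substitute the classes your organization actually uses. Any source class not listed falls back to RegistryObject, as noted above.

```csv
snow,cmdb_ci_appl,application
snow,cmdb_ci_service,service
snow,cmdb_ci_server,host
snow,cmdb_ci_computer,system
```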
3. Configure the data integration jobs
PuzzleLogic Service Advisor ships with pre-defined data integration jobs. These jobs retrieve certain areas of data from the ITSM application and load the data into the PuzzleLogic Data Repository. The data integration jobs are executed through the run_job.sh script.
NOTE: The job files for ServiceNow are located in the <Pentaho Install Directory>/Repository/PLViewer/SNO directory.
Complete the following tasks before you run the data integration jobs.
Task 1: Set variables in the configuration properties file
The configuration properties file config_sno.properties contains runtime variables for functions such as connecting to the data source, setting search criteria to filter results, or retrieving data from columns not specified out of the box. You can define or modify these variables.
NOTE: The default ServiceNow configuration properties config_sno.properties file is located in the <Pentaho Install Directory>/PLViewer/Conf/ directory.
- Use the Pentaho Data Integration encr.sh utility to encrypt the SNO_PASSWD password.
- On the command line, navigate to <Pentaho Install Directory>/data-integration.
- Enter the following command to encrypt the password:
encr.sh -kettle <password>
For example:
encr.sh -kettle AR#Admin#
- Copy the encrypted output for the following step.
- Open the configuration properties file config_sno.properties in a text editor.
- Set the database connection information variables for your data source.
- SNO_BASEURL = Base URL of the target ServiceNow instance; for example, http://dev1234.service-now.com.
- SNO_USERNAME = Username of a ServiceNow user account with proper read permissions.
- SNO_PASSWD = Password for the ServiceNow user account.
- SNO_URLLIMIT = Maximum number of database records that are retrieved at a time. If not specified, the default of 10,000 will be used.
- Set the SNO_DELETE variable. If this variable is set to TRUE, physical deletes in the data source are synchronized and reflected in PuzzleLogic Service Advisor. If it is set to FALSE or any other value, physical deletes are not synchronized, and the data displayed in PuzzleLogic Service Advisor may not be accurate.
- (Optional) Set any non-required variables, as needed.
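Put together, a completed connection section of config_sno.properties might look like the sketch below. All values are placeholders (the username and the encrypted password string are assumptions), and the exact key=value syntax should be checked against the shipped default file:

```properties
# Target ServiceNow instance (placeholder values)
SNO_BASEURL=http://dev1234.service-now.com
SNO_USERNAME=integration_user
# Output of encr.sh -kettle <password> (placeholder string)
SNO_PASSWD=Encrypted 2be98afc86aa7f2e4bb18bd63c99dbdde
# Maximum records retrieved at a time (default 10,000)
SNO_URLLIMIT=10000
# TRUE = synchronize physical deletes from the data source
SNO_DELETE=TRUE
```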
Task 2: Modify the job_last_execution.csv file
The job_last_execution.csv file contains the last date/time that each data integration job was successfully executed. When each job is executed, it will only retrieve records that have been modified, created, or deleted since the job was last run successfully.
NOTE: The job_last_execution.csv file is located in the <Pentaho Install Directory>/Logs/ directory.
When you execute a job, you have the option of specifying a start-date and end-date range to filter which records are retrieved, based on the Updated field. If you specify the start- and end-dates, the timestamp value from the job_last_execution.csv file is ignored, and a new value is not written to the file after the job has been executed.
The job_last_execution.csv file has two columns: the name of the job and the timestamp of its last execution. Out of the box, all jobs are set to a future date of 2050/01/01 12:00 AM so that no data is retrieved until appropriate timestamps have been manually set for each job.
NOTE: For the Incident, Problem, and Change jobs, PuzzleLogic recommends that you set the timestamps to an appropriate value in the past to retrieve only the desired subset of historical data to populate the PuzzleLogic data repository. For the other jobs, you may wish to set the last modified date back far enough (e.g. 1970/01/01) to ensure all records are retrieved.
- Use a text editor to open the file job_last_execution.csv (located at <Pentaho Install Directory>/Logs/job_last_execution.csv).
- For the lines job_get_sno_change, job_get_sno_incident, and job_get_sno_problem, set the timestamp to an appropriate value to retrieve a reasonable subset of historical data (for example, a date two weeks prior to the current date).
NOTE: Use the format YYYY/MM/DD hh:mm:ss for the timestamp.
- Save the changes to the file.
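To avoid format mistakes, you can generate the timestamp on the command line. The sketch below assumes GNU date (Linux) and prints a timestamp two weeks in the past, followed by an example CSV line; the comma-delimited layout shown is an assumption based on the file's two-column description.

```shell
# Print a timestamp two weeks in the past in the required format
# (YYYY/MM/DD hh:mm:ss). Assumes GNU date.
TS=$(date -d '14 days ago' '+%Y/%m/%d %H:%M:%S')
echo "$TS"
# Example job_last_execution.csv line (comma-delimited layout is an assumption)
echo "job_get_sno_incident,$TS"
```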
Task 3: Modify additional configuration variables to improve data integration performance
You have the option of increasing the memory limit in Pentaho to improve the performance of the data integration process.
Out of the box, Pentaho Data Integration sets a maximum of 2048 MB for its JVM (Java virtual machine) heap size. If you have additional free memory, you can set the heap size to a higher value.
NOTE: Internal testing has shown that retrieving 100,000 Incident records in one job instance required 2048 MB of memory; 250,000 Incident records required 3096 MB of memory.
To increase the memory limit in Pentaho:
- Make a backup copy of the spoon.sh script (located at <Pentaho Install Directory>/data-integration/spoon.sh).
- Use a text editor to open the spoon.sh script.
- Locate the line that starts with PENTAHO_DI_JAVA_OPTIONS=; for example:
PENTAHO_DI_JAVA_OPTIONS="-Xms1024m -Xmx2048m -XX:MaxPermSize=256m"
- Edit the value of -Xmx2048m to the desired value. For example, the following line increases the memory limit to 4096 MB:
PENTAHO_DI_JAVA_OPTIONS="-Xms1024m -Xmx4096m -XX:MaxPermSize=256m"
- Save the changes made to the spoon.sh script.
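The edit above can also be scripted. The sketch below performs the same substitution with GNU sed on a throwaway copy, so you can verify the result before applying the same command to the real <Pentaho Install Directory>/data-integration/spoon.sh (the default options line is reproduced from this guide; /tmp/spoon.sh is just a demo file).

```shell
# Work on a throwaway copy; apply the same sed to the real spoon.sh once verified.
printf '%s\n' 'PENTAHO_DI_JAVA_OPTIONS="-Xms1024m -Xmx2048m -XX:MaxPermSize=256m"' > /tmp/spoon.sh
# Raise the JVM heap limit from 2048 MB to 4096 MB
sed -i 's/-Xmx2048m/-Xmx4096m/' /tmp/spoon.sh
grep -o '\-Xmx[0-9]*m' /tmp/spoon.sh   # prints -Xmx4096m
```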