Jamf ServiceNow Integration

ServiceNow Asset Integration with Jamf Cloud

I wanted to start using the ServiceNow Asset module, and I needed an easy way to update computer assets from other sources into ServiceNow.

One option was to do a CSV import when we buy new equipment. This method would not be too bad, as we could just create a CSV with the new information and import it into ServiceNow.

I thought I could come up with a better idea. I did not want to have to manually add anything into ServiceNow.

My idea was to use an outbound REST call from ServiceNow into our Jamf Cloud instance to retrieve the asset data.

The plan was to run this job nightly, so at worst the asset data would be a day out of sync.

We enroll all of our Macs in Jamf before they are deployed. This way ServiceNow will be updated when new assets are enrolled in Jamf.

The 5 main steps of the integration (each step is detailed below):

  1. Create a staging table to load the data
  2. Create an outbound REST message to retrieve data from Jamf
  3. Parse the data and post it to the staging table
  4. Transform the data and place it in the Asset table
  5. Set the script to run on a schedule

Create a Staging Table in ServiceNow:

We need to create a table in ServiceNow where we can store this data before we add it to Asset. We could add data directly to the Hardware Asset table, but we are going to take the import set route instead.

Go to the Tables page, found under System Definition, and click Tables.

Then click the blue New button at the top of the list. Name the table anything you want, but write down the name; we will need it later for our script.
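Once the table is saved, you can sanity-check the name from a background script before going further. A minimal sketch; u_jamf_computer_import is a hypothetical name, so substitute the one you wrote down:

var gr = new GlideRecord('u_jamf_computer_import'); // hypothetical staging table name
gs.print(gr.isValid() ? 'Table exists' : 'Table not found');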

Now with the table made we can start creating our REST message.

Outbound REST Message

We need to create an outbound REST Call from ServiceNow to Jamf. This will allow us to get the data we need.

Jamf has a simple premade default report we can call via the API. If you need more information about the assets, refer to the Jamf documentation.

For the basic info you can hit the predefined URL:

https://yourjamfurl.jamfcloud.com/JSSResource/computers

This will return basic data about each computer in JSON format.
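The response will look roughly like this (an abridged sketch; the exact fields depend on the endpoint and your Jamf version, so verify the names against your own output):

{
  "computers": [
    { "id": 1, "name": "example-mac-01", "serial_number": "C02XXXXXXXXX", "model": "MacBook Pro", "username": "jdoe" },
    { "id": 2, "name": "example-mac-02", "serial_number": "C02YYYYYYYYY", "model": "MacBook Air", "username": "asmith" }
  ]
}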

You will need to put in the username and password of a Jamf user to access this page.

To set this up in ServiceNow, go to System Web Services > Outbound > REST Message.

Create a new REST message and put in the above URL. Create a Basic Auth profile and assign the message to it.

For the HTTP headers we want JSON, so set the following:

Name: accept, Value: application/json

The method will be set to GET.

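To verify the configuration, you can run a quick test from a background script. This is a minimal sketch; it assumes the REST message is named 'JAMF' with a method named 'Default GET', matching the names used in the full script later in this post:

var r = new sn_ws.RESTMessageV2('JAMF', 'Default GET');
var response = r.execute();
gs.print(response.getStatusCode()); // expect 200 when the URL and credentials are correct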

Parsing the Data and Posting to the Staging Table

Now we have the data from the REST message, encoded as JSON. We need to parse the data and then post it to the staging table.

The first step is to get the code that makes the outbound REST call. This is pretty simple: go to the outbound REST message and click the "Preview Script Usage" link.

This generates JavaScript code that makes the REST call to the requested endpoint.

You will most likely need to add more code to this. My full script can be found at the bottom of this section; I will walk you through the basic steps.

First use the boilerplate code to store the REST data in an object.

It will look something like this:

var responseBody = response.getBody();

responseBody is the variable holding the REST response data. This is what we want to parse.

To work with JSON data in ServiceNow we need to parse the results. We can use the built-in JSONParser; just call it in your code, it is that simple.

For the Jamf data we will need to parse it first, turn each record back into a string, and then parse it again.

I think this is because the Jamf response is not 100% JSON compliant. (Big shout out to @parwood on the SN Devs Slack; he figured this out and shared it with me.)

You will want to load the data into an array and then loop through it.

Something like this:

var parser = new JSONParser();
var parsedData = parser.parse(responseBody);
var len = parsedData.computers.length;
gs.print(len);

for (var i = 0; i < len; i++) {
  var obj = parsedData.computers[i];
  var obj2 = JSON.stringify(obj);
  var parsedSecond = JSON.parse(obj2);
  // insert into the staging table here (shown next)
}

Now that we have the data parsed and stored in the parsedSecond object, we can post it to the staging table.

To post the data to the staging table we need to create a new GlideRecord and set the fields we want on it. This brings the data into the staging table.

An example is below. You can see the fields on the staging table, set through restGR, are being populated from fields found in the REST response. Match up the ones you want.

var restGR = new GlideRecord('u_name_of_your_staging_table'); // Important: use your table name
restGR.initialize();
restGR.u_make = parsedSecond.make;
restGR.u_model = parsedSecond.model;
restGR.u_serial_number = parsedSecond.serial_number;
restGR.u_username = parsedSecond.username;
restGR.sys_import_set = 'sysid'; // Important: sys_id of your import set (see below)
restGR.insert();

Important! Make sure to create an empty import set on the staging table you created. Then grab the sys_id of that import set and do the following:

restGR.sys_import_set = 'sys_id of the import set'; // Important

This will load all the data into the import set on the staging table.
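If you would rather not hard-code the sys_id, one option is to look it up from the Import Set [sys_import_set] table at runtime. A sketch, using a hypothetical staging table name:

var importSet = new GlideRecord('sys_import_set');
importSet.addQuery('table_name', 'u_jamf_computer_import'); // hypothetical; use your staging table name
importSet.orderByDesc('sys_created_on'); // grab the most recent import set
importSet.query();
if (importSet.next()) {
  restGR.sys_import_set = importSet.getUniqueValue();
}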

Now you will be able to transform the data from the staging table to the Hardware Asset table. Here is the full script:

var r = new sn_ws.RESTMessageV2('JAMF', 'Default GET');
var response = r.execute();
var responseBody = response.getBody();
var httpStatus = response.getStatusCode();

var parser = new JSONParser();
var parsedData = parser.parse(responseBody);
var len = parsedData.computers.length;
gs.print(len);

for (var i = 0; i < len; i++) {
  var obj = parsedData.computers[i];
  var obj2 = JSON.stringify(obj);
  var parsedSecond = JSON.parse(obj2);

  var restGR = new GlideRecord('u_name_of_your_staging_table'); // Important: use your table name
  restGR.initialize();
  restGR.u_make = parsedSecond.make;
  restGR.u_model = parsedSecond.model;
  restGR.u_serial_number = parsedSecond.serial_number;
  restGR.u_username = parsedSecond.username;
  restGR.sys_import_set = 'sys_id of import set'; // Important: sys_id of your empty import set
  restGR.insert();
}

Transforming the Data to the Hardware Table

Now that you have the data you want on your staging table, it is time to create a transform map, found under System Import Sets. This will take the data from the staging table and add it to the hardware table.

It is pretty simple. Create a transform map for the staging table you created. You can use the automatic field matching and then point each source field at its target (user to username, etc.).

Set the source table as the staging table you created. The target table for computer assets is the following:

alm_hardware

*Important: make sure to pick a unique field like serial number and set it to coalesce. This will prevent the same asset from being created over and over again.

I wanted to set all the records to a model category of Computer, as this job is just for Macs.

To do this I needed to create a script in the field map. Model category does not exist in the Jamf data, so you create a new field map with the source set to a script and the target field set to model_category.

Script:

return 'sys_id of the Computer model category'; // return the value to be put into the target field
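If you would rather not paste a raw sys_id, an alternative (a sketch; it assumes the out-of-box Model Category [cmdb_model_category] table contains a record named Computer) is to look it up by name:

var mc = new GlideRecord('cmdb_model_category');
mc.addQuery('name', 'Computer');
mc.query();
if (mc.next()) {
  return mc.getUniqueValue(); // sys_id of the Computer model category
}
return ''; // leave the field empty if the category is not found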

Now when the outbound REST script runs, the data will be transformed and placed in the Hardware Asset table, and all the computer objects show up under Asset in ServiceNow.

Running Data Import on a Schedule

The last step is to create a scheduled job so we do not have to run this manually. I set mine to run once a day, so I know the data in ServiceNow can be at most one day out of sync with Jamf.

This is very easy to do.

Go to Scheduled Jobs and click New, then pick "Scheduled Script Execution".

Set it to run daily. You can set the time you want it to run; the default is midnight.
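The script field of the job can simply be the full import script from earlier. If you prefer a smaller job body, one option (a sketch with hypothetical names, assuming you have moved the import logic into a Script Include called JamfAssetImport) is:

// Hypothetical wrapper; importComputers() would contain the full script above.
new JamfAssetImport().importComputers();
gs.info('Jamf asset import finished');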

And that is it. Your job will run once a day and update the Asset table. You can also go to the job itself and run it on demand.

In the future I want to add some webhook actions from Jamf. This way, if something is updated between the nightly runs, ServiceNow can be updated right away.

SaltStack ServiceNow Integration part 2

Part 2 of 2

Following up on my previous post, I will be covering the SaltStack side of the integration.

The first thing to do was to figure out a way to capture the username from the end user in ServiceNow and send it to SaltStack.

I used a REST API to do this.

I installed salt-api and put the following settings into the master config file.

rest_cherrypy:
  port: <pick a port number>
  host: <hostname>
  ssl_crt: /etc/ssl/private/cert.pem  # path to SSL cert
  ssl_key: /etc/ssl/private/key.pem   # path to SSL key
  webhook_disable_auth: True          # set this to False if you want auth enabled
  webhook_url: /hook                  # allows a webhook

Then the next step is to create a reactor file. This tells Salt what to do when something is sent to the webhook via the REST API.
Place the following config at this path: /etc/salt/master.d/reactor.conf

It will look something like this:

reactor:
  - 'salt/netapi/hook/open_vpn_reset':
    - /srv/reactor/open_vpn_phone_reset.sls

When something is POSTed to https://<ip of salt server>:<port number>/hook/open_vpn_reset, Salt will render the open_vpn_phone_reset.sls file.
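For reference, the webhook fires a Salt event whose tag matches the reactor entry, with the POSTed body available under the post key. Roughly (a sketch of rest_cherrypy's standard webhook behavior, with a made-up username):

# Tag:  salt/netapi/hook/open_vpn_reset
# Data: {'headers': {...}, 'post': {'username': 'jdoe'}}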

Now you need to create the open_vpn_phone_reset.sls reactor file. The one I created is below.

{% set postdata = data.get('post', {}) %}  # This allows you to receive the data sent to salt-api and use it

open_vpn_phone_reset:
  local.cmd.run:
    - tgt: 'connect'
    - args:
      - cmd: ./sacli --user {{ postdata.username }} --lock 0 GoogleAuthLock
      - cwd: /usr/local/openvpn_as/scripts

The trick to this sls is that when the data is sent to the webhook, you pass a variable called username with the user's username. Salt takes this variable, places it in postdata.username, and then the state renders.

This will allow end users to run this script without contacting operations to run the script for them.

SaltStack and ServiceNow Integration

This is part 1 of 2 articles; part 2 covers the SaltStack side of the integration.

We use SaltStack to manage various things on our servers. We also use OpenVPN with Google Authenticator for two-factor authentication on login, and ServiceNow as our ticketing system.

This works pretty well until someone gets a new phone and needs a new QR code. We have a quick script that resets this.

The old workflow was: the end user submits a ticket asking for the account to be reset, I run the script, and then I tell them to re-enroll the phone.

This is fine, but do I really need to run the script myself? This gave me the idea of using self-service in ServiceNow to automate the task.

The first thing I did was to create a very simple record producer that has the caller's name and a username variable. For caller, it is a reference variable to the sys_user table. If you put the following default value in, it will automatically set the caller to the user who is viewing the record.

javascript:gs.getUserID();

More info regarding record producers can be found at the link below:

https://docs.servicenow.com/bundle/kingston-it-service-management/page/product/service-catalog-management/concept/c_RecordProducer.html

So once I decided that I wanted to do this, I needed to figure out how to build it.

Both ServiceNow and SaltStack have REST APIs. This is how I integrated the two services.

ServiceNow REST API documentation: https://developer.servicenow.com/app.do#!/rest_api_doc?v=kingston&id=c_TableAPI

SaltStack REST API documentation: https://docs.saltstack.com/en/latest/ref/netapi/all/salt.netapi.rest_cherrypy.html

After that I needed a way to get ServiceNow, which is hosted in AWS, to talk with the SaltStack server that is hosted behind firewalls in our VMware stack.

ServiceNow provides a MID Server to help with this. Refer to my other blog post, linked below, regarding the MID Server.

http://edwardjamesmathison.com/2018/06/05/servicenow-mid-server-in-docker/

Then I had to create an outbound REST message in ServiceNow. I set the URL endpoint to the webhook URL I created in SaltStack (I will show the SaltStack side of things in my next post), and then set the HTTP headers and query parameters on the outbound REST message.

https://docs.servicenow.com/bundle/kingston-application-development/page/integrate/outbound-rest/concept/c_OutboundRESTWebService.html

Once this was created, I needed to add code to my record producer to send the username variable to SaltStack via the REST API.

The record producer code is below. Feel free to use it or edit it to your liking.

var username1 = producer.username + ' ';

try {
  var r = new sn_ws.RESTMessageV2('Salt Open VPN', 'OpenVPn');
  r.setStringParameterNoEscape('var', '');
  var body = {'username': username1};
  var bodyText = JSON.stringify(body);

  // override authentication profile
  // authentication type = 'basic' / 'oauth2'
  // r.setAuthentication(authentication type, profile name);

  // set a MID server name if one wants to run the message on MID
  r.setMIDServer('dev mid');

  // if the message is configured to communicate through ECC queue, either
  // by setting a MID server or calling executeAsync, one needs to set skip_sensor
  // to true. Otherwise, one may get an intermittent error that the response body is null
  // r.setEccParameter('skip_sensor', true);

  r.setRequestBody(bodyText);
  var response = r.execute();
  var httpStatus = response.getStatusCode();
} catch (ex) {
  var message = ex.getMessage();
}
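One small improvement worth considering (a sketch, not part of the original script) is to check the HTTP status inside the try block, right after the execute() call, and surface a message if the request failed:

if (httpStatus != 200) {
  gs.addErrorMessage('VPN reset request failed with HTTP status ' + httpStatus);
}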

ServiceNow MID Server in Docker

ServiceNow requires a MID Server when you want to interact with resources behind firewalls. I wanted to allow my end users to reset their OpenVPN Google Auth codes using self-service, so I needed to install a MID Server that could reach our servers. I put it in Docker to make things easier.

Below is a link for a quick explanation of SN Mid server:

https://docs.servicenow.com/bundle/jakarta-servicenow-platform/page/product/mid-server/concept/c_MIDServer.html

The MID Server is a simple Java application. I did not want to install it on bare metal, and we use Docker here at work, so this was a good reason to use Docker.

I found the following git repo that had the MID server in Docker. The only problem was that it was out of date and used Ubuntu as its base image.

https://github.com/tools-proservia/sn-mid-server

So I tweaked a few things and changed the base image to CentOS.

The next issue is that the installer download file changes with each ServiceNow release.

So I created a Docker Hub account and set up a build linked to my repo. When I update the wget URL and push the code to master, a job in Docker Cloud is fired off via webhook, which automatically builds the Docker image.

This was very helpful, but I still had one more issue. My prod ServiceNow instance is usually a version or two behind dev, so I would need a different Docker image for prod.

My solution was to create another git branch with a separate Docker Cloud build. This way I have a prod branch and a dev branch, and depending on which branch I update, it builds a different Docker image.

My GitHub and Docker image links are below. Feel free to use it, fork it, whatever:

https://github.com/tkojames24/SNMidServer

https://hub.docker.com/r/tkojames/snmidserver/