Nexthink Integration

Akshay Jugran
Tera Expert

Hi All,

Currently I am working on a Nexthink integration, which will be sending data for the software installation table as well as data regarding the workstation (cmdb_ci_computer); a record needs to be created if one is not already present.

So I have created a Scripted REST API, but I need to know whether we should include a staging table and then transform the data into the actual tables (the data count could be a lot higher, as it would be discovery data). Will that impact system performance? Also, should I process the data in the Scripted REST API and send it to the tables directly using logic, without a staging table, or should I take a scheduled job approach?

 

Another question: if there is data for both tables, should I ask them to provide it in a single payload, or get the data for devices and software separately? (And should I use two staging tables for this?)

 

It would be great if someone could advise me on these points.

 

Thanks,

Akshay Jugran

2 ACCEPTED SOLUTIONS

Chavan AP
Kilo Sage

Hi @Akshay Jugran,

Staging tables are a good fit for high-volume data and a standard practice for bulk integrations. They shield the main tables from performance impact while you process the data, let you validate records before transforming them, and make it easy to retry failed transformations and handle errors.

 

You can use two staging tables: one for computer data and one for software installation data. This would make it easier to manage, validate, and troubleshoot.
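For illustration only (not from the original answer), a Scripted REST API resource could route the two record types into their staging tables. The table names (u_nexthink_computer_staging, u_nexthink_sw_staging), the u_state field, and the payload keys below are all hypothetical placeholders:

```javascript
(function process(request, response) {
    // request.body.data holds the parsed JSON payload sent by Nexthink
    var body = request.body.data;

    // Route device records into the computer staging table
    var devices = body.devices || [];
    for (var i = 0; i < devices.length; i++) {
        var comp = new GlideRecord('u_nexthink_computer_staging'); // hypothetical table
        comp.initialize();
        comp.u_name = devices[i].name;                     // hypothetical payload keys
        comp.u_serial_number = devices[i].serial_number;
        comp.u_state = 'pending'; // picked up later by the scheduled transform
        comp.insert();
    }

    // Route software records into the software staging table
    var software = body.software || [];
    for (var j = 0; j < software.length; j++) {
        var sw = new GlideRecord('u_nexthink_sw_staging'); // hypothetical table
        sw.initialize();
        sw.u_display_name = software[j].display_name;
        sw.u_installed_on = software[j].device_serial;
        sw.u_state = 'pending';
        sw.insert();
    }

    response.setStatus(202); // accepted; transformation happens asynchronously
})(request, response);
```

This also suggests one way to answer the payload question: a single payload with separate devices and software arrays keeps one endpoint while still landing each record type in its own staging table.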

 

In addition, keep these performance-related practices in mind:

 

- Batch process in chunks (500-1000 records).

- Use setWorkflow(false) and autoSysFields(false) when processing staging table rows (see the sketch after this list).

- Process during off-peak hours.
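As a sketch of the second point (u_nexthink_computer_staging and u_state are the same hypothetical placeholders as above), a chunked run might look like this:

```javascript
// Process at most one chunk of pending staging rows per run.
var CHUNK_SIZE = 500;
var gr = new GlideRecord('u_nexthink_computer_staging'); // hypothetical table
gr.addQuery('u_state', 'pending');
gr.setLimit(CHUNK_SIZE);
gr.query();
while (gr.next()) {
    gr.setWorkflow(false);   // skip business rules, workflows, and notifications
    gr.autoSysFields(false); // leave sys_updated_on/sys_updated_by untouched
    // ... map the row onto the target table here ...
    gr.u_state = 'processed';
    gr.update();
}
```

A scheduled job can run this repeatedly (ideally off-peak) until no pending rows remain, which is the Scheduled Job → Transform step in the flow below.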

 

Your flow should be:

  1. Nexthink → REST API → Staging Tables
  2. Scheduled Job → Transform → Target Tables (cmdb_ci_computer, software installation)

 

 

Glad I could help! If this solved your issue, please mark it as ✅ Helpful and ✅ Accept as Solution so others can benefit too.

Chavan A.P. | Technical Architect | Certified Professional


@Akshay Jugran 

 

As mentioned, considering this is a critical table, I would recommend populating the data in a staging table first and then moving it to the target tables.

 

As part of your transform, apply IRE on the import sets so that no duplicate CIs are created:

 

https://www.servicenow.com/docs/bundle/zurich-servicenow-platform/page/product/configuration-managem...
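The linked docs cover enabling IRE on import sets; if you instead process rows by script, the scoped IdentificationEngine API gives the same duplicate protection. A minimal sketch, assuming a staging GlideRecord (stagingRow) with hypothetical u_name and u_serial_number fields and 'Nexthink' registered as a discovery source:

```javascript
// Hand the staged device to the Identification and Reconciliation Engine
// instead of inserting into cmdb_ci_computer directly.
// stagingRow: a GlideRecord on the staging table from the processing loop.
var payload = {
    items: [{
        className: 'cmdb_ci_computer',
        values: {
            name: stagingRow.getValue('u_name'),                  // hypothetical field
            serial_number: stagingRow.getValue('u_serial_number') // hypothetical field
        }
    }]
};
var result = sn_cmdb.IdentificationEngine.createOrUpdateCI('Nexthink', JSON.stringify(payload));
gs.info('IRE result: ' + result); // JSON string listing matched/created CIs and any errors
```

IRE matches on the class's identification rules (typically serial number for hardware), so re-sending the same device updates the existing CI instead of creating a duplicate.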

 

As per community guidelines, you can accept more than one answer as an accepted solution. If my response helped to answer your query, please mark it helpful and accept the solution.

 

Thanks,

Bhuvan


6 REPLIES

Bhuvan
Kilo Patron

@Akshay Jugran 

 

There is an out-of-the-box application for integrating with Nexthink, and it is a free app:

 

https://store.servicenow.com/store/app/7a22ff2e1ba46a50a85b16db234bcb51

 

Plugin: x_nexsa_cmdb_pop

 


If this helped to answer your query, please mark it helpful & accept the solution.

 

Thanks,

Bhuvan

@Akshay Jugran 

 

Please note the app is free, but the custom tables created as part of it need to be accounted for in your licensing considerations.

 

If you would like to create your own custom integration, use the Flow Designer REST API action to make the outbound or inbound API call, process the payload, and populate the data in the target tables.

 

Considering this is a critical CMDB table, build robust validation into the payload processing to make sure required fields are populated, and use the IRE engine to verify you are not creating duplicate CIs before populating the data.
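As an illustration of that validation step (not from the original reply; the required field names are assumptions), inbound records could be screened before they reach IRE:

```javascript
// Return true only if the inbound device record carries the attributes
// IRE needs to identify the CI; the field names here are assumptions.
function isValidDevice(device) {
    var required = ['name', 'serial_number'];
    for (var i = 0; i < required.length; i++) {
        if (!device[required[i]]) {
            gs.warn('Nexthink import: skipping device, missing ' + required[i]);
            return false;
        }
    }
    return true;
}
```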

 

If this helped to answer your query, please mark it helpful & accept the solution.

 

Thanks,

Bhuvan

Yes, I am aware that there is a connector for this, but the client wants to go with an API integration with the Nexthink team.
Can you provide information on the best practice for getting a large volume of data into a staging table and then transforming it, versus processing it directly? Also, any guidance regarding the payload?

 

Thanks for the reply.
