Kieran Anson

That's a wrap. From this series you'll have learned:

  • Integrating with a third-party application securely
  • Creating an OAuth registry and securely storing credentials
  • Using an outbound REST Message for data retrieval
  • Retrieving data using low-code and pro-code
  • Using a staging table to store data before transforming it
  • Leveraging IntegrationHub ETL to robustly transform data into the CMDB
  • Scheduling data loads

Some Closing Notes

To reiterate an earlier statement, this series is designed to provide you with the bulk of the information you'll need to import data from Intune, whilst allowing the flexibility to change logic as needed.

Where I've Taken It Further

These articles formed the starting point of an integration I built for an MSP that retrieves data from multiple Intune environments into a single CMDB. This brought a number of complexities that I wanted to share:


  • The pro-code data retrieval method documented in these articles uses a synchronous API call and holds onto the working thread until a response is received. This could lead to some system slowness if high volumes of data are being retrieved.
    • To mitigate this, an async API call can be made using a MID Server and the executeAsync method. To process the response from the MID Server, a callback function can be created to handle the data. I followed @Mike Moody's video demonstration here on how to do this. I recommend giving the video a watch regardless, as it's really insightful on how to handle async API calls!
  • Mapping data into a multi-organisation CMDB requires the company field to be populated in order to segregate data.
    • There are a few ways this can be achieved. The first is relying on the assigned-to user's company to map the company record. This is sensible, but might not work for hierarchical organisations.
    • Another option is to create a CMDB table that holds OAuth entity profiles against company records, and to use the company record as part of the scripted collection of data. This works really well and allows for rapid expansion. As an example, a record on this table holds the OAuth credentials, API method, filter, company and schedule record, and then leverages a generic script, similar to the one provided, to do the work. This allows administration to be handed off to a non-admin user.
  • Logging, logging, logging. The Integration Commons plugin provides a great dashboard to monitor data ingestion and transformation, but it won't help you capture issues with the retrieval of data.
    • I followed a similar approach to how ITOM Discovery statuses are generated: each company has its own status record, against which individual logs are generated for authentication, the API query and records returned. This is then linked into the Integration Commons data to provide an end-to-end view of the integration. Again, this allows non-admins to see how the integration is working, and lets a support team fix issues without the help of a developer.
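As an illustration of the async pattern above, the sketch below shows how a request could be fired through a MID Server without holding a worker thread. This is ServiceNow server-side JavaScript that only runs inside an instance; the REST message, method, MID Server and correlator names are all assumptions, not the actual implementation:

```javascript
// Fire the request asynchronously via a MID Server (sketch; names are assumed)
var req = new sn_ws.RESTMessageV2('Intune Integration', 'Get Managed Devices');
req.setMIDServer('my_mid_server');           // offload the HTTP call to a MID Server
req.setEccCorrelator('intune_device_pull');  // tag the ECC record so the callback can find it
req.executeAsync();                          // returns immediately; no worker thread is held

// A business rule on ecc_queue (direction = input, agent_correlator matching the
// tag above) then acts as the callback and processes the payload when the MID replies.
```

The design benefit is that the scheduled job finishes immediately and high-volume retrievals no longer tie up a scheduler worker while waiting on Graph API responses.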
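The profile-driven collection described above might be sketched as follows. The custom table, field names and helper function are hypothetical placeholders, shown only to illustrate the shape of the generic script:

```javascript
// Iterate active connection-profile records and trigger one retrieval per company (sketch)
var profile = new GlideRecord('u_intune_connection_profile'); // assumed custom table
profile.addActiveQuery();
profile.query();
while (profile.next()) {
    // each record supplies the OAuth profile, API method/filter and company,
    // which a generic retrieval script (like the one in this series) consumes
    retrieveIntuneData(profile.getValue('u_oauth_profile'),   // hypothetical helper
                       profile.getValue('u_api_filter'),
                       profile.getValue('u_company'));
}
```

Adding a new Intune environment then becomes a matter of inserting one profile record, which is what makes the hand-off to non-admin users practical.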

Questions & Support

If you have any questions or issues, please feel free to leave a comment on the article and I'll get back to you as soon as possible. I want these articles to stay fluid, and if you have recommendations that could help others, I'll happily take them on board and modify the article.

 

 

Version history
Last update: 07-20-2025 12:11 PM