How to integrate SharePoint files into ServiceNow AI Search
07-08-2025 08:10 AM
What: a custom integration that automatically sends documents from SharePoint to ServiceNow to be made available via AI Search. You can then perform a search in Employee Center using terms from anywhere in the file and have the file appear in the search results in a format similar to how knowledge articles do. The difference is that when you select one of these search results, you will be brought to the file in SharePoint. This works for many file types, including xlsx, docx, and ppt.
Why: this had to be custom because, in short, the OOB options do not quite work in our environment at this time. Licensing limitations account for part of this, but unclear documentation and configuration setups also proved a dead end, even for ServiceNow experts like the intrepid Elijah Aromola. In my time spent researching the topic across the community, it seems we are not the only ones with this problem.
Why not: I want to add a few caveats about limitations in this integration. While this can be a great solution if you don’t have EC Pro or are otherwise having trouble using other methods, there are some things to consider. Firstly, for whatever reason, it seems that files have to be created or modified via OneDrive in order for the body to be parsed correctly in ServiceNow. This means that any files you create and modify only in SharePoint Online will successfully have their title and URL passed to ServiceNow, but the body will not be searchable and will not display. Secondly, as you will learn by reading further, this integration requires a significant time investment in Microsoft Power Automate development, meaning building and maintaining it will require ongoing support from a SharePoint administrator.
When (i.e., which versions): Washington D.C. and Xanadu.
How: a lot of trial and error and a lot of help. Here are the details:
To begin, you will need to prepare your instance to use AI Search to receive external content. I followed the docs and, in particular, the first 3 steps of this very helpful community article. Thanks for this, Max! While that article focuses on connecting two ServiceNow instances rather than SharePoint to ServiceNow, it provided guidance on using the External Content Ingestion API that was nearly impossible to find elsewhere, so it was greatly appreciated.
You will also want to set up some sort of connection credentials. I just used Basic Auth, so I created a user with the ‘Web service access only’ checkbox checked, a password, and the admin role.
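If you want to sanity-check those credentials before involving Power Automate at all, you can hit any simple REST endpoint with them. Here is a minimal Python sketch; the instance URL, username, and password are placeholders, and the sys_user table is used purely as an arbitrary authenticated endpoint:

import requests

INSTANCE = "https://yourinstance.service-now.com"   # placeholder
AUTH = ("sharepoint.integration", "your-password")  # the web-service-only user

# Any authenticated GET works as a smoke test for Basic Auth.
resp = requests.get(
    f"{INSTANCE}/api/now/table/sys_user",
    params={"sysparm_limit": 1},
    headers={"Accept": "application/json"},
    auth=AUTH,
)
print(resp.status_code)  # 200 means the credentials work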
Next, find yourself a contact with SharePoint and Power Automate admin privileges who is as helpful and patient as the one I was lucky enough to find.
Here’s what the overall flow looks like. I’ll dig into each step or, as they are technically called, each action.
1. Create a new flow by going to ‘My flows’ and clicking ‘New flow’. Select ‘Automated cloud flow’ from the dropdown. A modal will appear prompting you to name the flow and select a trigger – for the latter, choose the SharePoint trigger ‘When an item is created or modified’. Here and in future steps, make sure you are choosing actions that correspond to SharePoint, because there are similar options for other Microsoft applications like DevOps, Outlook, and Teams.
2. For the ‘Site Address’, use your specific SharePoint site. The URL will likely follow a pattern similar to ‘https://{COMPANY NAME}.sharepoint.com/sites/{DEPARTMENT NAME}’. For the ‘List Name’, you can use the overall bucket of ‘documents’ or you can drill down to any particular folder you want.
3. Add a new action to the flow by clicking the plus icon on the canvas. In the new popup on the left, choose ‘Get file content’ under SharePoint. This step will effectively fetch the content of the file in binary. For the ‘Site Address’, you can use the same one as above. For the ‘File Identifier’, you will put in your first piece of dynamic content, meaning you will use the output from a previous action as input to another. Do this by typing a ‘/’ and selecting ‘Insert dynamic content’. In the popup, type ‘identifier’ in the search and select that option from under the ‘When an item is created or modified’ action.
4. Repeat this exact same process to add the ‘Get file metadata’ action. You will use the same site address and file identifier. This action will get the file type, which is needed later in the flow.
5. The next action you add will be ‘Parse JSON’, which falls under the ‘Data operation’ umbrella, so it is not associated with any specific application like SharePoint. For the ‘Content’, you will dynamically select the ‘Body’ from the trigger action (‘When an item is created or modified’). For the schema, begin by entering just ‘{}’ to represent an empty JSON object. We will come back to this in a minute.
6. Now you get to run your first test! In the upper right corner, click ‘Save’ (which you have hopefully already done, but if not, no time like the present). Then, next to ‘Save’, select ‘Test’. With any luck, a coworker of yours has modified a file in the time you’ve been building, and you can run the test ‘Automatically’ ‘with a recently used trigger’. Otherwise, you can select ‘Manually’ and update a file yourself, using any existing file in the proper SharePoint document tree or creating a new one.
7. After your test runs successfully, you can dig into what happened in the run, action by action. You will want to open the ‘Parse JSON’ action you just created and scroll to view the ‘OUTPUTS’. Copy the body of this, then return to your build by clicking ‘Edit’ in the top right corner.
8. Back in the ‘Parse JSON’ action, under the ‘Schema’ field, there is a link to ‘Use sample payload to generate schema’. Click this, paste the JSON you just copied from the test run into the popup, and click ‘Done’. The ‘Schema’ field should now display a schema generated from that sample.
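As an aside, that schema-generation link is doing the standard ‘infer a JSON Schema from a sample document’ operation. If it helps to see the idea outside of Power Automate, here is a rough Python analogue using the third-party genson library; the sample payload is invented, and the flow itself does not use this library:

from genson import SchemaBuilder

# Invented sample payload standing in for the copied OUTPUTS body.
sample = {"ID": 42, "Name": "example.docx", "Link": "https://..."}

builder = SchemaBuilder()
builder.add_object(sample)   # infer types from the sample
print(builder.to_schema())   # a JSON Schema describing it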
9. Now we get to connect to ServiceNow! Two HTTP calls are required, as laid out in the API documentation. The first, described here, sends the file content in binary for AI Search to digest. In Power Automate, you will fill out the fields of the HTTP action in the following way:
- URI: {YOUR INSTANCE}/api/now/{api_version}/ais/external_content/storeContent
- Method: POST
- Headers:
- Accept: application/json
- Content-Type:
- Body: "@base64ToBinary(body('Get_file_content')['$content'])"
- ^note it is very important to paste this value directly in the text field with the quotation marks included.
- Authentication: {whatever format you set up}
I will note here that the value for the Body field was especially challenging to determine, as this formatting isn’t typical in the Power Automate world and there is scant relevant documentation. The solution would not have been found without the assistance of Dan Popescu and Joe’s ongoing patience. A tip I will add from this experience: if something isn’t working as you might expect, or if you just want a deeper level of insight into things, each action offers a ‘Code view’ option where you can see details like what the dynamic values you entered in the ‘Parameters’ view look like under the hood.
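For reference, here is a minimal Python sketch of what this first call amounts to. The instance URL, credentials, file name, and API version are placeholders, and the shape of the response is an assumption on my part; check the API documentation for your release:

import requests

INSTANCE = "https://yourinstance.service-now.com"   # placeholder
AUTH = ("sharepoint.integration", "your-password")  # Basic Auth user from earlier
API_VERSION = "v1"                                  # assumption; match your instance

# Stand-in for the 'Get file content' action: read the raw file bytes.
with open("example.docx", "rb") as f:
    file_bytes = f.read()

# POST the binary body, mirroring the base64ToBinary(...) expression above.
# No Content-Type header is sent, matching the blank field in the flow.
resp = requests.post(
    f"{INSTANCE}/api/now/{API_VERSION}/ais/external_content/storeContent",
    headers={"Accept": "application/json"},
    data=file_bytes,
    auth=AUTH,
)
resp.raise_for_status()
store_result = resp.json()  # holds the content pointer needed for the second call
print(store_result)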
10. Next, you will add another ‘Parse JSON’ action to parse the information sent back from ServiceNow, as it contains a key you will need in your second HTTP call. As such, the ‘Content’ field should dynamically grab the body output from the HTTP POST action. For the schema, do the same as earlier: save and test the flow, grab the output of the action from the run-through, and use that sample payload to generate the schema.
11. Finally, the last step! This will be another POST action, following the instructions laid out here. Fill out the action as follows:
- URI: {YOUR INSTANCE}/api/now/{api_version}/ais/external_content/ingestDocument/{schema_table_name}
- ^the schema table name here is the name of the table you created at the beginning of this process to hold external files
- Method: POST
- Headers:
- Accept: application/json
- Content-Type: application/json
- Authentication: {whatever format you set up}
Body: you will fill this out using various dynamic values within a specific JSON structure. You can start by pasting the following:
{
    "content_pointer": ,
    "document_id": "EXT_",
    "principals": {
        "everyone": true
    },
    "properties": {
        "creation_date": "",
        "title": "",
        "url": ""
    }
}
Then, add the following dynamic content respectively, where the first part of the text represents the action from which you are taking the output and the second represents the name of the output. I will use the names of the actions as I’ve given them, as visible in the very first screenshot. Make sure each value is wrapped in quotation marks. (A Python sketch of the complete call follows this list.)
- Content pointer: ‘Parse Body from HTTP’>’Body’
- Document ID: ‘Parse File’>’ID’
- Note: I added ‘EXT_’ to the beginning of this to indicate that the record created in ServiceNow came from an external source, but I’m not sure that’s strictly necessary.
- Creation date: ‘Parse File’>’Date’
- Title: ‘Parse File’>’{Name}’
- Url: ‘Parse File’>’{Link}’
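To tie the two calls together, here is a hedged Python sketch of this final request. Every concrete value (schema table name, content pointer, item ID, date, title, URL) is an invented stand-in for the dynamic content listed above:

import requests

INSTANCE = "https://yourinstance.service-now.com"   # placeholder
AUTH = ("sharepoint.integration", "your-password")  # Basic Auth user from earlier
API_VERSION = "v1"                                  # assumption; match your instance
SCHEMA_TABLE = "x_external_content"                 # hypothetical; your table from setup

# Stand-in for 'Parse Body from HTTP' > 'Body'.
content_pointer = "..."

payload = {
    "content_pointer": content_pointer,
    "document_id": "EXT_42",            # 'Parse File' > 'ID', prefixed with EXT_
    "principals": {"everyone": True},
    "properties": {
        "creation_date": "2025-07-08",  # 'Parse File' > 'Date'
        "title": "example.docx",        # 'Parse File' > '{Name}'
        "url": "https://company.sharepoint.com/sites/dept/example.docx",  # '{Link}'
    },
}

resp = requests.post(
    f"{INSTANCE}/api/now/{API_VERSION}/ais/external_content/ingestDocument/{SCHEMA_TABLE}",
    json=payload,  # requests sets Content-Type: application/json automatically
    headers={"Accept": "application/json"},
    auth=AUTH,
)
resp.raise_for_status()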
--------------------------------------------------------------------------------------------------------------
And there you have it! Your file creations and updates should now automatically pass through from SharePoint to ServiceNow (as long as OneDrive is at some point involved). This was a very long article, so thanks for reading, and I hope it was helpful! If anything seems to be missing or incorrect, please let me know. Happy searching!
07-16-2025 08:07 AM
Eliza - have you ever seen www.dtechapps.com/docintegrator/ ?