Max Grabner
ServiceNow Employee

In this article, I will describe how to integrate and index external content into your ServiceNow instance with AI Search, without replicating your external documents into ServiceNow.

 

The big advantage of this approach over integrating external content into the knowledge base is that NO KB articles will be created in ServiceNow.

There is no redundancy of articles between the external source and the target instance.

 

For the following example, I indexed a knowledge article that exists on ServiceNow instance B (Rome release) from another ServiceNow instance A (San Diego release).

Step 1: Base Setup in target instance A

Make sure AI Search is enabled and everything is installed. 

You should see a module "AI Search -> External Content".

Step 2: Create Schema table in target instance A

Navigate to "AI Search -> External Content -> Create Schema" 

Give your Schema a name and click Submit. This will create a table that extends v_ais_external_search_content.

The OOTB table should be sufficient, but if you want to include more than just the title in the indexed source, you can add additional columns (in my example, I added "u_body" with type string).

Step 3: Create Indexed Source and Search Source

Navigate to "AI Search -> AI Search Index -> Indexed Sources" and click New.

Give it a name and select the newly created table as the source.

If you have created additional fields on your schema table, create a field mapping for your table in the indexed source (see screenshot).

(screenshot)

Create a Search Source and add the Indexed Source to it.

Finally, you can use this Search Source in any Search Profile.

In my example, I added it to the "ESC Portal Default Search Profile".

Step 4: Create REST integration from source instance B

Indexing of external search content via AI Search is always based on a data PUSH. So in my example, I created a REST call (using IntegrationHub/Flow Designer) to push the data to instance A.

In your integration, you can use any other method if required, such as a Python script or curl.
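The push can be sketched end to end as two calls per document. Everything here except the two endpoint paths is an assumption: sendRequest is a stand-in for whatever HTTP client your integration uses (a Flow Designer REST step, curl, a script), and u_my_schema_table is a placeholder schema table name.

```javascript
// Stand-in for your HTTP client; a real integration would POST to
// https://<instanceA>/<path> with the given body and return the parsed response.
function sendRequest(path, body) {
  return { result: 'stub-sys-id-for-' + path }; // placeholder response
}

function pushDocument(doc) {
  // Step 4.1: create the binary content object and keep its sysId
  var stored = sendRequest('api/now/v2/ais/external_content/storeContent', doc.content);
  // Step 4.2: ingest the document, pointing at the stored binary content
  return sendRequest('api/now/v2/ais/external_content/ingestDocument/u_my_schema_table', {
    content_pointer: stored.result,
    document_id: doc.id
  });
}
```

Your integration would run this for every document you want indexed, on a regular schedule.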

Step 4.0: Create a Connection and Credentials to use for Flow Actions

In my example, I created a connection that points to target instance A and a credentials record with Basic authentication to log into the instance.

In your integration you can use other authentication mechanisms of your choice.

Step 4.1: Create a flow action with a REST step to create binary content

Use the following REST API (External Content Ingestion API) to store binary content as a content object in AI Search:

api/now/v2/ais/external_content/storeContent

This will create a binary content object and return a sysId, which can be used in the next step to send external documents to AI Search.

The content type can be any of the following, depending on the documents that need to be indexed. I created an input variable for the content type.

      • application/msword
      • application/octet-stream
      • application/pdf
      • application/vnd.ms-excel
      • application/vnd.ms-powerpoint
      • application/vnd.ms-powerpoint.presentation.macroenabled.12
      • application/vnd.openxmlformats-officedocument.presentationml.presentation
      • application/vnd.openxmlformats-officedocument.presentationml.template
      • application/vnd.openxmlformats-officedocument.spreadsheetml.sheet
      • application/vnd.openxmlformats-officedocument.wordprocessingml.document
      • application/vnd.openxmlformats-officedocument.wordprocessingml.template
      • text/html
      • text/plain
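As a sketch, the content-type input could be derived from the file name. The function name and extension mapping below are illustrative (not part of the API); the type strings come from the list above.

```javascript
// Map a file extension to one of the supported content types
// (partial mapping for illustration; extend as needed)
function contentTypeFor(filename) {
  var ext = filename.split('.').pop().toLowerCase();
  var types = {
    'pdf':  'application/pdf',
    'doc':  'application/msword',
    'docx': 'application/vnd.openxmlformats-officedocument.wordprocessingml.document',
    'html': 'text/html',
    'txt':  'text/plain'
  };
  // Fall back to the generic binary type for anything unrecognized
  return types[ext] || 'application/octet-stream';
}
```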

Add a JSON Parser step to parse the response body, extract the sysId of the binary object, and map it to an output variable so it can be used in the next flow action.

(screenshot)
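The extraction the JSON Parser step performs can be sketched in plain JavaScript. The response shape below is an assumption (a result field carrying the sysId); check the actual response from your instance.

```javascript
// Hypothetical storeContent response body (shape assumed for illustration)
var responseBody = '{"result": "a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6"}';

// Parse the JSON and extract the sysId of the binary content object
var parsed = JSON.parse(responseBody);
var contentSysId = parsed.result;
// contentSysId would then be mapped to an output variable for the next flow action
```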

Step 4.2: Create a flow action with a REST step to ingest documents.

Use the following REST API (External Content Ingestion API) to send documents to the previously created binary content object:

api/now/v2/ais/external_content/ingestDocument/u_xxxxxx

 Create the following inputs:

(screenshot)

Hint: "u_body" is an example of an additional field. See Step 3!

Configure a REST Step and map your inputs to the Request Body.

  • content_pointer -> sysId of the binary content object created in Step 4.1
  • document_id -> unique identifier of the document. If a document with this identifier already exists, it will be overwritten.
  • principals -> object with security settings
  • properties -> object with the fields that will be indexed

(screenshot)
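Put together, the request body might look like the sketch below. The field names come from the list above; the values, and the exact keys inside principals and properties, are placeholders assumed for illustration.

```javascript
// Sketch of an ingestDocument request body (all values are placeholders)
var requestBody = {
  content_pointer: 'a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6', // sysId returned by storeContent in Step 4.1
  document_id: 'KB0010001', // unique identifier; sending it again overwrites the document
  principals: {}, // security settings controlling who may see the result (shape depends on your setup)
  properties: { // fields that will be indexed
    title: 'How to reset your password',
    u_body: 'Plain-text body of the article ...', // additional schema field from Steps 2 and 3
    url: 'https://instanceB.example.com/sp?id=kb_article' // where a clicked search result redirects (see Step 6)
  }
};

// Serialize for the REST step's Request Body
var payload = JSON.stringify(requestBody);
```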

Step 5: Create Flow to run both actions on source instance B

(screenshot)

Step 6: Try it out!

  1. Create a KB article on your source instance B.
  2. Open your target instance A and go to a place where the Search Profile created in Step 3 is used. In my example, it is used in the ESC Portal.
  3. Search for any term that is part of your indexed source (the title, or in my example "u_body").
  4. See the result in your target instance A.
  5. If you click on the result, you will be redirected to the document (which was handed over via the url property; in my case, the KB article in the Service Portal of source instance B).

 

Article in source instance B

(screenshot)

 

ESC Portal AI Search in target instance A

(This article DOES NOT exist in instance A!)

(screenshot)

Redirection to Service Portal in source instance B

(screenshot)

 

Comments
shun6
Giga Sage

Hi @Max Grabner ,

It's a nice article!

I'm also trying to index external content into ServiceNow and make it searchable by AI Search.

I've already created the external content schema table, search profile, etc.

When I execute the external_content ingestDocument API like you did, it succeeds, but the external content schema table is empty and AI Search doesn't return any results.

So, in which table can I see the external content information indexed by the external_content ingestDocument API, and how do I make it searchable by AI Search?

Feddy
Kilo Sage

@Max Grabner 

Hi Max, 

Thanks for this article. Can you provide some input on the following points?

1. Our external source is not a ServiceNow instance.

2. How can we achieve external search using IntegrationHub?

3. How will the flow action be triggered?


Any input on the above would be very useful.

Max Grabner
ServiceNow Employee

Hi @shun ,

the external content schema table stays empty. No records are created during ingestion. It is just a schema so that ServiceNow knows how to map columns.

As stated at the top:

The big advantage of this approach over integrating external content into the knowledge base is that NO KB articles will be created in ServiceNow.

There will be no redundancy of articles in the external source and the target source.

So no data is stored as records that you could "see".

 

What is the source with the external content from which you run the ServiceNow ingestDocument API? Is it also a ServiceNow instance? Or SharePoint, or a website? Is there any error message?

Did you also create the binary content before with the api/now/v2/ais/external_content/storeContent API?

I would need more info to be able to troubleshoot.

ponranjitham
Tera Contributor

Hi @Max Grabner ,

I'm also trying to index external content into ServiceNow and to run the ServiceNow ingestDocument API from a website.

I've already created the external content schema table, search profile, etc.

Please help to make it searchable by AI Search.

Max Grabner
ServiceNow Employee

Have you followed my example?
What is your issue?

ponranjitham
Tera Contributor

Hi @Max Grabner ,

Please refer to the link below for our existing search source setup (Zing):

https://support.quickhelp.com/support/solutions/articles/6000220804-quickhelp-servicenow-search-integration

I am not sure how I can handle the same URL and configure it for AI Search to index it.

Created the Schema table and Indexed Source. Having issues with the flow actions.


Any help on this would be much appreciated

Max Grabner
ServiceNow Employee

Hi @ponranjitham ,

Since you unfortunately don't give any details about your problem, I can only guess.

I am guessing you are trying to index search content from "Quickhelp" within AI Search?

 

Please read my article carefully again: Indexing of external Search Content via AI Search is always based on a data PUSH.

This means your integration will be created on the Quickhelp side.

 

I am neither a member of support, nor familiar with "Quickhelp".

You should reach out to Quickhelp to get support on the data structure.

If you have any specific questions, I am happy to help. But I would need detailed context about your problem and where you are stuck.

MeghaB
Tera Contributor

Hi @Max Grabner ,

My knowledge articles from ServiceNow instance B are searchable in ServiceNow instance A. The only issue is:

Only the short description of the KBs is displayed through AI Search, unlike the KBs that exist in instance A, for which the article name, author, last updated date, and a short body are also visible.

 

I want to make this metadata visible for the KB articles of instance B, searched through AI Search, in instance A.

 

I have created an Entity View Action Mapping (EVAM) View Config that references my Indexed Source, and a View Template.

 

My question is:-

 

1. What will be the custom field names, apart from 'ai_search_teaser_title', 'ai_search_teaser_text' and 'document_index', if I want to display the KB short description, KB number, KB author, and KB last updated on?

 

2. What will be the key attributes for 'KB number, KB author, KB last updated on' to map in the View Template?

 

"mappings": {
    "textHeaderLabelOne": "topic",
    "textHeaderLabelTwo": "number",
    "textHeaderLabelThree": "sys_updated_on",
    "title": "ai_search_teaser_title",
    "summary": "ai_search_teaser_text",
    "ariaLabel": "short_description"
},

Page22
Tera Contributor

Hi @Max Grabner,

I understand that Indexing of external Search Content via AI Search is always based on a data PUSH.

 

Is there an article or guideline to index external Search content via AI Search from a Website?

Sravan Kumar Ka
Tera Contributor

Hi @Max Grabner 

 

Please let me know if the external schema table is considered a custom table.

 

Thank you.

Max Grabner
ServiceNow Employee

@MeghaB

 

You are on the wrong track.

As you can see in my screenshots, I also added the knowledge article body to the indexed source (u_body).

Follow my instructions. You don't have to create an EVAM view config.

 

These are the steps, taken from my initial article, where you need to make adjustments to make additional fields searchable (in my example the article body: kb_knowledge.text):

 

 

 

Step 2: Create Schema table in target instance A

Navigate to "AI Search -> External Content -> Create Schema" 

Give your Schema a name and click Submit. This will create a table that extends v_ais_external_search_content.

The OOTB table should be sufficient, but if you want to include more than just the title in the indexed source, you can add additional columns (in my example, I added "u_body" with type string).

 

Step 3: Create Indexed Source and Search Source

Navigate to "AI Search -> AI Search Index -> Indexed Sources" and click New.

Give it a name and select the newly created table as source.

If you have created additional fields on your schema table, create a field mapping for your table in the indexed source (see screenshot).

 

Step 4.2: Create a flow action with a REST step to ingest documents.

....

Hint: "u_body" is an example for an additional field. See Step 3!

Configure a REST Step and map your inputs to the Request Body.

  • content_pointer -> sysId of the binary content object created in Step 4.1
  • document_id -> unique identifier of the document. If a document with this identifier already exists, it will be overwritten.
  • principals -> object with security settings
  • properties -> object with the fields that will be indexed

Step 6: Try it out!

  1. Create a KB article on your source instance B.
  2. Open your target instance A and go to a place where the Search Profile created in Step 3 is used. In my example, it is used in the ESC Portal.
  3. Search for any term that is part of your indexed source (the title, or in my example "u_body").
  4. See the result in your target instance A.
  5. If you click on the result, you will be redirected to the document (which was handed over via the url property; in my case, the KB article in the Service Portal of source instance B).

 

Max Grabner
ServiceNow Employee

@Page22 

You can check the documentation on External Content in AI Search.

There are three ways to make external content searchable. Maybe there is something suitable for you:

  • Search connector application
    • Easiest way, with pre-built applications from the Store (e.g., the SharePoint Online Search Connector)
  • AI Search spoke
    • Low-code version of the External Content REST API. Might be easier for some use cases
  • External Content REST API (this article!)
    • Most flexible, but also the most complex implementation.

I don't know the website you want to make searchable, but basically, you must use the API to send content from your website to ServiceNow for ingestion on a regular basis. (That is what I did on instance B.)

 

Max Grabner
ServiceNow Employee

@Sravan Kumar Ka 

I assume you are interested in the licensing implications. I would like to encourage you to address this to your ServiceNow account team.

Page22
Tera Contributor

@Max Grabner was step "4.1: Create a flow action with a REST step to create binary content" completed in Instance B or A?

Max Grabner
ServiceNow Employee

@Page22 , 

everything in Step 4 (so 4.0, 4.1 and 4.2) was created on the remote instance B.

 

Carlo Jimenez
Tera Contributor

Hi @Max Grabner

 

Is there a way for us to delete indexed sources? I have indexed sources with the wrong indices, and we want to clean up these records to re-index with a new set of indices.

 

Thanks!

Carlo

Max Grabner
ServiceNow Employee

@Carlo Jimenez , of course.

You can use the delete API with a query that identifies the indexed content you want to delete.

 

External Content Ingestion API – DELETE /ais/external_content/deleteByQuery/{schema_table_name}

https://docs.servicenow.com/bundle/vancouver-api-reference/page/integrate/inbound-rest/concept/exter... 
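As a minimal sketch, the endpoint path can be built from your schema table name. The table name here is a placeholder, and the api/now/v2 prefix is assumed to match the ingest endpoints above; the query that identifies the content to delete goes into the request as described in the API docs.

```javascript
// Build the deleteByQuery endpoint path for a schema table (table name is a placeholder)
var schemaTable = 'u_my_external_schema';
var deletePath = 'api/now/v2/ais/external_content/deleteByQuery/' + schemaTable;
// A DELETE request is then sent to https://<instance>/<deletePath> with the query
```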

Swetha__S
Tera Contributor

Hi  @Max Grabner ,

 

Thanks for the article. Could you please share the details of the JSON Parser step in the action "AIS create content object" mentioned in Step 4.1: Create a flow action with a REST step to create binary content?

 

Thanks,

Swetha

Eric Smith
Tera Expert

thanks! 

Swetha__S
Tera Contributor

Hi @Max Grabner ,

 

Following all the steps in this article, I was able to index the KB articles and fetch search results, but HTML tags are visible in the result content. Please let me know how to avoid the HTML tags in the external search results.

Thanks,

Swetha

NaveenDK
Tera Contributor

Hi @Max Grabner ,

 

Thanks for this article. I'm able to perform the AI search on the portal and fetch external KB articles.

I'm using the same search profile and search source in the Virtual Agent Search Context Configuration, but it's not searching the external source; instead, it looks for data in the internal KB table.

Kindly help me to perform the same search as in the portal.

 

Thanks,

Naveen

savaliya
Tera Expert

Hello @Swetha__S, did you manage to follow the JSON Parser step? I am experiencing the error (JsonParserOperation failed: Extraction of Complex Object failed: JsonStreamParser[0]: JSON must be an object or an array).

(screenshot)

@Max Grabner 

-Lalit

SWETHA S1
Tera Contributor

Hi @savaliya ,

Please refer to the screenshot below to parse the JSON and obtain the value for result.

(screenshot)

Thanks,
Swetha



subbaraop
Tera Contributor

// Define the start date in 'yyyy-MM-dd HH:mm:ss' format
var startDate = '2023-09-01 00:00:00'; // Modify this with your designated start time

// Define an initial end date (6 months from the start date)
var endDate = new GlideDateTime(startDate);
endDate.addMonths(6);

// Function to count records between start and end dates
function getRecordCountBetweenDates(startDate, endDate) {
var gr = new GlideRecord('incident');
gr.addQuery('sys_created_on', '>=', startDate);
gr.addQuery('sys_created_on', '<=', endDate);
gr.query();
return gr.getRowCount();
}

// Function to get the next end date by adding more months
function extendEndDate(endDate, months) {
var newEndDate = new GlideDateTime(endDate);
newEndDate.addMonths(months);
return newEndDate;
}

// Iterate to find the correct date range where record count is around 500,000
var recordCount = 0;
while (recordCount < 500000) {
recordCount = getRecordCountBetweenDates(startDate, endDate);
gs.print('Checking records between ' + startDate + ' and ' + endDate.getDisplayValue() + ': ' + recordCount + ' records found.');

if (recordCount < 500000) {
// If the count is less than 500,000, extend the end date by another 6 months
endDate = extendEndDate(endDate, 6);
} else {
// If record count exceeds 500,000, start narrowing down the date range
break;
}
}

// Now narrow down the date range to find the exact point where count is closest to 500,000
var lowDate = new GlideDateTime(endDate); // Start with the current endDate
lowDate.addMonths(-6); // Go back to the start of this last range
var highDate = new GlideDateTime(endDate); // The last known endDate

// Binary search (on the numeric timestamp) to find the end date where the count is around 500,000
while (lowDate.compareTo(highDate) < 0) {
var midDate = new GlideDateTime();
// Use millisecond values for the midpoint; getDayOfYear() would break across year boundaries
midDate.setNumericValue(Math.floor((lowDate.getNumericValue() + highDate.getNumericValue()) / 2));

recordCount = getRecordCountBetweenDates(startDate, midDate);
gs.print('Checking records between ' + startDate + ' and ' + midDate.getDisplayValue() + ': ' + recordCount + ' records found.');

if (recordCount < 500000) {
if (midDate.compareTo(lowDate) <= 0) break; // bounds have converged; avoid an endless loop
lowDate = midDate; // Move the lower bound up
} else {
highDate = midDate; // Move the upper bound down
}
}

gs.print('Exact end date for around 500,000 records: ' + highDate.getDisplayValue());

subbaraop
Tera Contributor

// Define the start date in 'yyyy-MM-dd HH:mm:ss' format
var startDate = '2023-09-01 00:00:00'; // Modify this with your designated start time

// Define an initial end date (6 months from the start date)
var initialEndDate = new GlideDateTime(startDate);
initialEndDate.addMonths(6);

// Function to count records between start and end dates (manual count)
function getRecordCountBetweenDates(startDate, endDate) {
var gr = new GlideRecord('incident');
gr.addQuery('sys_created_on', '>=', startDate);
gr.addQuery('sys_created_on', '<=', endDate);
gr.query();

var count = 0;
while (gr.next()) {
count++; // Manually count the records
}
return count;
}

// Function to keep adding months until the count is around 500k
function getEndDateWithTargetCount(startDate, initialEndDate, monthsToAdd, targetCount) {
var endDate = new GlideDateTime(initialEndDate);
var recordCount = getRecordCountBetweenDates(startDate, endDate);

// Keep adding months until we hit the target count
while (recordCount < targetCount) {
gs.print('Checking records between ' + startDate + ' and ' + endDate.getDisplayValue() + ': ' + recordCount + ' records found.');

// Add the defined number of months (e.g., 6 months) to the end date
endDate.addMonths(monthsToAdd);

// Recalculate the record count after extending the date
recordCount = getRecordCountBetweenDates(startDate, endDate);
}

gs.print('Reached target record count of ' + targetCount + ' at end date: ' + endDate.getDisplayValue());
return endDate; // Return the end date where we hit the target record count
}

// Set your target record count to 500k and define how many months to add per iteration (e.g., 6 months)
var targetRecordCount = 500000;
var monthsToAddPerIteration = 6;

// Call the function to get the end date where the count reaches 500k
var finalEndDate = getEndDateWithTargetCount(startDate, initialEndDate, monthsToAddPerIteration, targetRecordCount);

gs.print('Final end date where record count is around 500,000: ' + finalEndDate.getDisplayValue());

subbaraop
Tera Contributor

// Define the start date in 'yyyy-MM-dd HH:mm:ss' format
var startDate = '2023-09-01 00:00:00'; // Modify this with your designated start time

// Define an initial end date (6 months from the start date)
var initialEndDate = new GlideDateTime(startDate);
initialEndDate.addMonths(6);

// Function to count records between start and end dates (manual count)
function getRecordCountBetweenDates(startDate, endDate) {
var gr = new GlideRecord('incident');
gr.addQuery('sys_created_on', '>=', startDate);
gr.addQuery('sys_created_on', '<=', endDate);
gr.query();

var count = 0;
while (gr.next()) {
count++; // Manually count the records
}
return count;
}

// Function to fine-tune down to the hour after exceeding 500k
function getEndDateWithTargetCount(startDate, initialEndDate, monthsToAdd, targetCount) {
var endDate = new GlideDateTime(initialEndDate);
var recordCount = getRecordCountBetweenDates(startDate, endDate);

// Keep adding months until we hit or exceed the target count
while (recordCount < targetCount) {
gs.print('Checking records between ' + startDate + ' and ' + endDate.getDisplayValue() + ': ' + recordCount + ' records found.');

// Add the defined number of months (e.g., 6 months) to the end date
endDate.addMonths(monthsToAdd);

// Recalculate the record count after extending the date
recordCount = getRecordCountBetweenDates(startDate, endDate);
}

// Now that we are past 500k, we need to narrow down to within one hour
gs.print('Reached or exceeded target record count of ' + targetCount + ' at end date: ' + endDate.getDisplayValue());

// Go back one month to fine-tune within the range
var fineTuneEndDate = new GlideDateTime(endDate);
fineTuneEndDate.addMonths(-monthsToAdd); // Move back by the last added number of months (6 in this case)

// Narrow it down by adding one day at a time until we exceed or reach the target
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);
while (recordCount < targetCount) {
fineTuneEndDate.addDays(1); // Add one day at a time
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);
gs.print('Checking records between ' + startDate + ' and ' + fineTuneEndDate.getDisplayValue() + ': ' + recordCount + ' records found.');
}

// Now that we've narrowed it down to a specific day, fine-tune to the exact hour
fineTuneEndDate.addDays(-1); // Step back by one day, since we may have exceeded the count
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

// Now add hour by hour until we reach the target count
while (recordCount < targetCount) {
fineTuneEndDate.addHours(1); // Add one hour at a time
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);
gs.print('Checking records between ' + startDate + ' and ' + fineTuneEndDate.getDisplayValue() + ': ' + recordCount + ' records found.');
}

// Return the end date within one hour where we have close to 500k records
gs.print('Final end date within 1 hour of 500k records: ' + fineTuneEndDate.getDisplayValue());
return fineTuneEndDate;
}

// Set your target record count to 500k and define how many months to add per iteration (e.g., 6 months)
var targetRecordCount = 500000;
var monthsToAddPerIteration = 6;

// Call the function to get the end date where the count is within 1 hour of 500k
var finalEndDate = getEndDateWithTargetCount(startDate, initialEndDate, monthsToAddPerIteration, targetRecordCount);

gs.print('Final end date where record count is within 1 hour of 500,000: ' + finalEndDate.getDisplayValue());

subbaraop
Tera Contributor

var startDate = '2023-08-13 00:00:00'; // Modify this with your designated start time
var initialEndDate = new GlideDateTime(startDate);
initialEndDate.addMonths(6);
var targetRecordCount = 100000;
var monthsToAddPerIteration = 6;
var finalEndDate = getEndDateWithTargetCount(startDate, initialEndDate, monthsToAddPerIteration, targetRecordCount);

gs.print(finalEndDate);

 


function getEndDateWithTargetCount(startDate, initialEndDate, monthsToAdd, targetCount) {
var endDate = new GlideDateTime(initialEndDate);
var prevendDate= new GlideDateTime(endDate);
var recordCount = getRecordCountBetweenDates(startDate, endDate);
var currentDate = new GlideDateTime();

while (recordCount < targetCount) {
gs.print('Checking records between ' + startDate + ' and ' + endDate.getDisplayValue() + ': ' + recordCount + ' records found.');

if (currentDate.compareTo(endDate) < 0 && recordCount < targetCount){
return currentDate;
}
if (currentDate.compareTo(endDate) < 0 && recordCount > targetCount){
break;
}

endDate.addMonths(monthsToAdd);
recordCount = getRecordCountBetweenDates(startDate, endDate);
}


var fineTuneEndDate = new GlideDateTime(endDate);

fineTuneEndDate.addMonths(-3);

recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);
while (recordCount < targetCount) {

fineTuneEndDate.addMonths(3); // Add 3 month at a time
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

}

 


fineTuneEndDate.addMonths(-2);

recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);
while (recordCount < targetCount) {

fineTuneEndDate.addMonths(1); // Add one month at a time


if (currentDate.compareTo(fineTuneEndDate) < 0) { // fineTuneEndDate has passed "now"; stop here
return currentDate;
}
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

}


fineTuneEndDate.addMonths(-1);
fineTuneEndDate.addDays(7);
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);


while (recordCount < targetCount) {
fineTuneEndDate.addDays(7); // Add 7 days at a time
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

}


fineTuneEndDate.addDays(-7);
fineTuneEndDate.addDays(1);
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);


while (recordCount < targetCount) {
fineTuneEndDate.addDays(1); // Add one day at a time
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

}


fineTuneEndDate.addDays(-1);
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);
var hours = 60*60;
fineTuneEndDate.addSeconds(hours);


while (recordCount < targetCount) {
fineTuneEndDate.addSeconds(hours); // Add one hour at a time
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

}

fineTuneEndDate.addSeconds(-hours);
var mins30 = 60*30;
fineTuneEndDate.addSeconds(mins30);
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);


while (recordCount < targetCount) {
fineTuneEndDate.addSeconds(mins30); // Add one 30 mins at a time
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

}

fineTuneEndDate.addSeconds(-mins30);
var mins10 = 60*10;
fineTuneEndDate.addSeconds(mins10);
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

 

while (recordCount < targetCount) {
fineTuneEndDate.addSeconds(mins10); // Add 10 at a time
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

}


fineTuneEndDate.addSeconds(-mins10);
var mins5 = 60*5;
fineTuneEndDate.addSeconds(mins5);
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

 

while (recordCount < targetCount) {
fineTuneEndDate.addSeconds(mins5); // Add 5 mins at a time
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

}


fineTuneEndDate.addSeconds(-mins5);
var min = 60;
fineTuneEndDate.addSeconds(min);
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

 

while (recordCount < targetCount) {
fineTuneEndDate.addSeconds(min); // Add one min at a time
recordCount = getRecordCountBetweenDates(startDate, fineTuneEndDate);

}

 

return fineTuneEndDate;
}


function getRecordCountBetweenDates(startDate, endDate) {
var gr = new GlideRecord('incident');
gr.addQuery('sys_created_on', '>=', startDate);
gr.addQuery('sys_created_on', '<=', endDate);
gr.query();
return gr.getRowCount();
}

subbaraop
Tera Contributor

var DateRangeHelper = Class.create();
DateRangeHelper.prototype = {
initialize: function() {},

// Method to get the final end date based on table, field, startDate, and target record count
getEndDateWithTargetCount: function(table, field, startDate, initialEndDate, monthsToAdd, targetCount) {
var endDate = new GlideDateTime(initialEndDate);
var recordCount = this.getRecordCountBetweenDates(table, field, startDate, endDate);
var currentDate = new GlideDateTime();

// Keep adding months until the target count is hit or exceeded
while (recordCount < targetCount && endDate.compareTo(currentDate) <= 0) {
gs.print('Checking records between ' + startDate + ' and ' + endDate.getDisplayValue() + ': ' + recordCount + ' records found.');
endDate.addMonths(monthsToAdd);
recordCount = this.getRecordCountBetweenDates(table, field, startDate, endDate);
}

// Fine-tuning after the target count is exceeded
var fineTuneEndDate = new GlideDateTime(endDate);
fineTuneEndDate.addMonths(-3); // Go back 3 months and start fine-tuning
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);

while (recordCount < targetCount && fineTuneEndDate.compareTo(currentDate) <= 0) {
fineTuneEndDate.addMonths(1); // Add one month at a time
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);
}

// Fine-tuning by days, hours, and minutes
fineTuneEndDate.addMonths(-1);
fineTuneEndDate.addDays(7);
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);

while (recordCount < targetCount && fineTuneEndDate.compareTo(currentDate) <= 0) {
fineTuneEndDate.addDays(7); // Add 7 days at a time
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);
}

fineTuneEndDate.addDays(-7);
fineTuneEndDate.addDays(1);
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);

while (recordCount < targetCount && fineTuneEndDate.compareTo(currentDate) <= 0) {
fineTuneEndDate.addDays(1); // Add one day at a time
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);
}

fineTuneEndDate.addDays(-1);
fineTuneEndDate.addSeconds(60 * 60); // Add one hour
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);

while (recordCount < targetCount && fineTuneEndDate.compareTo(currentDate) <= 0) {
fineTuneEndDate.addSeconds(60 * 60); // Add one hour at a time
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);
}

fineTuneEndDate.addSeconds(-60 * 60);
fineTuneEndDate.addSeconds(60 * 30); // Add 30 mins
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);

while (recordCount < targetCount && fineTuneEndDate.compareTo(currentDate) <= 0) {
fineTuneEndDate.addSeconds(60 * 30); // Add 30 minutes at a time
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);
}

fineTuneEndDate.addSeconds(-60 * 30);
fineTuneEndDate.addSeconds(60 * 10); // Add 10 minutes
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);

while (recordCount < targetCount && fineTuneEndDate.compareTo(currentDate) <= 0) {
fineTuneEndDate.addSeconds(60 * 10); // Add 10 minutes at a time
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);
}

fineTuneEndDate.addSeconds(-60 * 10);
fineTuneEndDate.addSeconds(60 * 5); // Add 5 minutes
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);

while (recordCount < targetCount && fineTuneEndDate.compareTo(currentDate) <= 0) {
fineTuneEndDate.addSeconds(60 * 5); // Add 5 minutes at a time
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);
}

fineTuneEndDate.addSeconds(-60 * 5);
fineTuneEndDate.addSeconds(60); // Add one minute
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);

while (recordCount < targetCount && fineTuneEndDate.compareTo(currentDate) <= 0) {
fineTuneEndDate.addSeconds(60); // Add one minute at a time
recordCount = this.getRecordCountBetweenDates(table, field, startDate, fineTuneEndDate);
}

return fineTuneEndDate;
},

// Helper function to get the record count between two dates
getRecordCountBetweenDates: function(table, field, startDate, endDate) {
var gr = new GlideRecord(table);
gr.addQuery(field, '>=', startDate);
gr.addQuery(field, '<=', endDate);
gr.query();
return gr.getRowCount();
},

type: 'DateRangeHelper'
};

Version history
Last update:
‎02-15-2022 12:55 AM