Overview
This guide is written by the ServiceNow Technical Support Performance team (All Articles). We are a global group of experts who help our customers with performance issues. If you have questions about the content of this article, we will try to answer them here. However, if you have urgent questions or specific issues, please see the list of resources on our profile page: ServiceNowPerformanceGTS.
If you have developed in ServiceNow for a while, then you have probably come across the scenario where you want to build a comma-separated list of items in code. For example, consider the following code:
WARNING: Do not use! There are two major problems with the code below!
function getThingsCreatedByUser(userId) {
    if (!userId) return null;
    var thingsGr = new GlideRecord("u_thing");
    thingsGr.addQuery("sys_created_by", userId);
    thingsGr.query();                                    // Problem 1: no limit on the number of records returned
    var result = "";
    while (thingsGr.next()) {
        result = result + thingsGr.sys_id + ",";         // Problem 2: String concatenation inside a loop
    }
    return (result.length) ? "sys_idIN" + result.substr(0, result.length - 1) : null;
}
Code like this might be called by a reference qualifier or a dynamic filter to get a list of things created by a certain user. However, this code has at least two major weaknesses that could cause your whole ServiceNow node to run low on memory, leading to severe performance degradation for every user logged in to that node:
- There is no limit on the number of items that can go into the comma-separated list.
- It builds the list with String concatenation.
In the rest of this article we will explain why these weaknesses are dangerous and what you can do to avoid them by adopting a few coding best practices.
1. Limit the Size of Lists
One of the most common performance issues that we see is code that generates very large lists of comma-separated Strings. This topic has been discussed in a couple of our other articles, including Database Performance: Improving Slow OR and JOIN Queries [community article] and Performance Best Practices for Server-side Coding in ServiceNow [community article]. We won't go into detail on how to avoid the issue here, since the two articles linked above already do that. Long story short: you must somehow limit the maximum size of your lists. In this article we will show a number of experiments to demonstrate exactly how impactful large comma-separated lists can be.
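As a minimal sketch of that advice, the snippet below caps the query with GlideRecord.setLimit() and logs a warning when the cap is reached. The table name u_thing, the function name, and the 1,000-record cap are illustrative assumptions, not a prescribed standard; tune the cap for your own data volumes.

// Sketch only: cap the query at the database and detect truncation
var MAX_ITEMS = 1000; // illustrative cap

function getLimitedThingIds(userId) {
    if (!userId)
        return [];
    var gr = new GlideRecord("u_thing");
    gr.addQuery("sys_created_by", userId);
    gr.setLimit(MAX_ITEMS + 1); // fetch one extra row so truncation can be detected
    gr.query();
    var ids = [];
    while (gr.next()) {
        if (ids.length >= MAX_ITEMS) {
            gs.warn("getLimitedThingIds: list truncated at " + MAX_ITEMS + " items for user " + userId);
            break;
        }
        ids.push(gr.getUniqueValue());
    }
    return ids;
}

Capping the query at the database with setLimit() keeps the node from ever materializing an unbounded list, which is usually preferable to trimming a list after it has already been built in memory.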
2. Avoid String Concatenation
In multiple tests, we have seen that String concatenation took 2 to 6 times more memory than an Array of Strings with the exact same number of items. We have also seen that building lists via String concatenation takes 1.5 to 4 times longer than with an Array. We ran tests both in client-side Mozilla JavaScript engines and in the Rhino JavaScript engine that runs on ServiceNow's servers.
String concatenation took 2 to 6 times more memory
Aside from our tests, we have also seen "in the wild" that String concatenation can lead to a StackOverflowError in Java. This issue is documented in KB1273582. In turn, a StackOverflowError can lead to all sorts of unpredictable conditions with serious implications such as performance degradation or data corruption. This rare defect seems to occur only when Reference Qualifier code calls a Script Include that builds a large comma-separated String using String concatenation.
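To make the safer pattern for that scenario concrete, here is a minimal sketch of a Script Include that a reference qualifier could call. The Script Include name ThingRefQualUtil, the table u_thing, and the 1,000-record cap are hypothetical placeholders; the point is that sys_ids are collected in an Array and joined exactly once at the end.

// Hypothetical Script Include sketch: collect sys_ids in an Array and join once
var ThingRefQualUtil = Class.create();
ThingRefQualUtil.prototype = {
    initialize: function() {},

    // Returns an encoded query such as "sys_idIN<id1>,<id2>,..." or "" when nothing matches
    getThingsCreatedByUser: function(userId) {
        if (!userId)
            return "";
        var ids = [];
        var gr = new GlideRecord("u_thing");
        gr.addQuery("sys_created_by", userId);
        gr.setLimit(1000); // keep the list bounded, as discussed in section 1
        gr.query();
        while (gr.next()) {
            ids.push(gr.getUniqueValue()); // no per-iteration String concatenation
        }
        // A single join() at the end avoids building thousands of intermediate Strings
        return ids.length ? "sys_idIN" + ids.join(",") : "";
    },

    type: 'ThingRefQualUtil'
};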
Test Results
Rhino Engine, JFR Test
Using Java Flight Recorder (JFR), we measured the two variations below, each building a comma-separated string from 100,000 elements.
Result: Size with Array: 21 MB. Size with String concatenation: 129 MB. String concatenation used 6.14 times more memory.
// Variation 1: String concatenation (note: arr1 is a String here despite its name)
var arr1 = "";
for (var i = 0; i < 100000; i++) {
    arr1 += "some kinda longish string";
}
var res = "sys_idIN" + arr1;

// Variation 2: Array
var arr1 = [];
for (var i = 0; i < 100000; i++) {
    arr1.push("some kinda longish string");
}
var res = "sys_idIN" + arr1; // concatenating the Array implicitly joins it with commas
Firefox Developer Tools test
We created the following HTML file and alternated between pushing the Test 1 and Test 2 buttons. String concatenation used 2.5 times more memory.
<html>
<head>
<title>Memory Test</title>
<script>
    // Test 1: build the list with String concatenation
    function test1() {
        var testString = "test1";
        var delimiter = ",";
        var result = String(testString);
        for (var ia = 0; ia < 100000; ia++) {
            result += delimiter.concat(testString);
        }
        alert(result.length);
        var stringCount = result.split().length;
        stringCount;
        result;
    }

    // Test 2: build the list with an Array and join()
    function test2() {
        var result = "test2";
        resultArr = [String(result)];
        for (var ia = 0; ia < 100000; ia++) {
            resultArr.push(String(result));
        }
        alert(resultArr.join().length);
        var stringCount = resultArr.length;
        stringCount;
        resultArr;
        result;
    }
</script>
</head>
<body>
<button name='test1' onClick='test1()'>Test 1</button>
<button name='test2' onClick='test2()'>Test 2</button>
</body>
</html>
Each time we pushed a button we took a memory snapshot. Here is a screenshot from after the fifth snapshot - after pushing the Test 1 button.
Result: Size with Array: 1 MB. Size with String concatenation: 2.5 MB.
Rhino Engine, "Scripts - Background" Stress Test
For this final test we wanted to stress test an actual ServiceNow instance to see when it started to tip over. We ran the following code via the in-app Scripts - Background module, gradually increasing the number of Strings in the list and recording the execution time of each run until a run took more than 30 seconds.
var timeExceeded = false;
var maxExecutionTimeSeconds = 30;

// Method 1: build the lists with String concatenation
var method1 = function(loops) {
    var sw = new GlideStopWatch();
    var stringsArr = [];
    for (var ia = 0; ia < numArrays; ia++) {
        stringsArr[ia] = "";
    }
    for (var i = 0; i < loops; i++) {
        if (sw.getTime() > maxExecutionTimeSeconds * 1000) {
            gs.warn("exceeded " + maxExecutionTimeSeconds + " seconds while building list of " + loops + " elements.");
            timeExceeded = true;
            return;
        }
        for (var ib = 0; ib < numArrays; ib++) {
            stringsArr[ib] += "some kinda longish string,";
        }
    }
    gs.info(loops * numArrays + ":" + sw.getTime() + ":" + GlideSystemUtil.usedMemoryMB());
};

// Method 2: build the lists with Array.push()
var method2 = function(loops) {
    var sw = new GlideStopWatch();
    var arrayArr = [];
    for (var ia = 0; ia < numArrays; ia++) {
        arrayArr[ia] = [];
    }
    for (var i = 0; i < loops; i++) {
        if (sw.getTime() > maxExecutionTimeSeconds * 1000) {
            gs.warn("exceeded " + maxExecutionTimeSeconds + " seconds while building list of " + loops + " elements.");
            timeExceeded = true;
            return;
        }
        for (var ib = 0; ib < numArrays; ib++) {
            arrayArr[ib].push("some kinda longish string");
        }
    }
    gs.info(loops * numArrays + ":" + sw.getTime() + ":" + GlideSystemUtil.usedMemoryMB());
};

var count = 1;
var factor = 20000;
var numArrays = 20;    // spread the data across 20 objects to stay under the per-object size limit
var methodToCall = 2;  // 1 = String concatenation, 2 = Array

gs.info("factor: " + factor + ", arrays: " + numArrays + ", method: " + (methodToCall == 1 ? "String concatenation" : "Array"));
while (count++ < 100 && !timeExceeded) {
    if (methodToCall == 1)
        method1(count * factor);
    else
        method2(count * factor);
}
ServiceNow has a built-in protection against individual objects growing too large (see KB0749085 [Official ServiceNow Support site]). It does not kick in under all circumstances, but it does when you are running in Scripts - Background. When ServiceNow detects that an object exceeds the maximum size, it stops execution and logs an exception similar to the following:
"String object would exceed maximum permitted size of 33554432"
To work around this restriction, we spread the data across multiple objects (the 20 separate arrays in the script above) so that their combined size could exceed the limit.
The results are quite interesting. Using an Array, the instance could build 40 million 25-character strings before it reached 30 seconds of execution time. Using String concatenation, the code hit 30 seconds of execution time while still building a list of 10 million strings. Also notice that the String concatenation curve turns exponential near the end of the test, potentially indicating that the instance was running out of memory, while the execution time trend for the Array method stayed linear all the way up to 40 million elements. Finally, note that at 10 million elements the Array method completed in just 7.5 seconds while the String concatenation method took over 30 seconds, more than 4 times slower.
Results with String concatenation:
- Elements added to list in 30 seconds: less than 10 million
- Time to add 10 million elements: over 30 seconds
Results with Array:
- Elements added to list in 30 seconds: 40 million
- Time to add 10 million elements: 7.5 seconds
The charts below plot execution time in milliseconds (left Y-axis) and memory in use in MB (right Y-axis) against the number of elements in the list (X-axis, grid lines every 2 million elements):
- Using String concatenation up to 30 seconds (code cancelled after building about 9.5 million elements)
- Using an Array up to 10 million elements
- Using an Array up to 30 seconds (40 million elements)
Summary
As you can see from these tests, creating large lists in code can easily cause a performance headache. If at all possible, use programming strategies that do not require building large lists of items in memory. When you do need large lists, set limits to keep their sizes reasonably small and use Arrays instead of String concatenation to keep memory usage as efficient as possible. Ensure that every engineer writing code in your environment stays memory-conscious by avoiding excessively large lists whenever possible.
Best regards,
Your ServiceNow Global Technical Support Performance Team
Incredible, I thought the opposite was better: concatenating strings rather than pushing to arrays. Thanks.