Millions of DUPLICATE_PAYLOAD_RECORDS discovery errors daily from probes and patterns - any way to squelch?
10-17-2018 06:48 PM
We have around 6000 servers being discovered in our networks on a daily basis.
Discovery is generating about 12M error log entries every day with useless error messages about payloads. I think this is new in Kingston, as I don't remember seeing so many daily errors prior to that release.
For example, a common error is DUPLICATE_PAYLOAD_RECORDS for everything from DNS aliases to Tomcat connectors, with the unhelpful instruction to remove duplicate items from the payload:
identification_engine : DUPLICATE_PAYLOAD_RECORDS Found duplicate items in the payload (index 87 and 88), using className [cmdb_ci_dns_alias] and fields [name]. Remove duplicate items from the payload
These errors are often preceded by at least one long message containing the output from the probe or pattern, ending in ":no thrown error".
I've been unable to determine if these errors are benign as the discovered CIs look ok to me.
Has anyone had similar issues, and if so, is there anything to fix? Or how can I disable these errors? Debug properties are false, and the logging level is Information.
I don't think we have a performance issue yet, but 2M useless log updates daily cannot be a good thing.
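To illustrate what the identification engine seems to be objecting to (my own sketch, not ServiceNow code): the payload handed to identification contains two items of the same class whose identifying field values match, and the engine expects those to be collapsed before submission. In plain JavaScript, with made-up data shapes, the de-duplication it is asking for would look roughly like this:

    // Purely illustrative: collapse payload items that repeat the same class + identifying field values.
    var payloadItems = [
        { className: 'cmdb_ci_dns_alias', values: { name: 'app01.example.com' } },
        { className: 'cmdb_ci_dns_alias', values: { name: 'app01.example.com' } }, // the "duplicate item"
        { className: 'cmdb_ci_tomcat_connector', values: { port: '8080' } }
    ];

    function removeDuplicateItems(items, className, fields) {
        var seen = {};
        return items.filter(function (item) {
            if (item.className !== className) {
                return true; // only de-duplicate the class being identified
            }
            var key = fields.map(function (f) { return item.values[f]; }).join('|');
            if (seen[key]) {
                return false; // same identifying values as an earlier item (e.g. index 87 vs 88)
            }
            seen[key] = true;
            return true;
        });
    }

    // Two cmdb_ci_dns_alias items with the same "name" collapse to one.
    var deduped = removeDuplicateItems(payloadItems, 'cmdb_ci_dns_alias', ['name']);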
Labels: Discovery
10-18-2018 06:11 AM
Paul,
I would engage customer support on this. Typically, duplicate records occur for one of two reasons:
1) you are importing data from multiple sources and it is not coalesced correctly
2) you have both probes/sensors and patterns running against the same class, which will yield duplicates.
Customer support will be able to help you if you don't believe it is one of these two things.
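If it would help to confirm which of the two it is, here is a quick background-script sketch (my own, adjust the table and field as needed) that checks whether duplicates actually made it into the CMDB, grouping DNS aliases by name since that is the class from your error:

    // List DNS alias names that occur more than once in the CMDB.
    var ga = new GlideAggregate('cmdb_ci_dns_alias');
    ga.addAggregate('COUNT');
    ga.groupBy('name');
    ga.query();
    while (ga.next()) {
        var count = parseInt(ga.getAggregate('COUNT'), 10);
        if (count > 1) {
            gs.print(ga.getValue('name') + ' appears ' + count + ' times');
        }
    }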
Regards,
Scott Whitten
10-18-2018 07:40 AM
Hey Scott -
Thanks for the reply. I have a HI ticket open as well, and I usually also ask the community to see if anyone else has experienced similar issues.
This is likely not an import issue, since many of the errors are about Tomcat connectors and DNS aliases, which are discovered rather than imported.
It may well be that we have both probes and patterns running, but it's not obvious to me from the discovery logs.
I've not seen you around at the ServiceNow events lately ... I'll be at the Waltham office next week for the CMDB meeting if you're still in that office.
Paul
01-07-2020 01:02 PM
Hello Paul,
Did you ever solve this issue? I am running into the same problem. Please let me know if you found a resolution.
Thanks,
Achu
01-07-2020 01:16 PM
ServiceNow gave us this solution, which has quieted this issue but not completely solved it.
Steps for the solution:
1) In the Tomcat pattern, add a new step under the step "66.3. Filter empty records from tomcat_connector" (as shown in the screenshot).
2) Set the name of the step to "Remove duplicate tomcat connectors".
3) Set the operation to "Set Parameter Value" with:
   Name : $tmp
   Value : EVAL(javascript: var tableWithoutDuplicates = ''; tableWithoutDuplicates = DuplicateRemover.removeDuplicates(${cmdb_ci_tomcat_connector}, ["port"]); CTX.setAttribute("cmdb_ci_tomcat_connector", tableWithoutDuplicates); )
4) Save and publish the pattern, then run a discovery; you should not see the DUPLICATE_PAYLOAD_RECORDS errors for tomcat connectors anymore.
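In principle the same approach should carry over to the other classes from the original post, for example the DNS alias errors. A hedged sketch, assuming the relevant pattern exposes a cmdb_ci_dns_alias table variable the way the Tomcat pattern exposes its connectors (an untested adaptation of the value above, de-duplicating on "name"):

    Name : $tmp
    Value : EVAL(javascript: var tableWithoutDuplicates = ''; tableWithoutDuplicates = DuplicateRemover.removeDuplicates(${cmdb_ci_dns_alias}, ["name"]); CTX.setAttribute("cmdb_ci_dns_alias", tableWithoutDuplicates); )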