*** WARNING *** Maximum per transaction log statements (200000) reached. Suppressing further logging
‎04-27-2022 09:31 AM
Hello,
We have a scheduled script that takes three days to finish (it was working fine on Quebec with no issues).
Since we upgraded to San Diego, the script stops writing to the system log (messages coming from gs.info) after one or two hours. We noticed the following message in the system log just after it stops writing:
*** WARNING *** Maximum per transaction log statements (200000) reached. Suppressing further logging
And we can see that the script is still running (active transactions and event log).
When we count the info logs printed by the script through gs.info, they don't exceed 1,000 records.
Can anyone tell me how to check whether the script has really generated 200,000 log statements?
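One way to check is to count the rows the job actually persisted to the syslog table. A minimal background-script sketch (assumptions: the script's gs.info entries show up in System Log with a source like "*** Script", and the job started today; adjust the source filter and date window to match what you see in the Source and Created columns). Note this only counts rows written to syslog; node-log statements such as slow-query logging never land in that table, so they won't show up here.

```javascript
// Run in Scripts - Background (sketch, not a definitive implementation).
// Counts syslog rows created since the start of today for a given source.
var ga = new GlideAggregate('syslog');
ga.addAggregate('COUNT');
ga.addQuery('sys_created_on', '>=', gs.beginningOfToday()); // assumed job start window
ga.addQuery('source', '*** Script'); // assumption: adjust to your job's Source value
ga.query();
if (ga.next()) {
    gs.print('syslog rows for this source today: ' + ga.getAggregate('COUNT'));
}
```

If the count comes back far below 200,000, that is a hint the limit is being consumed by statements that never reach the syslog table (see the node-log discussion below in this thread).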
Thanks
‎07-10-2022 08:28 AM
I guess it's actually referring to the transaction logs (System Log --> Transactions).
‎06-16-2024 07:46 AM
This KB article should help:
https://support.servicenow.com/kb?id=kb_article_view&sysparm_article=KB0656906
The limit of 200,000 applies to all logs, including node logs. The job will generate node logs for slow queries and any gs.print statements, in addition to the gs.log statements you have coded.
I ran into an identical issue where my job had written barely 200 syslog statements, yet it still triggered this warning in the node logs.