
Igor Kozlov
on 08-19-2022 12:45 PM
Hi all.
If you need to regularly download your log files, whether to store them for debugging something months old, to import them into ELK for better search, or for any other reason, here is how.
The current platform way to get all your logs is:
System Logs > Utilities > Node Log File Download
Then right-click -> Download Logs from Near Nodes -> Select all -> Select day -> wait until the files are ready.
Then you can Download now, or do it later from Node Log File Download History.
===
That is too many manual steps when you need logs collected daily.
Below are the steps you can run to do it automatically.
1) Compress yesterday's logs into Node Log File Download History and get them ready to download as an attached file by calling:
new NodeLogBackupController().scheduleDownloadLatestLog();
Just schedule it to run daily, for example as in the sketch below.
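A minimal way to do the scheduling is a Scheduled Script Execution (System Definition > Scheduled Jobs) set to run daily, with the call above as its script. The job name and run time here are only examples, not anything the platform requires:
// Scheduled Script Execution, e.g. "Daily node log backup", Run: Daily at 01:00
// Compresses yesterday's logs and creates a Node Log File Download History record
new NodeLogBackupController().scheduleDownloadLatestLog();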
2) Use whatever code you like to do the following:
- authorize the user
- get the latest record from the node_log_download_info table
- download your file using the download_logs.do endpoint
Below I will provide a code example for an out-of-the-box Ubuntu install with no additional packages installed.
Please pay attention: the user that will download the logs
- should have access to the node_log_download_info table
- should NOT have the "Web service access only" checkbox set
#first, define your variables; replace with your actual values
instance=dev321
id=id_of_entry_defined_in_oauth_entity_table
secret=your_secret_of_entry_defined_in_oauth_entity_table
username=user_to_authorize
password=password_of_user
#next, get your token (you can try Basic auth instead if you like)
token=$(curl -k --location --request POST "https://$instance.service-now.com/oauth_token.do" \
--header 'Content-Type: application/x-www-form-urlencoded' \
--data-urlencode "client_id=$id" \
--data-urlencode "client_secret=$secret" \
--data-urlencode 'grant_type=password' \
--data-urlencode "username=$username" \
--data-urlencode "password=$password" | \
python3 -c "import sys, json; print(json.load(sys.stdin)['access_token'])")
#then find the latest logs entry
dataurl="https://$instance.service-now.com/api/now/table/node_log_download_info?sysparm_limit=1&sysparm_query=ORDERBYDESCsys_updated_on"
sys_id=$(curl -k --request GET "$dataurl" \
--header 'Content-Type: application/json' \
-H "Authorization: Bearer $token" | \
python3 -c "import sys, json; print(json.load(sys.stdin)['result'][0]['sys_id'])")
#finally, download the archive
zipurl="https://$instance.service-now.com/download_logs.do?sysparm_sys_id=$sys_id"
logdate=$(date +"%F")
curl -k "$zipurl" -H "Authorization: Bearer $token" --output "logs$logdate.zip"
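To actually run this daily on the Ubuntu box, you can save the commands above as a script and add a cron entry for it. The script path, log path, and run time below are just placeholders:
# example crontab entry (crontab -e): fetch yesterday's logs every day at 06:00
0 6 * * * /home/ubuntu/download_instance_logs.sh >> /home/ubuntu/download_instance_logs.log 2>&1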
Tested on Rome and San Diego, but it should work in future releases as well.
By default logs are collected for yesterday's date, but you can provide any date using the
scheduleDownloadForDate method (see the sketch below).
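As a rough sketch only: the exact parameter type of scheduleDownloadForDate is not shown in this post, so the GlideDate used below is an assumption; check the NodeLogBackupController script include for the real signature.
// hypothetical usage: collect logs for a specific date instead of yesterday
var logDate = new GlideDate(); // GlideDate is an assumption; verify the expected parameter type
logDate.setValue('2022-08-01'); // yyyy-mm-dd
new NodeLogBackupController().scheduleDownloadForDate(logDate);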
Feel free to ask questions.