
Exporting all alerts to Terraform

Wallet
Kilo Contributor

Using Lightstep, what is the best way to export all the alerts from one project and import them into another project?

 

A team has created nearly 200 alerts and doesn't want to have to recreate them all by hand.

 

1 REPLY

DanTulovsky
ServiceNow Employee

I don't have a good answer for you.

You can export each individual alert via the UI.

(Screenshot: the per-alert export option in the Lightstep UI.)

Which is, of course, painful to do by hand.

You might be able to automate this as follows:

 

1. Use the public API (https://docs.lightstep.com/docs/api-rapidoc-v2#get-/-organization-/projects/-project-/metric_alerts) to get a list of all the alerts you have.  You could, of course, write some code to convert that API response to Terraform directly, if you want to go that route.  Otherwise, grab a list of the top-level `id` parameters, one per alert.

 

2. Then call the endpoint that converts an alert to Terraform. Unfortunately, that endpoint is not available via the public API, so you have to call it as if you were a web browser.
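Step 1 might be sketched like this (a rough, untested sketch: the `api.lightstep.com` base URL and bearer-token auth follow the public API docs, and the response is assumed to wrap the alerts in a top-level `data` array, each entry carrying an `id` — verify both against a real response):

```python
import json
import urllib.request

# Assumption: v0.2 public API base URL per the Lightstep docs.
API_BASE = "https://api.lightstep.com/public/v0.2"


def list_alerts(organization: str, project: str, api_key: str) -> dict:
    """Fetch every metric alert in a project via the public API."""
    url = f"{API_BASE}/{organization}/projects/{project}/metric_alerts"
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {api_key}"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def extract_alert_ids(payload: dict) -> list:
    """Pull the top-level `id` of each alert out of the API response.

    Assumes the alerts live in a top-level "data" array.
    """
    return [alert["id"] for alert in payload.get("data", [])]


# Example usage (hypothetical names):
#   payload = list_alerts("my-org", "my-project", "YOUR_API_KEY")
#   for alert_id in extract_alert_ids(payload):
#       print(alert_id)
```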

 

I exported one alert and, while doing so, watched the network tab in Chrome to see the request it generated.  It's basically the request below, converted to curl. If you put in the `cookie` field as it appears in your browser, you should be able to run this for each of the `id`s you got from the API.

In the snippet below, make sure to set the organization, project, and alert ID fields, as well as whichever of the redacted fields you might need to get this to work.

 

curl -X GET \
  'https://app.lightstep.com/api/v2/organizations/YOUR_ORGANIZATION/projects/YOUR_PROJECT/alerts/ALERT_ID/hcl_export' \
  -H 'authority: app.lightstep.com' \
  -H 'accept: */*' \
  -H 'accept-encoding: gzip, deflate, br, zstd' \
  -H 'accept-language: en-US,en;q=0.9' \
  -H 'baggage: <redacted>' \
  -H 'content-type: application/json; charset=utf-8' \
  -H 'cookie: <redacted>' \
  -H 'lightstep-org-name: <redacted>' \
  -H 'lightstep-organization-client-id: <redacted>' \
  -H 'lightstep-organization-id: <redacted>' \
  -H 'lightstep-project-client-id: <redacted>' \
  -H 'lightstep-project-name: <redacted>' \
  -H 'priority: <redacted>' \
  -H 'sec-ch-ua: <redacted>' \
  -H 'sec-ch-ua-mobile: <redacted>' \
  -H 'sec-ch-ua-platform: <redacted>' \
  -H 'sec-fetch-dest: <redacted>' \
  -H 'sec-fetch-mode: <redacted>' \
  -H 'sec-fetch-site: <redacted>' \
  -H 'traceparent: <redacted>' \
  -H 'user-agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/135.0.0.0 Safari/537.36'
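
To avoid running that curl by hand for ~200 alerts, the same request can be scripted. A minimal, untested sketch (the `export_alert_hcl` helper is mine, not part of any documented API, and you may also need some of the redacted `lightstep-*` headers from the capture above):

```python
import urllib.request

APP_BASE = "https://app.lightstep.com/api/v2/organizations"


def export_url(organization: str, project: str, alert_id: str) -> str:
    """Build the undocumented hcl_export URL captured from the browser."""
    return f"{APP_BASE}/{organization}/projects/{project}/alerts/{alert_id}/hcl_export"


def export_alert_hcl(organization: str, project: str, alert_id: str, cookie: str) -> bytes:
    """Fetch one alert's Terraform (HCL) export using the browser's session cookie."""
    req = urllib.request.Request(
        export_url(organization, project, alert_id),
        headers={
            "accept": "*/*",
            "content-type": "application/json; charset=utf-8",
            # Copied verbatim from a logged-in browser session; add any of
            # the redacted lightstep-* headers if the server rejects this.
            "cookie": cookie,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


# Example usage (hypothetical names): one .tf file per alert id.
#   for alert_id in alert_ids:
#       hcl = export_alert_hcl("my-org", "my-project", alert_id, COOKIE)
#       with open(f"alert_{alert_id}.tf", "wb") as f:
#           f.write(hcl)
```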

 

That'll give you a Terraform file for each alert.  My recommendation, though, would be to organize the result better: use a module, or at least a `for_each` loop where you define the alert resource once and keep the data that generates the alerts elsewhere, depending on how much your alerts differ.  In the long run that's better than dumping each exported alert into its own file.
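
For example, rather than ~200 near-identical resource blocks, the shared shape could be defined once with `for_each` (a sketch only — the `lightstep_alert` resource type and the attribute names here are illustrative; match them to whatever the export actually produces):

```hcl
variable "project" {
  type = string
}

locals {
  # One entry per alert; this map could be generated from the exported
  # files (or loaded from JSON/YAML) instead of hand-writing 200 blocks.
  alerts = {
    "checkout-latency" = { description = "p99 latency too high" }
    "error-rate"       = { description = "error rate above 5%" }
  }
}

resource "lightstep_alert" "all" {
  for_each = local.alerts

  project_name = var.project
  name         = each.key
  description  = each.value.description
}
```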

 

Sorry there is no better solution here. I've also wanted this for a long time, but it never quite made the cut.

 

Dan

 

P.S. I *think* this should work, but I didn't test this myself.