Transforming a Comma Separated Field on CSV import

Andrew Bettcher
Kilo Sage

Hi,

Just working out how to ask this question has fried my noodle, let alone actually solving it.

We take an extract from a 3rd party system and then push that through a series of transform maps to create or update contracts, assets and CIs in ServiceNow.

The problem is that the 3rd party tool doesn't allow for serialised product items. Now, we want to transfer serial numbers from the other system and make individual, separate asset records in ServiceNow.

I'm going to get a CSV extract with a series of lines on it. One of the columns is going to contain a list of serial numbers separated by a comma. For example, if we've sold 15 XPU servers, there will be a single line for XPU servers with a quantity of 15 and with a list of 15 serial numbers in the Serial Number column.
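For illustration (column names here are assumptions, not the real extract), such a line might look like this. Note that because the serial list itself contains commas, that field has to be quoted in the CSV or the import will split the row incorrectly:

```csv
product,quantity,serial_number
XPU Server,15,"SN0001,SN0002,SN0003,SN0004,SN0005,SN0006,SN0007,SN0008,SN0009,SN0010,SN0011,SN0012,SN0013,SN0014,SN0015"
```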

I need the import and transform to take that one line and create 15 asset records all with a unique serial number taken from that list.

I'm sure it can be done but I can't even imagine where to begin. I guess it would need a transform script to populate the import table based on the quantity field?

Any help here would be greatly appreciated.

1 ACCEPTED SOLUTION

Hopefully this will get you going.  It is best practice to name your variables for what they actually are so the code is easy to read and understand.  Without seeing the source table's data model it's hard to help.

if (parseInt(source.u_quantity, 10) > 1) {
	var serialNumberList = source.u_serial_number.toString().split(",");
	for (var i = 0; i < serialNumberList.length; i++) {
		var serialNumber = serialNumberList[i];
		var asset = new GlideRecord("alm_hardware");
		asset.serial_number = serialNumber;
		asset.OTHER_FIELD = source.OTHER_SOURCE_FIELD; // replace with your real field mappings
		asset.insert();
	}
	// In an onBefore transform script you may also want "ignore = true;" here,
	// so the map's own target insert doesn't create a duplicate row.
} else {
	var asset = new GlideRecord("alm_hardware");
	asset.serial_number = source.u_serial_number;
	asset.OTHER_FIELD = source.OTHER_SOURCE_FIELD; // replace with your real field mappings
	asset.insert();
}


9 REPLIES

ChrisBurks
Mega Sage

It should be possible with a transform script. For example, if a transform field mapping uses a script, within the script use source.<name_of_field> to get the field that stores the serial numbers, iterate through the values, and use a GlideRecord script to create the assets.
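As a rough sketch of that split-and-iterate pattern (field names like u_serial_number are assumptions; GlideRecord only exists server-side in ServiceNow, so the insert is shown here behind a callback so the splitting logic stands on its own):

```javascript
// Split a comma-separated serial list and create one asset per serial.
// In a real transform script the callback body would be a GlideRecord
// insert on alm_hardware; here it is injected so the logic is testable.
function createAssetsFromSerials(serialField, createAsset) {
	var serials = String(serialField).split(",");
	for (var i = 0; i < serials.length; i++) {
		var serial = serials[i].replace(/^\s+|\s+$/g, ""); // trim whitespace (ES5-safe)
		if (serial.length > 0) {
			createAsset(serial); // e.g. gr.serial_number = serial; gr.insert();
		}
	}
}
```

In the actual transform script the callback would wrap new GlideRecord("alm_hardware"), as in the accepted answer below.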

 

Andrew Bettcher
Kilo Sage

Hi,

 

I've been able to split the string:

if (source.u_quantity > '1'){
	var str = source.u_serial_number;
	var arr = str.split(',');
	for(var i=0;i<arr.length;i++){
		var value = arr[i];
		target.comments = (value);
	}
}

I've got it populating an unused field on the target table as a test. It's inserting the last value in the string, so I know I'm separating out the values OK. I'm just not sure how to iterate through and take the values separately.
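(The reason only the last value survives is that each pass of the loop overwrites the same target field; the per-serial work has to happen inside the loop. A minimal plain-JavaScript illustration, not ServiceNow-specific:)

```javascript
// Reassigning one variable each iteration keeps only the final value...
function lastOnly(list) {
	var field;
	for (var i = 0; i < list.length; i++) {
		field = list[i]; // overwrites on every pass
	}
	return field;
}

// ...whereas acting on each value inside the loop keeps all of them.
function perItem(list) {
	var results = [];
	for (var i = 0; i < list.length; i++) {
		results.push(list[i]); // handle each value here, e.g. insert a record
	}
	return results;
}
```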

I'm gonna get there but if anyone can post some lovely script that will help......;-) 

 

Hopefully this will get you going.  It is best practice to name your variables for what they actually are so the code is easy to read and understand.  Without seeing the source table's data model it's hard to help.

if (parseInt(source.u_quantity, 10) > 1) {
	var serialNumberList = source.u_serial_number.toString().split(",");
	for (var i = 0; i < serialNumberList.length; i++) {
		var serialNumber = serialNumberList[i];
		var asset = new GlideRecord("alm_hardware");
		asset.serial_number = serialNumber;
		asset.OTHER_FIELD = source.OTHER_SOURCE_FIELD; // replace with your real field mappings
		asset.insert();
	}
	// In an onBefore transform script you may also want "ignore = true;" here,
	// so the map's own target insert doesn't create a duplicate row.
} else {
	var asset = new GlideRecord("alm_hardware");
	asset.serial_number = source.u_serial_number;
	asset.OTHER_FIELD = source.OTHER_SOURCE_FIELD; // replace with your real field mappings
	asset.insert();
}

Excellent Michael. Thank you.

I also appreciate the advice regarding the naming of variables. I'm self taught and have a background in Service Management so things like this are really useful to me. I'm slowly learning JavaScript using web resources and "on the job" bits and pieces. Someone presents a problem and I piece together a load of scripts from different places to end up with a solution.

When I look back at the "solutions" that I put in place 18 months ago I shudder at how inefficient they are, and I guess I'll do the same in another 18 months.

I will give this a go and post my results here.