
02-09-2017 04:13 PM
I'm exporting data from SQL Server Management Studio to CSV files and then importing them into ServiceNow via "Load Data", creating a new import set row and data source. Everything works great except for certain files that contain additional commas. When creating a new data source, I get an error indicating that it expected x columns but found y columns. I need to figure out how to resolve this. Here are more details.
1. I know it's being caused by the additional commas found inside some of the fields. The fields that contain them, however, are enclosed in double quotes, which is valid CSV. I'm baffled why ServiceNow does not allow this.
2. As a test, I can open the file in Excel, search for commas, and replace them with spaces. The error goes away, so I know the root cause is the extra commas. I can't use this Excel find/replace as a permanent solution, though, since I have many tables to do this for.
Here is a simple example showing what my valid CSV file looks like.
header1, header2, header3, header4
data1, "data2 plus, comma in double quotes", data3, data4 <-- ServiceNow doesn't like this.
data5, data6, data7, "data8 plus, comma in double quotes" <-- ServiceNow doesn't like this.
P.S. The above is a valid format according to RFC 4180.
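As a quick way to pre-check a file before loading it, here is a minimal sketch using Python's built-in csv module, which parses quoted fields per RFC 4180 (so commas inside double quotes count as data, not delimiters). It flags any record whose column count differs from the header. This is only an illustration, not anything ServiceNow provides, and the file name "export.csv" is a placeholder.

import csv

def validate_csv(path):
    """Report any record whose column count differs from the header row."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)  # RFC 4180-aware: commas inside quotes are fine
        header = next(reader)
        for record_no, row in enumerate(reader, start=2):
            if len(row) != len(header):
                print(f"Record {record_no}: expected {len(header)} columns, found {len(row)}")

validate_csv("export.csv")  # placeholder file name

If this script reports no mismatches but ServiceNow still complains, the file is probably fine and the problem lies elsewhere (as it turned out to below).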
Labels: Scripting and Coding

02-09-2017 06:43 PM
Hi Joel,
With your prodding, I was able to figure out the root cause of my problem. Here are the details; hopefully they help someone else in the future. Thanks for spending time on this.
I did a deep analysis of each record, looking for inconsistencies. Using Notepad++ with "show all symbols" turned on, I found about 3 records where the content of a data field actually contains embedded CR/LF (carriage return and line feed) characters. Those records therefore span more than one line in the file, which causes ServiceNow to say, "Hey, I found records that don't have the correct column count, dummy!"
This was an elusive problem since I have thousands of rows. Maybe there's a tool out there that can find and report CSV inconsistencies more easily than the way I did it. Anyway, at least I know my problem. Now I have to figure out how to handle incoming garbage data from a SQL Server database.
Have a good night.
Todd
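Building on that finding: the same RFC 4180-aware parser can be used to repair such files, since it reads a quoted field that spans several physical lines back in as one logical record. Below is a minimal sketch, again in Python purely as an illustration, that replaces embedded CR/LF inside field values with spaces and writes a cleaned copy. The file names are placeholders.

import csv

def clean_embedded_newlines(src, dst):
    """Copy a CSV file, replacing CR/LF embedded in field values with spaces."""
    fixed = 0
    with open(src, newline="", encoding="utf-8") as fin, \
         open(dst, "w", newline="", encoding="utf-8") as fout:
        writer = csv.writer(fout)
        # The csv reader joins quoted fields that span multiple physical
        # lines back into a single logical record.
        for row in csv.reader(fin):
            cleaned = [f.replace("\r", " ").replace("\n", " ") for f in row]
            if cleaned != row:
                fixed += 1
            writer.writerow(cleaned)
    print(f"Rewrote {src} -> {dst}, fixed {fixed} record(s) with embedded line breaks")

clean_embedded_newlines("export.csv", "export_clean.csv")  # placeholder file names

Running this before "Load Data" would collapse each multi-line record onto one line, which should make the column counts consistent again.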

02-09-2017 06:49 PM
Hi Todd,
Thanks for the update.
Does that fix your other problem too? (How to resolve loading CSV data when SN automatically creates column with non-standard ASCII charact... )
Cheers,
Joel
08-05-2024 02:31 PM
Hi Todd,
Thanks for sharing the information. I'm interested to know how you ended up dealing with the incoming garbage data from the SQL database later on.
Thanks,
Gopi