IH-ETL FAQ - Unable to find format of the single column
03-26-2024 02:17 PM - edited 03-26-2024 02:18 PM
1. Create a nested-payload JSON data source using the Custom (Load by Script) type, and set the "Data in single column" option to true (a sample nested payload is sketched after this list).
2. Pull an import set from this data source.
3. Go to IH-ETL and create a new mapping based on this data source.
4. Go to the second IH-ETL step, "Preview and prepare data".
5. The error appears: "Unable to find format of the single column. Verify Data Source configuration is correct or contact your CMDB Admin for help."
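For reference, the nested payload in step 1 might look something like the following; the field names and values here are made up purely for illustration:

```javascript
// Hypothetical nested payload shape used to reproduce the issue
var payload = {
    name: "web01",
    os: { family: "Linux", version: "8.4" },
    disks: [
        { mount: "/", size_gb: 100 },
        { mount: "/var", size_gb: 250 }
    ]
};
```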
03-26-2024 02:22 PM
If the sys_data_source has "Data in single column" checked, the u_data column of the import set table is expected to be of type JSON.
Thus, in the data source's custom script, you must use the method below to create a JSON-typed column:
import_set_table.addJSONColumn('u_data', 5000);
Otherwise the data column will be of type string, which causes this error.
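For context, here is a minimal sketch of such a Custom (Load by Script) data source script, assuming the standard script parameters; the addJSONColumn call is the required part from above, while the column name and the sample payload are only illustrative:

```javascript
(function loadData(import_set_table, data_source, import_log, last_success_import_time) {
    // Create the single JSON-typed column that the "Data in single column"
    // option expects; this is the required call described above.
    import_set_table.addJSONColumn('u_data', 5000);

    // Hypothetical nested payload; in practice this comes from your integration.
    var payload = {
        name: 'web01',
        os: { family: 'Linux', version: '8.4' }
    };

    // Insert one import set row with the whole payload in the single column.
    import_set_table.insert({ u_data: JSON.stringify(payload) });
})(import_set_table, data_source, import_log, last_success_import_time);
```

Treat this as a sketch rather than a drop-in script; for example, whether the payload should be stringified or passed as an object may depend on your instance.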
Refer to the platform's custom script data source type guide:
03-28-2024 08:33 AM
Hi @Siyou Li,
I'm trying to import CMDB data from a third-party tool using the same Custom (Load by Script) type of data source. I have a nested payload containing all the related info about a CI, I have added the JSON-type field as mentioned above, and I have checked the "Data in single column" option, which is why I'm using IntegrationHub ETL for this.
However, I'm presented with this error even after increasing the max length to 65,000. I even changed the data source to the JSON file type and imported data using the same nested payload, but the same error appeared. The error occurs in the second IH-ETL step, "Preview and prepare data":
Error: Unable to retrieve schema from import set possibly due to invalid field names. Field names must start with a letter (between A-Z or a-z) or with '_', and must only contain letters (between A-Z or a-z), digits (0-9), or '_'. Verify Data Source configuration is correct or contact your CMDB Admin for help.
I have searched around but didn't find anything useful, except that the field names must follow the naming rules. Any feedback would be highly appreciated.
Thank you,
Suhail
03-28-2024 02:16 PM
Hi @vdsvds,
As the error message suggests, and as in most cases we have encountered, this happens because your raw payload has incompatible characters in its field names. Please double-check that your payload field names comply with the rule stated in the error message. If the issue still exists after that, please reach out to ServiceNow support; if support cannot resolve the problem, they will assign a task to my team to further assist you.
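As an aside, a quick illustrative check of that naming rule (plain script, not a ServiceNow-specific API; the payload below is a made-up example) could look like this:

```javascript
// Illustrative only: validate the top-level keys of a payload against the rule
// in the error message (start with a letter or '_', then only letters, digits, or '_').
var validName = /^[A-Za-z_][A-Za-z0-9_]*$/;
var payload = { ci_name: 'web01', 'serial-number': 'ABC123' }; // hypothetical example
Object.keys(payload).forEach(function (key) {
    if (!validName.test(key)) {
        gs.warn('Invalid field name in payload: ' + key); // flags "serial-number"
    }
});
```

Nested objects would need the same check applied recursively.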
Thanks,
Siyou
10-01-2024 06:22 PM - edited 10-01-2024 06:29 PM
This is just not true. I have tried multiple different approaches to solve this issue, even with very basic JSON objects, and it just doesn't work. The "platform" itself, with these "low-code" implementations, is fundamentally flawed: it fails to simply pass around JSON objects as one would normally expect, especially when consuming data from endpoints.
And the doco you pointed to above doesn't work either: when the Data Source is processed within an ETL, it doesn't even parse that JSON field and its payload into a JSON object to use.
And let's not even mention that, after finding out the Data Stream data source doesn't work as expected, changing it to Custom to add "the fix" above breaks the data source entirely and does not load the script fields correctly.
And no amount of support will solve this.