ServiceNow Integration w/ Power BI?
‎02-28-2024 09:53 AM
Hi Team,
Has anyone integrated ServiceNow with Power BI? We'd like to use Power BI's reporting functionality with ServiceNow (SN) data.
Thanks,
David
‎02-28-2024 09:58 AM
Hello @davidslavkin ,
Check this link, which talks about Power BI integration: https://www.servicenow.com/community/platform-analytics-articles/how-to-set-power-bi-servicenow-inte...
Apart from this, if you want to export data from ServiceNow as the master data source, you can use the REST API (see the REST API Explorer) provided by the platform to export data from any table (e.g., incident).
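As a sketch of that REST API approach: ServiceNow's Table API exposes `GET /api/now/table/{table}` with `sysparm_*` query parameters for paging, column selection, and filtering. The helper below only builds such a URL (the instance name `dev12345` is a placeholder); authentication and the actual HTTP call are left out:

```python
from urllib.parse import urlencode

def build_table_api_url(instance, table, fields=None, query=None, limit=1000, offset=0):
    """Build a ServiceNow Table API GET URL with common sysparm_* parameters."""
    params = {
        "sysparm_limit": limit,    # page size; page through results via sysparm_offset
        "sysparm_offset": offset,
    }
    if fields:
        params["sysparm_fields"] = ",".join(fields)  # restrict the returned columns
    if query:
        params["sysparm_query"] = query              # encoded query, e.g. "active=true"
    return f"https://{instance}.service-now.com/api/now/table/{table}?{urlencode(params)}"

url = build_table_api_url(
    "dev12345", "incident",
    fields=["number", "short_description", "priority"],
    query="active=true", limit=500,
)
```

Power BI can then call such URLs through `Web.Contents` in Power Query, paging with `sysparm_offset`; note the API also enforces instance-side row limits, which is part of why the CSV-export workarounds below exist.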
☆ Community Rising Star 22, 23 & 24 ☆
‎08-14-2024 09:10 AM
Hi @davidslavkin ,
I assume you've already implemented a solution, but in case you haven't, or for others seeking one, I'll share the workaround I used until I could enroll a data team to build a solution that pulls the SN data into a data lake I could query:
1. I created a report in ServiceNow that contained the fields I needed.
2. I scheduled that report to run as often as I needed the data refreshed, with the CSV export file sent to my Outlook inbox.
3. I developed a Power Automate flow to copy those attachments from my inbox to a SharePoint site.
4. I developed a Dataflow to get the data from the CSV file on the SharePoint site, then cleaned and shaped the data into a star schema data model.
5. I connected to the dataflow's tables in Power BI Desktop and proceeded to build the reports.
I could provide more details if desired.
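The reshaping in step 4 can be sketched roughly as follows. This is a minimal Python illustration of splitting a flat incident export into a dimension table and a fact table (the incident rows are invented; the real work was done in a Power BI Dataflow with Power Query M):

```python
import csv
import io

# Hypothetical sample of a scheduled ServiceNow incident export (step 2 above).
raw_csv = """number,assignment_group,priority,opened_at
INC0001,Service Desk,2,2024-01-05
INC0002,Network,1,2024-01-06
INC0003,Service Desk,3,2024-01-07
"""

rows = list(csv.DictReader(io.StringIO(raw_csv)))

# Dimension table: one row per unique assignment group, with a surrogate key.
groups = sorted({r["assignment_group"] for r in rows})
dim_group = {name: key for key, name in enumerate(groups, start=1)}

# Fact table: incidents reference the dimension by surrogate key, not by name.
fact_incident = [
    {"number": r["number"],
     "group_key": dim_group[r["assignment_group"]],
     "priority": int(r["priority"]),
     "opened_at": r["opened_at"]}
    for r in rows
]
```

Splitting descriptive attributes into dimensions this way is what makes the star schema efficient for Power BI's slicers and relationships.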
I'll add that our current solution, getting the data from the data lake, also took quite a lot of development effort, because the data tables from SN contained a few hundred columns that had to be culled and renamed to business-friendly names, and those weren't simple changes (i.e., not just removing underscores or uniformly changing case). I'd be curious whether the M code I developed to accomplish that was optimal, but after a lot of research and experimentation, I came up with a working solution. Again, more detail can be provided.
Julie
‎09-02-2024 03:08 AM
Hi Julie,
Your reply is very helpful, thanks.
There are third-party connector solutions (which cost money), and the REST API solution is limited in its abilities and can be complex to configure, so exporting CSVs via SharePoint into Power BI Desktop, or publishing into a data lake, are good and stable solutions.
Could you kindly elaborate on how you implemented the dataflow in the SharePoint solution, and in general on the implementation of the data lake solution?
Thanks!
Lior
‎09-03-2024 08:39 AM
Hi Lior,
As far as the implementation of the data lake solution, a specialized team created the data lake and ingested the SN data, so for my part as the developer, I used a layered architecture to get the data from the data lake.
The first layer was a staging dataflow that got the data. But it came with some 300-odd columns with business-unfriendly names, so I developed a query that returned the column names as a single-column list, which I copied to Excel, where I added Keep and New columns. For each column I wanted to keep, I entered a 1, and I entered the new name. (If all the name changes had been uniform, such as removing an underscore, I'd have done that programmatically, but many names needed to be completely changed.) When that was done, I used the Enter Table command in the Power Query Online ribbon and pasted in the data from Excel. Then I developed another query to cull the unneeded rows, apply the name changes, and transpose the list back into column headers. Again, more detail can be provided about those queries.
A second, linked dataflow took that data and did any other cleaning, standardization and transformation, including re-shaping it to a star schema.
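The cull-and-rename logic driven by that Keep/New mapping table can be sketched like this. This is a Python illustration only (Julie's actual implementation was in Power Query M), and the raw SN column names and mapping rows are invented:

```python
# Hypothetical "Enter Table" mapping pasted from Excel:
# Keep = 1 means retain the column; New is the business-friendly name.
mapping = [
    {"Old": "sys_id",            "Keep": 0, "New": ""},
    {"Old": "u_inc_num",         "Keep": 1, "New": "Incident Number"},
    {"Old": "short_description", "Keep": 1, "New": "Description"},
    {"Old": "u_ag_nm",           "Keep": 1, "New": "Assignment Group"},
]

def cull_and_rename(record, mapping):
    """Keep only the flagged columns and rename them per the mapping table."""
    rename = {m["Old"]: m["New"] for m in mapping if m["Keep"] == 1}
    return {new: record.get(old) for old, new in rename.items()}

# One staged row with raw, business-unfriendly column names (invented).
staged = {"sys_id": "abc123", "u_inc_num": "INC0001",
          "short_description": "VPN down", "u_ag_nm": "Network"}
clean = cull_and_rename(staged, mapping)
```

Keeping the mapping as a table rather than hard-coding renames means new columns can be culled or renamed by editing data, not code, which is the main appeal of the Enter Table approach described above.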
For the SharePoint solution, here are instructions for how to create the flow:
Save email attachments to SharePoint with Power Automate – Rishona Elijah (rishonapowerplatform.com)
Let me know if you'd like more detail, and please specify which particular piece of the solution, since there are several.
Good luck,
Julie