How do you audit the quality of knowledge articles attached to incidents?

NancyS1
Tera Expert

We have implemented the Knowledge-Centered Service (KCS) methodology in our department and we're tracking the contribution rate: the number of incident tickets that either have a knowledge article attached or have the box checked by the assigned resource to create a knowledge article from the ticket.
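
For reference, the count we're after is roughly equivalent to this background-script sketch. It assumes the out-of-box "knowledge" checkbox on incident and the m2m_kb_task relationship table, so adjust the field names if your instance differs:

// Sketch: count this month's resolved incidents that either have an article
// attached (m2m_kb_task) or the "Create knowledge" box checked (knowledge field).
var total = 0;
var contributed = 0;
var inc = new GlideRecord('incident');
inc.addQuery('resolved_at', '>=', gs.beginningOfThisMonth());
inc.query();
while (inc.next()) {
    total++;
    var rel = new GlideRecord('m2m_kb_task');
    rel.addQuery('task', inc.getUniqueValue());
    rel.setLimit(1);
    rel.query();
    if (rel.hasNext() || inc.getValue('knowledge') == '1') {
        contributed++;
    }
}
gs.info('KCS contribution rate: ' + contributed + ' of ' + total + ' incidents');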

We're just getting started, but the attachment rate is really starting to grow. I did a little looking around and saw that a couple of people are attaching articles that have nothing to do with the issue. It could be that they need additional training, or they may be trying to get their numbers up.

My problem is that checking what's going on has been a very manual effort. Is there any report, either out of the box or one I could create, that can show me whether an attached article has anything to do with the issue in the ticket?

I'd love to know how or even if others are auditing this type of information. Thanks

1 ACCEPTED SOLUTION

Uncle Rob
Kilo Patron

Without the use of an advanced AI, it'd be hard to understand how "this collection of text, categories, and CIs" has anything to do with "that collection of text, categories, and CIs".

Additionally, I wouldn't try to manage that at the micro level. It's in nobody's self-interest to relate irrelevant articles at scale. One thing you can count on is that nobody actually WANTS to do data entry in ServiceNow, so they're instinctively looking to do the fewest transactions possible. Put another way: if the knowledge isn't being flagged, rated low, or left unused, why should we care?
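
If you do want something to act on, query the feedback rather than the attachments. A background-script sketch along those lines, assuming the stock kb_feedback table and its 'article', 'flagged', 'rating', and 'comments' fields (check the names on your instance):

// Sketch: surface articles that readers have flagged or rated poorly,
// which are better review candidates than policing individual attachments.
var fb = new GlideRecord('kb_feedback');
fb.addQuery('flagged', true).addOrCondition('rating', '<=', 2); // assumes a 1-5 rating scale
fb.orderByDesc('sys_created_on');
fb.setLimit(50);
fb.query();
while (fb.next()) {
    gs.info(fb.getDisplayValue('article') + ' | flagged=' + fb.getValue('flagged') +
        ' | rating=' + fb.getValue('rating') + ' | ' + fb.getValue('comments'));
}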



7 REPLIES

Michael QCKM
Tera Guru

Nancy,

Terribly sad, and terribly common. Likely someone has been telling the staff that they are being measured by their activities: number of article attachments, number of articles created, number of feedback submissions, etc.

These are the poisons sure to corrupt your KB. You'll get what you measure, but not always what you want.

I wish you the best of luck getting what you need out of ServiceNow OOB, but PLEASE SHARE if you do.

What I have done to more easily identify these bogus attachments is create a report against the m2m_kb_task table. I show the task short description next to the article short description to get a feel for whether they mesh. If the wording between the two seems totally off, we review the ticket to see what may have transpired, and then communicate with the agents as needed and train them on KB usage.
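
If you want to go one step past eyeballing the two columns, a crude word-overlap score can sort the list so the least-related pairs surface first. A background-script sketch in the same spirit as that report, assuming the m2m_kb_task table with its 'task' and 'kb_knowledge' reference fields; the scoring is deliberately simple, not AI:

// Sketch: score each incident/article pair by shared words in their short
// descriptions; very low overlap is worth a human look.
function tokens(s) {
    return (s || '').toLowerCase().split(/\W+/).filter(function (w) {
        return w.length > 3; // ignore short/common words
    });
}
var rel = new GlideRecord('m2m_kb_task');
rel.addQuery('task.sys_class_name', 'incident');
rel.query();
while (rel.next()) {
    var t = tokens(rel.task.short_description.toString());
    var a = tokens(rel.kb_knowledge.short_description.toString());
    var shared = t.filter(function (w) { return a.indexOf(w) > -1; });
    if (shared.length < 2) { // threshold to tune
        gs.info(rel.task.getDisplayValue() + ' <-> ' +
            rel.kb_knowledge.getDisplayValue() + ' | overlap=' + shared.length);
    }
}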

P.S. I sort of agree with Robert, but we have some agents who don't understand the big picture.

Thanks for this information. Very helpful!

Michael - this is very helpful. Thanks for your response!