02-09-2016 12:54 PM
We have a potential integration with a business department, with roughly 300k user records to import by a method we haven't figured out yet. The question I have, which others also asked during our meeting: does having a large user database affect performance in any way? Let me know your thoughts, thank you.
Labels: Performance Analytics
02-09-2016 01:39 PM
Just having a large user database shouldn't affect performance by itself; however, it can affect the performance of some scripts or imports, particularly things that loop through a long list of users. There are a lot of posts and discussions about best practices for scripting. Here are some things I can think of off the top of my head:
- For import sets, make sure any coalesce field is indexed. This is a HUGE time saver.
- Also on import sets, try to keep the scripts that run on each item simple. If you have to do some type of database lookup for 300K records as they are imported, the import will take a long time.
- When scripting, use filters on your GlideRecord queries. I've seen people return all the user records and then loop through them in the script to find the few they are looking for. It is MUCH faster to let the database filter the records before returning the results to your script (see the sketch after this list).
- On forms with reference fields that point to the user table, use reference qualifiers when possible to limit the number of user records it tries to autocomplete or show when you click the lookup icon.
- When possible, use default filters on list views to limit the number of records shown.
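Here is a minimal sketch of the GlideRecord point above. The table and field names (sys_user, department) are standard, but the department value is just a placeholder for whatever filter you actually need:

```javascript
// Placeholder: the sys_id of the department you care about.
var deptSysId = 'REPLACE_WITH_DEPARTMENT_SYS_ID';

// Slow pattern: pull back every user record, then filter in the script.
var all = new GlideRecord('sys_user');
all.query();
while (all.next()) {
    if (all.getValue('department') == deptSysId) {
        // ... work on the few matching users
    }
}

// Faster pattern: let the database filter the records before they come back.
var gr = new GlideRecord('sys_user');
gr.addQuery('department', deptSysId);  // filter pushed into the query
gr.query();
while (gr.next()) {
    // ... work on the few matching users
}
```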
I'm sure there are other things to consider, but these are the ones that come to my mind right away.
-Steve
02-10-2016 06:39 AM
Steve,
Thanks for your response. I've seen references to some of the things you mention in other posts I've read through. These tips will help when we decide how to proceed. Thanks again!
03-24-2017 03:44 AM
Hi,
At a client we imported many types of external contacts, customers, etc. as users. We ended up with over 1M rows, and without countermeasures performance was, as expected, significantly impacted. Extending the table, or using a table other than sys_user, was evaluated as impractical and would spread the data out, on top of the loss of out-of-the-box functionality.
To avoid performance degradation, I added checkbox fields (a list field was impractical, a choice field invalid because a user can be of several types) to split the users apart, and then indexed those checkboxes. If a specific type of user is not explicitly stated in the query (such as via a reference qualifier on a reference field), a before query business rule on the sys_user table adds a condition that excludes those users by default.
Example: if the sys_user query does not state u_customer=true, then automatically add u_customer=false to the query. By the way, the customer query also includes a setLimit(50), which might be a good idea in terms of security risk mitigation. I also disabled list export and reporting for that subset of the table.
By splitting the table with indexed fields and adding the default-query business rules, performance improved considerably. Please note that the conditions and rule evaluation should never be designed in a way that risks adding overhead or interfering with single-record / get queries / specific sys_id lookups / non-interactive sessions, as single-record queries are constantly executed by the system (ACL evaluation, etc.).
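A rough sketch of what that kind of before query business rule could look like (u_customer and the setLimit(50) come from the example above; the interactive-session check and the simple string match on the encoded query are my own simplifications, so treat this as an illustration rather than the exact rule we used):

```javascript
// Before query business rule on sys_user (illustrative sketch, not the exact rule).
(function executeRule(current, previous /*null when async*/) {

    // Leave non-interactive work alone (background jobs, ACL evaluation,
    // single-record lookups by sys_id, etc.) to avoid adding overhead there.
    if (!gs.getSession().isInteractive())
        return;

    var encoded = current.getEncodedQuery() + '';

    if (encoded.indexOf('u_customer=true') > -1) {
        // The caller explicitly asked for customers: cap the result set.
        current.setLimit(50);
        return;
    }

    // No explicit customer condition: exclude the customer subset by default.
    current.addQuery('u_customer', false);

})(current, previous);
```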
Also, don't forget ACLs.