06-05-2014 06:30 AM
I am faced with a company split, and because of a lack of time we have cloned the current instance over the 'new' company's OOB production instance. We know we want to keep all records related to employees who are going to the new company, but we want to purge users and data belonging to the old company. Apart from being time consuming, I'm pretty sure that sorting and deleting records will be easy enough; I'm just trying to think through the future implications of deleting those records. I was wondering if anyone might have some input or had been through a similar situation.
Thanks,
Tracy
- Labels: Service Mapping

06-05-2014 10:26 AM
Using gs.sql is not a good idea at all unless you really know what you are doing. Even then, stay away. There's a good reason it is not documented in the wiki.

06-05-2014 10:50 AM
I agree about the risks of gs.sql, but for large datasets it is much faster than using deleteMultiple.
Either way, it takes careful thought to decide how to split/delete the data and to understand the impact of whichever approach you choose (domain separation, data deletion).
Another, cleaner option could be the post-clone cleanup scripts approach, which includes deleting records and regenerating indexes, but it requires scripting and knowledge of the table relationships.
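For what it's worth, the supported-API route would look something like the sketch below: a server-side background script using GlideRecord and deleteMultiple(). The table name, the company field, and the sys_id value are assumptions for illustration only; check your own schema, and always test on a sub-production clone first.

```javascript
// Hedged sketch (assumptions: target table 'sys_user', a 'company'
// reference field, and OLD_COMPANY_SYS_ID as a placeholder sys_id).
// Run as a background script on a sub-production clone first.
var OLD_COMPANY_SYS_ID = '...'; // placeholder - substitute the real sys_id

var gr = new GlideRecord('sys_user');
gr.addQuery('company', OLD_COMPANY_SYS_ID);
gr.query();
gs.info('About to delete ' + gr.getRowCount() + ' user records');

// deleteMultiple() removes every record matching the query in one call.
// Unlike gs.sql, it still fires business rules and cascade/delete rules,
// which is slower but far safer.
gr.deleteMultiple();
```

Repeat per table, and keep domain separation and reference fields in mind when choosing which tables to touch.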

06-05-2014 11:20 AM
Hi all,
If I were faced with this situation, I would go with the idea of exporting all the old data into a repository for safekeeping (in case it's needed later), then filtering (via the list view) down to the data I want to KEEP. After that you can manually delete table data one table at a time using the "Delete All" functionality under Tables & Columns.
So:
1 - Export all tables (each entire table, with no filters) you think you may need into a folder/repository for safekeeping
2 - Filter the list down to the data you want to KEEP and perform another export - this will be imported back in later
3 - Delete all table data using the Delete All functionality under Tables & Columns
4 - Import the file you created in step 2 so that you have the needed data.
Obviously this will be messy and there are other considerations to think about, but this would be my basic plan of attack. It may be a good idea to keep one of your instances as-is just in case there's a need to go back (clone).
Good luck!!!
Jason
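Before pulling the trigger on step 3's Delete All, a dry-run count per table can catch a bad filter. The sketch below is only an illustration: 'incident' and the encoded query are hypothetical placeholders, and it uses GlideAggregate to count what the keep-filter would preserve versus what would be purged.

```javascript
// Hedged sketch: count keep vs. purge for one table before deleting anything.
// 'incident' and KEEP_QUERY are placeholders - substitute your own table
// and the encoded query you built in the list view for step 2.
var KEEP_QUERY = 'company=NEW_COMPANY_SYS_ID'; // hypothetical filter

var keep = new GlideAggregate('incident');
keep.addEncodedQuery(KEEP_QUERY);
keep.addAggregate('COUNT');
keep.query();
var keepCount = keep.next() ? parseInt(keep.getAggregate('COUNT'), 10) : 0;

var total = new GlideAggregate('incident');
total.addAggregate('COUNT');
total.query();
var totalCount = total.next() ? parseInt(total.getAggregate('COUNT'), 10) : 0;

gs.info('Keep: ' + keepCount + ', purge: ' + (totalCount - keepCount));
```

If the purge number looks wrong, fix the filter before exporting or deleting anything.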