Hi there,
At a customer site, Import Agent is being used for a mass migration of several million records. Each time a document is stored, Laserfiche calls exec search_in_subfold_by_name_type. Because the records are being stored in a fairly unstructured folder, the stored procedure builds a very large searchResults table that takes a long time to populate. The import rate has dropped from roughly 700,000 to 350,000 pages/day over about 2.5 weeks.
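To give a sense of how I've been quantifying this, here is a minimal sketch that pulls the cumulative cost of that one call from the plan cache (assumes SQL Server 2008 or later; these are cached-plan statistics only, and the procedure name is simply the one that shows up in the trace):

```sql
-- Rough check of how much of the import time this single procedure accounts for.
-- Counters reset if the plan is evicted or the instance restarts.
SELECT OBJECT_NAME(ps.object_id, ps.database_id) AS procedure_name,
       ps.execution_count,
       ps.total_worker_time  / 1000 AS total_cpu_ms,
       ps.total_elapsed_time / 1000 AS total_elapsed_ms,
       ps.total_logical_reads
FROM sys.dm_exec_procedure_stats AS ps
WHERE OBJECT_NAME(ps.object_id, ps.database_id) = N'search_in_subfold_by_name_type';
```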
I figure there are two options:
- Change the stored procedure, limiting it to a toc iteration of around 1,000 during the import period (users aren't supposed to browse the folder during this time, or ever)
- Obtain advice on how to make the procedure execute and populate the @SRTables faster (for example new indexes, table partitioning, or fill-factor and statistics recommendations for the toc table); a rough sketch of what I mean follows this list
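For the second option, this is the sort of maintenance I have in mind, as a sketch rather than Laserfiche-recommended settings. The fill-factor value and the use of FULLSCAN are illustrative assumptions, and dbo.toc is my guess at the schema-qualified name of the TOC table:

```sql
-- Rebuild the toc table's indexes leaving some free space per page, then refresh
-- statistics so the optimizer sees the post-migration row counts.
ALTER INDEX ALL ON dbo.toc
    REBUILD WITH (FILLFACTOR = 80, SORT_IN_TEMPDB = ON, ONLINE = OFF);

UPDATE STATISTICS dbo.toc WITH FULLSCAN;
```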
The customer is running some database maintenance scripts today but has advised:
I’m pretty sure that won’t make any difference to the response.
The data is cached in memory – you will see the cpu time matches the elapsed time i.e. there is no I/O going on.
However, I can rebuild that table.
A sample shows that the pages are at least 60-70% full, though, so at best you will get a 40% improvement, and that improvement will degrade again (assuming this data is where the problem lies).
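For reference, the page-fullness sample the customer describes can be reproduced with something like the following (dbo.toc is again an assumption on my part; SAMPLED mode keeps the check cheap on a table this size while still populating the fullness column):

```sql
-- avg_page_space_used_in_percent is the "60-70% full" figure from the sample.
SELECT i.name AS index_name,
       ps.index_type_desc,
       ps.page_count,
       ps.avg_page_space_used_in_percent,
       ps.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID(N'dbo.toc'), NULL, NULL, 'SAMPLED') AS ps
JOIN sys.indexes AS i
  ON i.object_id = ps.object_id
 AND i.index_id  = ps.index_id;
```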
The customer would prefer not to break the folders down (if that is in fact the issue), as it would require pushing the changes through many test environments again, and they are very close to go-live.
-Ben