I am doing a conversion for a customer. This is just one cabinet that has around 500,000 pages that need to be stitched together into documents.
Here is the base workflow, and it is working. Basically, I have a few queries put together that list pages and their locations (the folders are already imported into Laserfiche). For each row I look for the page; if I find it, I move it to a folder, apply some metadata, and rename it. If I don't find it, I write the row to a table so I can review it later.
Essentially the workflow moves each page into a folder named after the document ID, and the images are renamed docID_PageNumber. I then need to run a second workflow to merge them into documents.
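In pseudocode, the per-row logic looks roughly like this (a minimal sketch; the function names and in-memory structures are placeholders standing in for the actual Workflow activities and the SQL review table, not real Laserfiche APIs):

```python
# Stand-ins for the real systems: a dict for the repository folder tree
# and a list for the SQL table of rows needing review.
repository = {}
missing = []

def find_page(source_path):
    # Placeholder for the "find the page" step; pretend anything under
    # /imported/ was successfully brought into the repository.
    return source_path if source_path.startswith("/imported/") else None

def process_row(doc_id, page_number, source_path):
    page = find_page(source_path)
    if page is None:
        # Page not found: record the row for later review.
        missing.append((doc_id, page_number))
        return
    # Page found: "move" it into the docID folder and rename it
    # to the docID_PageNumber convention.
    repository.setdefault(doc_id, []).append(f"{doc_id}_{page_number}")

process_row("1001", 1, "/imported/scan_0001.tif")
process_row("1001", 2, "/not_imported/scan_0002.tif")
```

After these two rows, the first page ends up filed as `1001_1` and the second is logged for review.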
Anyway, the workflow has been running for days and is only around 2,700 documents in, out of roughly 23,000 documents. Where is the heavy lifting being done? The initial query took a couple of minutes to get rolling, but once into the loop I figured it would be faster than this.
SQL is on a different box and I don't have desktop access, so I can't see its resource usage. On the WF box we are sitting at around 25% CPU for Workflow.
I know it's a lot of data, but based on this trial run of just one cabinet the time frame is not looking great. Are there any resource upgrades we could make? The WF server has 4 cores; I'm not sure if more would help or if the heavy lifting is on the SQL side of things.
Thanks!