I am doing a conversion and really running into trouble with speed and workflow.
I am on the second-to-last step of converting one cabinet.
The setup is that all the pages for a document sit in a folder named after their doc ID. Each page is named docID_pageNumber, and I am merging them together. I believe the workflow is efficient, as there are no searches and I am capping the number of loops.
So: find all the document ID folders, then for each doc ID find all the pages in it, create a new document, and merge the pages together (a rough sketch of that loop is below). The conditional branch is there to stop the workflow after so many loops, because otherwise it becomes unusable quickly.
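For reference, here is roughly what that loop does, written as a standalone Python sketch rather than as the workflow itself. It assumes the pages are PDF files and uses pypdf for the merge; the /export/cabinet1 paths are just placeholders for wherever the export actually lives.

```python
# Minimal sketch of the per-cabinet merge loop, assuming the pages are PDFs
# and a root folder holds one subfolder per doc ID (both are assumptions).
from pathlib import Path
from pypdf import PdfWriter

ROOT = Path("/export/cabinet1")          # hypothetical export root
OUT = Path("/export/cabinet1_merged")    # hypothetical output folder
OUT.mkdir(exist_ok=True)

for doc_dir in sorted(p for p in ROOT.iterdir() if p.is_dir()):
    doc_id = doc_dir.name
    # Pages are named docID_pageNumber, so sort on the numeric suffix.
    pages = sorted(doc_dir.glob(f"{doc_id}_*.pdf"),
                   key=lambda p: int(p.stem.rsplit("_", 1)[1]))
    if not pages:
        continue
    writer = PdfWriter()
    for page in pages:
        writer.append(str(page))         # append each page file in order
    writer.write(OUT / f"{doc_id}.pdf")
    writer.close()
```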
This is working fine, but there are 65,000 documents left. I capped the loops at 500 and started again.
It's doing maybe 1-2 documents per minute? That's something like 45 days... That can't be how this is going to work - this is one of many, many cabinets, and we would never finish.
The server has 8 GB of RAM and 4 virtual cores. Is there anything I am missing? Are there settings that let the workflow use more cores or resources? I understand that there are sometimes many pages per document, but this still seems incredibly slow.
Thanks,
Chris
Edit/Update - Alright, I split it into two workflows. The first one finds the documents, then invokes a second workflow for each one and passes along the current name and path I need. It will still take some days/weeks, but that is certainly better than what we were looking at before.
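Outside the workflow engine, that same split (find once, then hand each document off) maps naturally onto a small process pool. A rough sketch under the same assumptions as above (pages are PDFs, pypdf is available, and merge_document is just an illustrative helper matching the earlier loop body):

```python
# Sketch of the find/invoke split as a worker pool: one process per document,
# sized to the server's four virtual cores. Paths and helper are illustrative.
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path
from pypdf import PdfWriter

ROOT = Path("/export/cabinet1")          # hypothetical export root
OUT = Path("/export/cabinet1_merged")    # hypothetical output folder

def merge_document(doc_dir: Path) -> str:
    """Merge one doc ID folder; same logic as the loop body in the sketch above."""
    doc_id = doc_dir.name
    pages = sorted(doc_dir.glob(f"{doc_id}_*.pdf"),
                   key=lambda p: int(p.stem.rsplit("_", 1)[1]))
    writer = PdfWriter()
    for page in pages:
        writer.append(str(page))
    writer.write(OUT / f"{doc_id}.pdf")
    writer.close()
    return doc_id

if __name__ == "__main__":
    OUT.mkdir(exist_ok=True)
    doc_dirs = [p for p in ROOT.iterdir() if p.is_dir()]
    # Four workers to match the four virtual cores on the server.
    with ProcessPoolExecutor(max_workers=4) as pool:
        for doc_id in pool.map(merge_document, doc_dirs):
            print("merged", doc_id)
```

Each worker handles one document end to end, so a slow or broken document does not stall the rest, and the pool keeps all four cores busy instead of running one document at a time.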
Another bump for the idea of a "lean" workflow designer, or something that does not carry the overhead of lots of additional items.