Hi,
I am implementing a document processing solution.
There is currently a single server running Windows Server 2016 in an Azure environment with 16 GB of RAM and 4 CPU cores. It hosts Laserfiche Server, Import Agent, Workflow, and an imaging application similar to Quick Fields, all installed from the 10.3.1 ISO.
Every other day, 3,000+ documents are scanned on a high-performance scanner and uploaded to a monitored folder. Each uploaded document kicks off a workflow with 15-20 steps.
Some days Import Agent crashes (even though it is set to recover), which causes documents to pile up. When the service is restarted, CPU hits 100% and leads to a lot of issues: imported documents missing pages, documents landing in the Error folder, diminished OCR quality, and so on.
I understand that a distributed environment would handle this well, but the customer wants to explore whether the solution can be adjusted without standing up a whole new server that would sit mostly idle. We are also wondering whether we would simply run into the same problem in a distributed environment.
Is it possible to configure Import Agent to pick up a file only every 5 or 10 seconds, so that the Workflow service has enough time to run its course?
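I'm not aware of a built-in Import Agent throttle like this, so as a fallback I was considering a small feeder that drip-feeds files from a staging folder into the monitored folder at a fixed interval. A minimal C# sketch of the idea, assuming hypothetical paths C:\Scans\Staging and C:\Scans\Monitored:

```csharp
using System;
using System.IO;
using System.Threading;

class ThrottledFeeder
{
    static void Main()
    {
        // Hypothetical paths: the scanner drops files into Staging,
        // while Import Agent monitors Monitored.
        const string staging = @"C:\Scans\Staging";
        const string monitored = @"C:\Scans\Monitored";

        while (true)
        {
            foreach (var file in Directory.GetFiles(staging))
            {
                // Move one file, then wait, so Import Agent only ever
                // sees one new document per interval.
                File.Move(file, Path.Combine(monitored, Path.GetFileName(file)));
                Thread.Sleep(TimeSpan.FromSeconds(10));
            }
            // Nothing staged right now; poll again shortly.
            Thread.Sleep(TimeSpan.FromSeconds(5));
        }
    }
}
```

In practice it would also need to skip files the scanner is still writing, but would something along these lines be a reasonable way to pace the intake?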
Or, is it possible to introduce a 10-second delay in a workflow using a script, if there is no delay activity? Then I could schedule a workflow every hour and trigger a sub-process for each entry in 10-second succession.
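For that, I was picturing a blocking sleep inside an SDK Script activity. A minimal C# sketch based on the shape of the script editor's generated template (the class and namespace names are just the template's placeholders):

```csharp
using System.Threading;

namespace WorkflowActivity.Scripting.DelayScript
{
    // Body of a Workflow SDK Script activity; the ScriptClass90 base
    // class comes from the script editor's generated template.
    public class Script1 : ScriptClass90
    {
        protected override void Execute()
        {
            // Pause this workflow instance for 10 seconds before the
            // next activity runs.
            Thread.Sleep(10000);
        }
    }
}
```

My only worry is that each sleeping instance would hold a Workflow thread, so with thousands of documents the delays themselves might add load. Is that a real concern?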
Or, is it possible to limit the number of cores used by Import Agent to half of what is available?
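If there is no supported setting for this, would pinning the service's processor affinity be sane? A minimal C# sketch, assuming a hypothetical process name of "ImportAgent" (the real name would need to be checked in Task Manager):

```csharp
using System;
using System.Diagnostics;

class LimitImportAgentCores
{
    static void Main()
    {
        // Hypothetical process name; verify the real one in Task Manager.
        foreach (var proc in Process.GetProcessesByName("ImportAgent"))
        {
            // Affinity is a bitmask of allowed cores: 0x3 = cores 0 and 1,
            // i.e. half of this 4-core box.
            proc.ProcessorAffinity = (IntPtr)0x3;
            Console.WriteLine($"Pinned PID {proc.Id} to cores 0-1.");
        }
    }
}
```

I realize the affinity would reset whenever the service restarts, so it would have to be reapplied, e.g. from a scheduled task.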
Or something else?
Thanks in advance.
Adarsh