Question

Overhead of QF Agent Session using Lookups

asked on November 21, 2014

I am working on a Quick Fields session with 47 document classes. Debugging them has been quite a process, but I have the session working pretty well so far. I am ready to put it into production with Quick Fields Agent and plan to refine it based on which document classes turn out to be highest volume during initial testing and production use.

 

I would like to track the number of documents identified in each document class, and then the number of documents successfully stored to the repository after capture. A workflow already acts on the files immediately after Quick Fields processes them, so that covers tracking the documents that were successfully stored to the repository.

 

What I am wondering is whether anyone has experience using Lookups this way: one lookup in each document class that calls a stored procedure to increment a counter. What repercussions might that have for performance and consistency? Will a failed lookup cause the entire document to error out?

 

Additionally, is this something that could be made into a built-in feature for Agent scenarios?

 

Lastly, I am thinking it may be easier, instead of using a lookup, to create a workflow that each document class initiates on identification, which would increment the counter for identified document classes outside of the Quick Fields session. Does anyone have suggestions as to which approach will give the best performance?

Answer

SELECTED ANSWER
replied on November 21, 2014

Your processing speed will be affected by having each document connect to SQL, update a table, and then disconnect. But you would get an accurate count, because Quick Fields document processing is sequential. Workflow, on the other hand, is multi-threaded, so multiple instances would try to update the same table at the same time and the count wouldn't be accurate.
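The lost-update problem described above can be illustrated with a small Python analogy (this is an illustration of the concurrency issue, not Workflow or Quick Fields code): threads doing an uncoordinated read-modify-write on a shared counter can overwrite each other's increments, while serializing the updates, as sequential Quick Fields processing effectively does, keeps the count exact.

```python
import threading

ITERATIONS = 100_000
lock = threading.Lock()

def run(increment, n_threads=4):
    """Run `increment` in several threads and return the final count."""
    state = {"count": 0}
    threads = [threading.Thread(target=increment, args=(state,))
               for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return state["count"]

def racy_increment(state):
    # Uncoordinated read-modify-write: two threads can read the same
    # value and both write value + 1, losing one increment. This is the
    # multi-threaded Workflow scenario described above.
    for _ in range(ITERATIONS):
        state["count"] = state["count"] + 1

def serialized_increment(state):
    # Holding a lock serializes the updates, like sequential
    # Quick Fields processing: the final count is always exact.
    for _ in range(ITERATIONS):
        with lock:
            state["count"] = state["count"] + 1

print("serialized:", run(serialized_increment))  # always 400000
```

Running `run(racy_increment)` instead may come up short of 400000, depending on how the threads interleave.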

If your document classes set a unique field, Audit Trail would be the best tool to get these sorts of statistics on created documents.

If Audit Trail is not an option, then I would parse the QF Agent logs (C:\ProgramData\Laserfiche\Quick Fields Agent\SessionLogs\<session name>\<run id>.log). They contain a processing summary with a count for each document class:

Created document of class Invoice.
Added page 1.
Created document of class Invoice.
Added page 1.
Processing Summary:
2 page(s) processed.
2 identified document(s).
    2 Invoice (2 pages)
0 unidentified document(s).
0 error(s) occurred.
Pages Per Minute: 1539
Document 'MASTER' was successfully stored to: \QF\release90\MASTER
Document 'SAMPLE' was successfully stored to: \QF\release90\SAMPLE
Storing documents summary: (For identified documents)
2 pages stored.
2 documents stored.
0 unstored documents.
0 errors occurred.

How you go about that depends on the resources you have available. Writing a command line utility to read the logs, extract the counts you want, and insert them into SQL would require some programming knowledge, but it would be the more straightforward route. Having Quick Fields Agent text extract the logs and pattern match through them would be more accessible, but it would require more work: converting the .log files (which Quick Fields treats as electronic documents) to .txt (which Quick Fields can text extract), plus some scripting to make sure you only process each file once.
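If you go the command line utility route, here is a minimal Python sketch of the parsing step. The function name is hypothetical, the regex is based only on the log excerpt above (adjust it if your log format differs), and the SQL insert is left as a comment since connection details vary.

```python
import re
from collections import Counter

# Matches identification lines like "Created document of class Invoice."
# (pattern inferred from the log excerpt above; adjust if needed).
CREATED_RE = re.compile(r"^Created document of class (.+)\.$", re.MULTILINE)

def count_identified_documents(log_text):
    """Tally identified documents per document class from one session log."""
    return Counter(CREATED_RE.findall(log_text))

# Example against the excerpt shown above:
sample = (
    "Created document of class Invoice.\n"
    "Added page 1.\n"
    "Created document of class Invoice.\n"
    "Added page 1.\n"
)
print(count_identified_documents(sample))  # Counter({'Invoice': 2})
# In a real utility you would read each <run id>.log file once, then
# insert the per-class counts into your SQL table.
```

You could also cross-check the tallies against the "Processing Summary" section of the same log before inserting them.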
