
Question

Does Quick Fields have a memory limit?

asked on February 2, 2016

I have a TIFF document with 49 pages. Total file size is 23,000 KB.

When I run this through my Quick Fields session, it works well until the last 6 documents and then seems to freeze. If I rearrange the page order in Laserfiche, the failed documents will then work and a different set of 6 will fail.

Message count is 637

If I drop any 6 pages, making it 43 pages, it processes without errors.

I am seeing the same thing in another Quick Fields session, which fails on anything over 100 pages.

Is there a memory limitation?

Am I trying to do too much in Quick Fields?

 


Replies

replied on February 3, 2016

The only thing I can think of inside Quick Fields would be under Tools -> Configure Scan Source -> Retrieval Limits. I'm not aware of Quick Fields having a memory limit. I have folks that scan in 50+ page color PDFs without much trouble.

Outside of Quick Fields, if you have templates being assigned, ensure you don't have any limits there. Additionally, double-check hard drive space.

Good luck!

replied on February 3, 2016

I do not have any retrieval limits set.

Lots of hard drive space.

I am only trying to scan 1 file which contains 49 pages.

replied on February 3, 2016

Hi Mike,

 

I've had this a couple of times with QF, and on both occasions it turned out to be a disk read/write error on the HDD of the local machine where QF was installed. It's a misleading error, but it's worth running chkdsk on the machine to rule it out.

 

Hope this helps!

replied on March 11, 2016

This is a bump to see if others have had this issue.

replied on February 3, 2016

That's not a memory issue. Please have your reseller open a support case and attach the original document.

replied on March 11, 2016

There has been a ticket opened on this. So far there is no resolution. They have requested that we install QF on a different machine. Our IT department is reluctant to do this, as they feel it is a software issue.

So today I had our third party IT company in testing the system with me.

When running a QF session, it starts a program called LFOmniOCR.exe.

As the pages are scanned and documents are created from each page, the committed memory climbs. Once it hits 1.8 GB, QF fails.

This appears to be a memory issue. Once a page is processed and a document created, LFOmniOCR.exe is not releasing its resources; it just continues to add to the committed memory. This is very repeatable with multiple document types and different QF sessions.
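For anyone wanting to log the same climb on their own machine, here is a minimal sketch that polls LFOmniOCR.exe with Windows' built-in `tasklist` command. It assumes `tasklist`'s CSV output format, and note that its "Mem Usage" column reports working set, a rough stand-in for the committed memory described above:

```python
import csv
import io
import subprocess

LIMIT_KB = int(1.8 * 1024 * 1024)  # ~1.8 GB, the point where QF fails


def mem_usage_kb(csv_line):
    """Parse one line of `tasklist /FO CSV` output and return the memory column in KB.

    Expects the Windows format, e.g. '"LFOmniOCR.exe","1234","Console","1","1,843,200 K"'.
    """
    row = next(csv.reader(io.StringIO(csv_line)))
    return int(row[4].rstrip(' K').replace(',', ''))


def check_lfomniocr():
    """Ask Windows for LFOmniOCR.exe and report how close it is to the failure point."""
    out = subprocess.run(
        ['tasklist', '/FI', 'IMAGENAME eq LFOmniOCR.exe', '/FO', 'CSV', '/NH'],
        capture_output=True, text=True,
    ).stdout
    for line in out.splitlines():
        if line.startswith('"LFOmniOCR.exe"'):
            kb = mem_usage_kb(line)
            status = 'OVER' if kb >= LIMIT_KB else 'under'
            print(f'LFOmniOCR.exe: {kb:,} KB ({status} the ~1.8 GB failure point)')


# check_lfomniocr()  # run periodically on the QF machine (Windows-only) while a session processes pages
```

Running something like this between pages makes the steady climb toward the ~1.8 GB ceiling easy to capture for a support case.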

 

replied on March 15, 2016

It's interesting that QF is expanding a 23,000 KB file by roughly 80 times (1.8 GB is about 1,843,200 KB). Since you're using QF, I'm assuming the documents are going to a network location first. Are they being stored as a PDF or a TIFF? If they are being stored as a PDF, do you have your sessions set up to store it as an image?

replied on March 15, 2016

They are being processed from the scanner as TIFF and put into Laserfiche as TIFF.

replied on March 15, 2016

As a test, have you tried a different file format?

replied on March 15, 2016

I have not. However, I have had technical support run the session with my files successfully.

I have installed the latest updates, so I am at 9.0.1.496.

replied on March 15, 2016

I must be missing something. I have a file that is a PDF; why can't Quick Fields process it?

replied on March 17, 2016

We have narrowed the issue down to the Universal Capture Engine. If we use the Laserfiche Capture Engine, the LFOmniOCR.exe program does release its memory as each page is processed. However, when using the Universal Capture Engine, it does not release its memory; it maxes out and fails.

replied on March 30, 2016

Laserfiche has been able to replicate my issue and will be working on a solution.

 

replied on May 16, 2016

Update:

As of 3/21/2016, LF Support recognized this as a bug when using Universal Capture, but not LF Capture. Quick Fields 10 is supposed to fix this issue.

The targeted release for Quick Fields 10 is Q2 2016 per https://support.laserfiche.com/ProductRoadMap.aspx

Looking forward to the fix.

replied on June 23, 2016

I have been dealing with this issue as well. I just upgraded to Quick Fields 10 (the client upgrade, not Quick Fields Server). The count before it stops scanning is much greater, but I have still encountered the same issue since upgrading. I am using Universal Capture to process large batches of PDFs from a network drive.

The error occurs even between batches: I can scan a small batch, have it process perfectly, and then scan the next batch and have it fail when it hits the "magic number." You have to close the session to effectively clear the cache and start over.

replied on June 23, 2016

Hi Amy, 

Please have your reseller open a support case about this issue. 
