
Question

Posted to Government


Auditing - Best Practices

asked on September 25, 2014

Hi All,

I work for a California JPA. We use Laserfiche mainly to file our important documents. We have a publicly posted retention schedule that we adhere to, but before we dispose of any paper documents, we want to ensure that they are being filed properly. We conduct monthly audits of our files to ensure that we have all of the documents we should, and that they are properly filed and named. By filing, I mean that the template and metadata are correct and that Workflow has filed the document where it belongs.

 

I have been tasked with revamping our auditing procedures, so I'd love to hear how some of you audit your files. Do you have any best practices? Please share your procedures if they work well for you.


Replies

replied on September 25, 2014

Hi Danielle! Welcome to the Government Group. Excellent question.

 

Here in Palm Beach, we do not do official audits per se, but a couple of ideas popped into my head that I wanted to make sure you were aware of:

  1. I did not see you mention Audit Trail at all, which was the first tool that came to mind. Audit Trail can be a little clunky, though.
  2. Do you export the list of contents from a search? You could save a search profile in the Client to run at set intervals (monthly, quarterly). The records management module uses the Client search to tell us when documents are eligible for destruction.
  3. What about using the Workflow search tool? That would give you the times and paths and whether the workflow completed successfully. It would NOT tell you whether the file currently exists.

 

Hopefully some other folks will chime in here with what they do. Let me know if you would like me to expand on any of the above subjects.

 

Thanks for posting!

replied on September 26, 2014

Hi Danielle!

 

I'd start by pinning down which errors you are auditing for. Are you looking for bad scans? Bad metadata? Inconsistencies? Tampering?

 

For much of this, system design and tools can be used to maintain standards and consistency. Metadata, for example: you can set up field constraints to bring some consistency to the fields at entry, and use Workflow for additional field management.
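
As an illustration of the kind of pattern a field constraint can enforce (constraints are configured in the Administration Console; the field name and pattern below are purely hypothetical):

    import re

    # Hypothetical constraint for a "Policy Years" field: values like "1994-1997".
    POLICY_YEARS = re.compile(r"(19|20)\d{2}-(19|20)\d{2}")

    def is_valid_policy_years(value: str) -> bool:
        """Return True only for values in the expected YYYY-YYYY form."""
        return POLICY_YEARS.fullmatch(value) is not None

    print(is_valid_policy_years("1994-1997"))  # True
    print(is_valid_policy_years("94-97"))      # False -- would be rejected at entry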

 

For scan quality auditing, in some places we will use Workflow for an approval process. Add a QC field (Yes or No) and give only a supervisor rights to modify it. Scan ops will scan all day long into an Incoming folder; then the supervisor will review and set the QC field to "Yes", which kicks off a Workflow to file the doc.
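
To make that routing concrete, here is a minimal sketch of the gate logic the Workflow implements (Workflow itself is configured graphically; the field and folder names are just illustrative):

    # Illustrative only -- mirrors the QC gate, not actual Workflow configuration.
    def route_scanned_document(doc: dict) -> str:
        """Return the destination folder for a scan based on its QC field."""
        if doc.get("QC") == "Yes":            # supervisor approved the scan
            return "Filed/" + doc["DocType"]  # Workflow files it away
        return "Incoming"                     # waits in Incoming until approved

    print(route_scanned_document({"QC": "Yes", "DocType": "Invoices"}))  # Filed/Invoices
    print(route_scanned_document({"QC": "",    "DocType": "Invoices"}))  # Incoming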

 

I generally try to design repositories to be self-maintaining and do as much as I can to minimize human input and control.  Build the process well and you can almost remove the possibility of mis-filed documents.

 

If you can reply with some of the specifics I'm sure we can give you a more precise idea.

replied on October 2, 2014

A few thoughts:

1) For fields that still hold old data from earlier versions of your dynamic fields, you could search that field for whatever the old data was. You could compile an Advanced Search covering all of the old, bad values that should not be there, then have a Workflow or Business Process run on the retrieved entries to remove the bad field data. Really, you could have a Workflow do all the work, but running the search first is helpful for seeing how many problems you have.

 

Here is an example of the Advanced Search with multiple field names:

{[]:[Field Name]="Bad Field 01" | "Bad Field 02" | "Bad Field 03"} & {LF:Name="*", Type="DB"} & {LF:LOOKIN="Example Path\Subfolder1\Subfolder2\"}

 

2) I attended an ARMA seminar where a presenter gave me a great idea. Maybe you can put it to use...

He had a system (not Laserfiche) that validated that a document actually is what the user says it is. What I imagined while he was talking was a Quick Fields zonal OCR session that runs on the page to confirm it matches what the user chose in the metadata or folder path.

 

When you talk about validating whether a page was scanned correctly, it makes me think of Quick Fields performing a test on the page. Do you have Quick Fields? It may get you the rest of the way there.
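
To sketch that page test outside of Quick Fields (this uses the Pillow and pytesseract libraries; the zone coordinates, file name, and expected text are hypothetical):

    from PIL import Image   # pip install pillow
    import pytesseract      # pip install pytesseract (needs the Tesseract OCR engine)

    def zone_matches_metadata(image_path: str, zone: tuple, expected: str) -> bool:
        """OCR a rectangular zone of the page and check it contains the expected text."""
        page = Image.open(image_path)
        text = pytesseract.image_to_string(page.crop(zone))  # zone = (left, top, right, bottom)
        return expected.lower() in text.lower()

    # Hypothetical check: does the title zone actually say "Form 700"?
    if not zone_matches_metadata("scan_0001.tif", (100, 50, 900, 200), "Form 700"):
        print("Flag for review: page text does not match the chosen document type")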

I hope this gives you some ideas :)

replied on October 2, 2014

Thanks for the responses. My idea was actually to export list contents from a search and identify a lot of errors from there, so thank you, Chris - that's what I was thinking. I set up a column structure for each template type we have, so I can run a search for governance documents, apply the governance column structure, export the list contents, and analyze them in Excel - LOVE this feature and I use it a lot.

 

This will allow me to identify any missing fields and any mismatched metadata. We do use dynamic fields, but they've gone through revisions over the years, which is where the mismatches come from. It will also help me spot where we may be missing files - say we have no files of a certain type for 2007 but plenty for 2003-2014. The only thing it cannot do is check for bad scans, verify that a document is what it claims to be, confirm there are no missing or extra pages, or anything else that requires a real live person to look at the document.
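
For anyone curious, the same checks could be scripted instead of done by hand in Excel. A rough sketch, assuming I save the exported list as a CSV with columns like Name, Document Type, and Year (names hypothetical):

    import csv
    from collections import Counter

    REQUIRED = ["Name", "Document Type", "Year"]  # hypothetical required columns

    with open("governance_export.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Flag rows with any missing required metadata.
    for row in rows:
        missing = [col for col in REQUIRED if not (row.get(col) or "").strip()]
        if missing:
            print(f"{row.get('Name', '?')}: missing {', '.join(missing)}")

    # Spot coverage gaps: years in the range with no documents at all.
    years = Counter(row["Year"] for row in rows if (row.get("Year") or "").strip().isdigit())
    if years:
        first, last = min(map(int, years)), max(map(int, years))
        gaps = [y for y in range(first, last + 1) if str(y) not in years]
        print("Years with no documents:", gaps or "none")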

 

Thanks, Michael, that was along the lines of what I was thinking too. I am just trying to ensure that our documents are accurate. We may have a document filed as an FPPC Form 700, which is destroyed after 7 years, that is really a JPA signature page, which we need to keep forever. Or we might have a document that was scanned so poorly it's illegible and would benefit from rescanning, which we can do before destroying the paper copy.

 

I was thinking about setting up a few user approval inboxes for some of the managers. I would grab a select few documents - maybe 10-20 a month - and throw shortcuts into their folders for them to QC. I can set up notifications and reminders on these, and they won't go away until the manager has marked the file as audited and noted whether it requires corrections. We already have an Audited tag that is set up to assign a history entry to the document, e.g. "Danielle Reich performed an audit on 10/2/2014 at 11:26 am." So we do have a trail of who does the auditing.

Thanks for the ideas, guys - this is roughly what I was thinking, so it's good to hear I might be on the right track. I'd love to hear more ideas on how to refine the process. Sorry, I didn't get a notification that there were responses on this, or I would have responded sooner.

 

replied on October 2, 2014

A quick note - Quick Fields would work well here for the page validation.

The users scan into an incoming folder, and Quick Fields could use Form Recognition (I assume it is still called that - I'm more familiar with QF 7 than the current version), Zone OCR, or some combination of modules to recognize the scans and file them away automatically.

 

If the recognition fails, it doesn't file and the scan operator gets to fix it.

 

As for the audits, you could use Workflow to randomly select a certain percentage of documents to send for review. Send the notification and set a deadline for the response. You could even handle notification of rejections, too.

 

So many possibilities!

 

replied on October 3, 2014

We don't have Quick Fields, and some of the documents in question are coverage documents - running to 400+ pages and definitely not something that could be standardized. A typical error we find: "This is not an excess policy for the 1994-1997 year; this is an aggregate stop loss policy for 1991-1997." I wouldn't have the first clue how to get Quick Fields to recognize something like that. The forms that are standard are almost never misfiled.

 

However, on the second point about Workflow randomly selecting files to review: I had that thought too, but I have no idea how to accomplish it, so I was going to back-burner the idea until I figure out a way. If you have ideas on how this could be done, I'd love to hear them! I hate doing anything tedious by hand; I'd much rather spend a few hours figuring out the logic.
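
The selection logic itself seems simple enough outside of Workflow. A sketch of what I have in mind, assuming I first export the candidate entries from a search to a CSV (the file name and column names are hypothetical):

    import csv
    import random

    SAMPLE_SIZE = 15  # aiming for 10-20 documents a month

    with open("audit_candidates.csv", newline="") as f:
        entries = list(csv.DictReader(f))

    # random.sample raises an error if the pool is smaller than the sample, so guard it.
    for entry in random.sample(entries, min(SAMPLE_SIZE, len(entries))):
        print(entry.get("Entry ID", "?"), entry.get("Name", "?"))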

replied on October 3, 2014

How about a Workflow with starting rules along these lines: if a document is within a certain path and has field set 'xyz', then retrieve the text from its pages and run a conditional sequence that checks whether the text contains "1994-1997" or "1991-1997" (or whatever the qualifying text is).

If the required text is not present, have Workflow email you (or the document's creator) a shortcut to the potential culprit. Instead of an email, Workflow could create a shortcut to the culprit(s) in a folder that you designate.
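
The text check at the heart of that Workflow could be sketched like this (the qualifying strings are placeholders, and the print stands in for the email or shortcut activity):

    QUALIFYING = ["1994-1997", "1991-1997"]  # whatever text qualifies the document

    def flag_if_text_missing(entry_name: str, page_text: str) -> bool:
        """Report the entry as a potential culprit if no qualifying text appears."""
        if not any(q in page_text for q in QUALIFYING):
            print(f"Potential culprit: {entry_name} -- email or shortcut goes here")
            return True
        return False

    flag_if_text_missing("Excess Policy", "aggregate stop loss policy 1991-1997")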

replied on July 7, 2015

Hello Danielle: I was just curious whether your question was fully answered, or would you like further insight?

Thanks ahead of time!

 

Matthew Kernodle

Sacramento

96 502 4785

