
Question

SDK Script to read csv and update metadata fields

asked on January 28, 2015

Hi Guys,

 

Hoping someone can help me on this one.

 

I have a client who wishes to drop csv files into Laserfiche and then have workflow interrogate the file and write the cell values to the metadata fields.
When testing, I had set up Workflow to use "Retrieve Document Text" followed by pattern matching, which worked, but now the user wants to use Import Agent to import the CSV file, and Import Agent won't generate the text. I have also tried using Distributed Cluster to OCR the document before retrieving the text, but still no good.

 

I am now thinking the only way to do this would be an SDK script. I have a script that takes metadata fields and writes them to a CSV, but I'm not sure how to do the opposite.

 

I imagine just reading the text and writing it to a token would work for pattern matching, but I would prefer to read the cells directly and use the cell 'A' value as the "Invoice Number" value.

 

Hope this makes sense and someone can point me in the right direction.

 

Many thanks

Tina
 


Replies

replied on January 28, 2015

Hi Tina,

If you're using the SDK, you can read the contents into a token (or a string variable) for processing.

First load the CSV into a LaserficheReadStream object with the DocumentInfo.ReadEDoc method. Then you can read the text into a token or string variable from the LaserficheReadStream object.

-Ben
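The flow Ben describes (load the edoc into a stream, then read the stream into a token or string) can be sketched briefly. An actual Laserfiche SDK script would be C# or VB.NET; Python is used here only to illustrate the stream-to-string step, with `io.BytesIO` standing in for the `LaserficheReadStream` returned by `ReadEdoc`:

```python
import io

def stream_to_text(stream, encoding="utf-8"):
    """Read the full contents of a byte stream into a string,
    the way you would read a LaserficheReadStream into a token."""
    return stream.read().decode(encoding)

# io.BytesIO stands in for the stream returned by DocumentInfo.ReadEdoc;
# the sample CSV content is made up for illustration.
edoc = io.BytesIO(b"InvoiceNumber,Amount\r\n123456,99.50\r\n")
text = stream_to_text(edoc)
```

Once the text is in a string (or token), it can be parsed or pattern-matched as usual.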

replied on January 28, 2015

Thanks Ben, that is the way I was thinking of going with it, but it means I still need to use pattern matching. I was hoping I could get the script to say that the value in cell A, "123456", should populate the Invoice Number field.

As long as I get it working then either way is fine :) thanks again for your input.

Can you tell me if there are any help files regarding LaserficheReadStream objects?

 

- Tina

replied on January 28, 2015

If you export the CSV file, you can use ODBC to read it like you would an Excel or Access file.  That would maintain the row and column layout without using pattern matching.
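With the ODBC text driver, the folder acts as the database and each CSV file as a table, so values come back already split into rows and columns. As a minimal sketch of that same row/column access (using Python's `csv` module rather than a real ODBC connection, and with assumed column names):

```python
import csv
import io

# Sample CSV content; in practice this would come from the exported file.
csv_text = "InvoiceNumber,Amount\n123456,99.50\n123457,12.00\n"

# DictReader exposes each row as a column-name -> value mapping,
# much like a query over the CSV "table" would.
rows = list(csv.DictReader(io.StringIO(csv_text)))
first_invoice = rows[0]["InvoiceNumber"]  # no pattern matching needed
```

Because each value is addressed by row and column name, no pattern matching is needed to pull, say, the invoice number out of the first row.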

replied on January 28, 2015

Tina, there's the SDK.Net documentation that comes with the SDK Kit. That's what I use.

-Ben

replied on January 29, 2015

Workflow can read directly from a CSV file (the "Query Data" activity can treat folders with CSV files like a database using the CSV ODBC driver), which might make things much easier for you. Note that you have to give Query Data a fixed location for the CSV file (the parent folder that holds the CSV file(s) will be treated like a database, and each CSV file inside of it a table). If you can run a workflow once a day to pull information from the CSV file and update the documents in Laserfiche, you can just update the CSV file and let Workflow work its magic.

 

If CSV files will be submitted sporadically, it'll probably be easier to just write a script. Workflow would still handle the vast majority of the heavy lifting, but the script would get the contents of the CSV file, parse the contents (see this Stack Exchange article for more information on parsing CSV files), and place each row into a multivalue token. You're right, it would take a little bit of pattern matching to get the values out of the multivalue token, but it should be simple. You can iterate through the token using a "For Each Value" activity and, for each row, find the document that needs metadata and apply it.
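Tina's goal (cell A populates Invoice Number) is really a mapping from column position to field name. A hedged sketch of that mapping step in Python; the field names in `FIELD_MAP` and the sample row are assumptions, and applying the resulting values to a document would be done through the SDK or Workflow:

```python
import csv
import io

# Hypothetical mapping from CSV column position to Laserfiche field name.
FIELD_MAP = {0: "Invoice Number", 1: "Invoice Date", 2: "Amount"}

def row_to_fields(row):
    """Turn one CSV row into a {field name: value} dict."""
    return {FIELD_MAP[i]: value for i, value in enumerate(row) if i in FIELD_MAP}

# Sample data; each row of the real CSV would be handled the same way.
csv_text = "123456,2015-01-28,99.50\n"
for row in csv.reader(io.StringIO(csv_text)):
    fields = row_to_fields(row)
    # 'fields' would then be written to the matching document's template.
```

With this approach, cell A always feeds the Invoice Number field directly, with no pattern matching at all.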
