
Question

How can I update a Lookup Table from an External Data Source

asked on April 7, 2022

I'm sure I'm missing something probably easy, but I can't seem to figure this out. I have a PowerShell script that polls Active Directory every hour and exports all users to a CSV on my PC. I've created an External Data Source in Laserfiche and pointed it at this CSV, then created a Query Rule to select the data from the Users table. Next, I created a lookup table and populated it with the same CSV to use in a lookup rule on my form.
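For reference, the export side is nothing elaborate; it is roughly the few lines of PowerShell below (the property list and output path are simplified placeholders rather than my exact script):

# Requires the ActiveDirectory module (RSAT); scheduled to run every hour.
Import-Module ActiveDirectory

$outFile = 'C:\Exports\Users.csv'   # placeholder path

Get-ADUser -Filter * -Properties DisplayName, EmailAddress, Department |
    Select-Object SamAccountName, DisplayName, EmailAddress, Department |
    Export-Csv -Path $outFile -NoTypeInformation -Encoding UTF8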

What I'm trying to do, and can't seem to figure out, is create a workflow that will run regularly and update the Lookup Table with the information read from the External Data Source.

I created a simple workflow of Run Rule (querying the Users table in the Data Source) -> Replace Lookup Table Data (with Source Data set to "Activity -> Read Rule Results File").

When running the workflow, Run Rule runs fine, but Replace Lookup Table Data errors with "Bad Request Status: 400 Response: "DM81303: Could not parse file as CSV file"". I'm not sure where to look to fix this, and in Cloud I don't have any CSV options in Workflow.


Any help would be appreciated. If there is a better way of accomplishing what I'm trying to do, I'm open to any suggestions.


Answer

SELECTED ANSWER
replied on April 8, 2022

There is a rule in cloud that can replace a simple table's data? I don't see this anywhere; in the past I had to create a delete-all rule and an insert rule to re-insert the new values.

replied on April 8, 2022

Thank you! I've been beating my head on this and didn't even think to go this route.


And yes, there's a "Replace Lookup Table Data" activity in cloud. It works well if you feed it a CSV.

replied on October 4

I didn't see anyone post this, so I thought I'd help the next guy. I ran into this error as well and realized that I needed to use the 'Download Electronic Document' activity first; its output then shows up as a source in the 'Replace Lookup Table Data' activity. Worked great after that.


replied on October 7

With the release of the Lookup Table API, you can cut out this workflow altogether and update the table directly from a CSV. See an example at https://developer.laserfiche.com/docs/guides/guide-lookup-tables-cli/
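For anyone finding this later, the rough shape of the idea in PowerShell is below; the endpoint URI and token handling are placeholders for illustration only, so refer to the linked guide for the actual CLI commands and API paths:

# Placeholder values only; see the guide above for the real CLI syntax and endpoints.
$token    = '<OAuth access token obtained as described in the guide>'
$tableUri = 'https://api.example.com/lookup-tables/Users'   # hypothetical, not the documented path

$csv = Get-Content 'C:\Exports\Users.csv' -Raw

Invoke-RestMethod -Method Post -Uri $tableUri -Body $csv `
    -ContentType 'text/csv' -Headers @{ Authorization = "Bearer $token" }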


Replies

replied on April 8, 2022

If you can use Import Agent to get the CSV into your cloud repository, then you can use that electronic document in the Replace Lookup Table Data activity.

I would recommend this approach, because if you use the delete/insert that Chad described, there will be an interval where queries to the lookup table might find no data or partial data. The Replace Lookup Table Data activity operates as a single transaction, so any use of the lookup table will use the complete current data or the complete new data.

replied on October 19, 2022

This is more a question for Laserfiche: is functionality coming that would allow updating a Lookup Table via a Query? That is, I would:

  1. Create a Lookup Table
  2. Create a Query that pulls data from a Data Source (e.g. on-prem SQL table accessed via a Remote Worker)
  3. Populate the Lookup Table using the results of the Query

It would be nice if it were built directly into the Rules area, but even doing it via Workflow would be acceptable.

replied on October 19, 2022

In this case it would be nice if we could feed this activity a comma-separated multi-value token in addition to a raw CSV file. 99% of our data comes from either REST API calls or on-prem databases using the Windows agent.

This data can be iterated through and put together into a comma-delimited format, but not turned into a raw file in the repository, as far as I know.
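For example, rows pulled from a REST call can be flattened to CSV text with stock cmdlets (the endpoint and field names here are just placeholders); the issue is that there is nowhere to feed that text directly:

# Hypothetical endpoint; the point is that the result is CSV text, not a file in the repository.
$rows = Invoke-RestMethod -Uri 'https://example.com/api/users'

$csvText = $rows |
    Select-Object name, title |
    ConvertTo-Csv -NoTypeInformation |
    Out-String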

replied on October 19, 2022

Or, if they need it as a CSV, take the query results, convert them to a temporary CSV behind the scenes, and then process it. From our side, it is hard to provide seamless automation when the process still requires the manual creation of a CSV to get lookup data into the system.

replied on February 2, 2023

I agree with @Beau Lollar; this feature would be a huge productivity boost. We have numerous tables that are updated at most every few hours. Why have a process that has to generate a CSV somewhere, upload it, and have Workflow process it? It's a waste of time and adds more potential points of failure. Let's get this feature added (if it isn't hidden somewhere already).

replied on February 3, 2023

So what you are asking for is the ability to directly feed the text to the Replace Lookup Table Data activity, instead of requiring a repository file?

By the way, as an alternative to .XLSX and CSV, try putting the following into a .json (or .txt) file and importing it as a lookup table:

[
     { "name": "Chris", "title": "Chairman" },
     { "name": "Karl", "title": "CEO" }
]
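And if it helps, that JSON shape can be produced from an existing CSV with stock PowerShell cmdlets (the column names match the sample above; the paths are placeholders):

# Reads a CSV with name and title columns and writes the JSON array format shown above.
$rows = @(Import-Csv 'C:\Exports\Users.csv' | Select-Object name, title)
ConvertTo-Json -InputObject $rows | Set-Content 'C:\Exports\Users.json'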


replied on February 5, 2023

That, or the ability to do a remote query to SQL in order to replicate some or all of another database table. In this specific use case, we capture client data on a website and wish to use some of it in a lookup on some forms.

replied on April 8, 2022

The Data Query rule's query result is in a kind of ADO.NET SQL table format.

[Activity -> Read Rule Results File] is an XML-serialized version of that data, which is NOT the original CSV file on your machine. What [Replace Lookup Table Data] expects is CSV format, so it throws the "Could not parse file..." error.

So the use case here is to automatically upload a CSV file to update/replace the lookup table, and the Data Query rule is not the mechanism to address that.


Does Chad Saar's suggestion work for you?

replied on May 10

Just wanted to check back in: is the functionality coming that would allow updating a Lookup Table via a Query? Essentially, the ability to run a rule that queries the SQL tables and then updates the lookup table. I am able to set this up in Workflow, but Laserfiche advised in March that it is not designed to work this way and thus errors out.

