
Question

Bulk Entry Processing in Cloud

asked on February 9, 2022

Hi All,

I'm just doing some testing with LF Cloud, and running through a few demo scenarios and hypothetical configurations. 

One common scenario I've come up against is bulk processing. It seems that the Search Repository activity in LF Workflow (Cloud) is limited to 100 entries.

There are many reasons you'd need to bulk update entries in the repository, from data cleansing and migrations to records management (RM) actions.

For example, say you have documents in the system relating to customers/clients, or even a student in an education setting; for argument's sake, say there are 1,000 documents associated with that entity. Under records management, once their relationship with you has come to an end, you would traditionally use workflow to bulk-assign RM actions such as the filing date and perform cutoff, etc.

How would this scenario work in a cloud setting, given that the Search Repository activity is limited to 100 entries? I suppose you could put all the activities in a repeat loop, but 100 entries does seem a little low. Perhaps it could be raised to 10,000 entries, like it is on the on-premises version? That's a little more manageable.

Open to any thoughts/ideas anyone may have! 🙂


Replies

replied on September 13, 2022

I've run into this same issue and it's EXTREMELY frustrating. I've had to put all my activities in a "Repeat" that runs again whenever the search result count is 100, evaluating the condition after the activities have run. That lets me get up to the NEXT obnoxious arbitrary limitation, which is 10,000 activities, total.
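For anyone trying to picture the pattern: here's a minimal Python sketch of that "Repeat" logic, with the repository and search simulated in memory. The function and field names (`search_unfiled`, `filed`, etc.) are illustrative only — they are not Laserfiche API names, and the real thing would be the visual Repeat activity rather than code.

```python
# Sketch of the Repeat workaround, assuming a 100-entry search cap.
# Everything here is simulated; no Laserfiche APIs are used.

SEARCH_LIMIT = 100  # per-search cap in the cloud Search Repository activity

# Simulated repository: 1,000 entries that still need an RM filing date.
repository = [{"id": i, "filed": False} for i in range(1000)]

def search_unfiled(limit=SEARCH_LIMIT):
    """Return up to `limit` entries that still match the search criteria."""
    return [e for e in repository if not e["filed"]][:limit]

def process_entry(entry):
    """Apply the bulk action; marking the entry drops it from future searches."""
    entry["filed"] = True

total = 0
while True:
    batch = search_unfiled()
    for entry in batch:
        process_entry(entry)
    total += len(batch)
    # Mirror the Repeat condition, evaluated after the activities run:
    # a full batch means there may be more matching entries beyond the cap.
    if len(batch) < SEARCH_LIMIT:
        break

print(total)  # prints 1000: every entry handled in batches of 100
```

The key design point is that the activities inside the loop must change each entry so it no longer matches the search; otherwise the same 100 entries come back forever.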

replied on July 29

@████████ Could you please share an example showing how the 'Repeat' is set up? Also, how can I verify that a new search result doesn't include entries from the previous search? Thanks a lot in advance for any hints!
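One way to sanity-check the "no overlap" question, sketched in simulated Python (these names are hypothetical, not Laserfiche API calls): track the IDs of entries already processed and assert that each new batch is disjoint from them. If the loop's activities correctly mark each entry, a repeated search can never return an entry from an earlier pass.

```python
# Hedged sketch: verify successive search batches do not overlap by
# tracking processed entry IDs. The repository/search are simulated.

repository = [{"id": i, "done": False} for i in range(250)]

def search_pending(limit=100):
    """Return up to `limit` entries that have not been processed yet."""
    return [e for e in repository if not e["done"]][:limit]

seen_ids = set()
batches = 0
while batch := search_pending():
    batch_ids = {e["id"] for e in batch}
    # Any intersection with earlier passes would mean re-processing.
    overlap = batch_ids & seen_ids
    assert not overlap, f"re-processed entries: {overlap}"
    for e in batch:
        e["done"] = True  # the bulk action removes it from the search
    seen_ids |= batch_ids
    batches += 1

print(batches, len(seen_ids))  # prints: 3 250  (100 + 100 + 50 entries)
```

In the workflow itself the equivalent check is usually implicit: as long as the activities set a field, tag, or RM state that the search excludes, each pass can only see entries the previous passes didn't touch.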
