Hi All,
I'm just doing some testing with LF Cloud, and running through a few demo scenarios and hypothetical configurations.
One common scenario I've come up against is bulk processing. It seems that the Search Repository activity in LF Cloud workflow is limited to 100 entries.
There are many reasons you'd need to bulk update entries in the repository, from data cleansing and migrations to records management (RM) actions.
For example, say you have documents in the system relating to a customer/client, or a student in an education setting, and for argument's sake there are 1,000 documents associated with that entity. Under records management, when their relationship with you comes to an end, traditionally you would use workflow to bulk-assign RM actions such as the filing date, perform cut-off, and so on.
How would this scenario work in a cloud setting, given the Search Repository activity is limited to 100 entries? I suppose you could put all the activities in a Repeat loop, but 100 entries does seem a little low. Perhaps it could be raised to 10,000 entries, as it is in the on-premises version? That would be a little more manageable.
Open to any thoughts/ideas anyone may have!