Question

Any ideas on how to trigger a workflow in batches?

asked on September 19, 2022

We have a workflow that is pretty simple. It queries a SQL table looking for a null value that is more than 7 days past the notification date. If the value is just null, it immediately sends an email. We run it daily. The problem is that when we send this email, over 1,000 people try to access the Forms at once and they get a loading page. (That is a separate issue we are working through with our VAR and in another post here on LF Answers.)

I'm looking for ideas to send the email to the remaining 6k+ employees without directing them all to the Form at once. Maybe batch it together, or set some kind of delay that runs every so many minutes while working through the list. The Delay in Workflow is 1 minute, so it would take 4+ days to notify the end users, and we have a hard 3-day max to notify employees.

I don't think I can go back in and add anything to the table because of the other 2k responses, and I'm not a database guy. Any suggestions would be greatly appreciated.

0 0


Replies

replied on September 19, 2022

You can use a conditional decision to send a set number of emails at once and then wait for a period of time before continuing. In this example I used 600 as the number and a 1-hour waiting period; this way it will take 10 hours to send 6,000 emails, spaced 600 at a time every hour.
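Laserfiche Workflow wires this up out of graphical activities rather than code, but here is a minimal sketch of the same logic in Python, with recipients and send_email as hypothetical stand-ins for the query results and the email step:

```python
import time

BATCH_SIZE = 600        # emails per batch, as in the example
DELAY_SECONDS = 3600    # 1-hour wait between batches

def send_in_batches(recipients, send_email):
    """Send to every recipient, pausing after each batch of BATCH_SIZE."""
    for start in range(0, len(recipients), BATCH_SIZE):
        for person in recipients[start:start + BATCH_SIZE]:
            send_email(person)            # one email per query row
        if start + BATCH_SIZE < len(recipients):
            time.sleep(DELAY_SECONDS)     # the delay between batches
```

At 600 per batch with an hour between batches, 6,000 emails finish in about 10 hours, comfortably inside the 3-day window.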

4 0
replied on September 20, 2022

Thanks Gerardo. Can you provide a screenshot of the Counter Tokens?

0 0
replied on September 20, 2022

Here is the actual workflow. Keep in mind it is not as clean as it could be, since I wrote it in about 5 minutes. The counter token is just a regular token created with an initial value of 1 (I named it "counter" so it would make sense). The conditional decision checks whether that token is less than 600 (in this case), and the counter+1 activity modifies the token to add one to it. When the condition is no longer met, the process falls to branch 2 (we need to send the email here as well, as you do not want to miss any emails); the delay is set to 1 hour, and then the counter token is modified back to 1 and the process continues to the next set of 600. Instead of the email, you could call a different workflow that you already have and pass in the data from your query, but this is small enough to integrate into any other workflow. Hope it helps. You need to rename the file to have the .wfx extension in order to import it into your Workflow Designer.
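In rough pseudocode terms, here is a Python sketch of that branching (not the actual .wfx; query_results and send_email are placeholder names for the query data and the email step):

```python
import time

def send_with_counter_token(query_results, send_email):
    counter = 1                    # the "counter" token, initial value 1
    for row in query_results:
        if counter < 600:          # conditional decision: branch 1
            send_email(row)
            counter += 1           # the counter+1 token modification
        else:                      # branch 2: condition no longer met
            send_email(row)        # still send, so no email is missed
            time.sleep(3600)       # the 1-hour delay
            counter = 1            # reset the token for the next set of 600
```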

batch of 600.txt (24.01 KB)
0 0
SELECTED ANSWER
replied on September 20, 2022

You don't really need the token counter. You could use the iteration token in your condition and check if it's a multiple of 600 to make it pause every 600 results.
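A sketch of that simplification under the same placeholder names; Python's enumerate plays the role of the built-in iteration token:

```python
import time

def send_with_iteration_token(query_results, send_email):
    for iteration, row in enumerate(query_results, start=1):
        send_email(row)
        if iteration % 600 == 0:   # every 600th result, pause for an hour
            time.sleep(3600)
```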

3 0
replied on September 20, 2022

Way cleaner and simpler.

0 0
replied on September 23, 2022

Thanks, everyone. We ran it like this yesterday and it worked perfectly. We have other all-hands processes we will be adding it to.

0 0
replied on September 19, 2022

We've done something like this with internal forms. We didn't want 1000+ people trying to do it at the same time, so we broke it into groups based on some kind of identifiable attribute.

Usually we would break it up by employee location. We set the workflow up to run on a subset by adding an input token to the workflow and using that as a query filter.

Next, we set up a "scheduler" workflow that loops through each location with a delay after each iteration. The scheduler workflow starts each batch by passing in the location/filter.
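A minimal sketch of that scheduler, with LOCATIONS and start_batch_workflow as hypothetical stand-ins for the location list and whatever starts the batch workflow with its input token:

```python
import time

LOCATIONS = ["HQ", "Plant A", "Plant B"]    # hypothetical location values

def scheduler(start_batch_workflow, delay_seconds=3600):
    """Start one batch per location, delaying after each iteration."""
    for location in LOCATIONS:
        # the input token becomes a query filter in the batch workflow,
        # e.g. ... WHERE Location = @location
        start_batch_workflow(location)
        time.sleep(delay_seconds)           # spacing between groups
```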

1 0
replied on September 19, 2022

You can use hidden fields to temporarily mark items with workflow so they are ignored in your next search, clearing them out at the start of a new day (setting a value to blank on a non-templated field will remove it). Create a field called WFTag or something and set the security so no one can see it; it can be appended to any entry even if it is not part of a template assigned to the entry.

Then in your search, exclude entries that are already tagged and also limit the results to some amount X.

Send the emails, delay, and invoke another workflow. Search again and send another X.
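A rough sketch of that loop, with search, send_email, and tag_entry as hypothetical stand-ins for the repository search, the email step, and the field assignment:

```python
import time

X = 100           # entries per pass (could be made dynamic, as below)
Y_MINUTES = 5     # delay before the next workflow invocation

def notify_all(search, send_email, tag_entry):
    """Repeatedly find untagged entries, email them, and tag them as done."""
    while True:
        entries = search(exclude_field="WFTag", limit=X)
        if not entries:
            break                       # nothing left to notify
        for entry in entries:
            send_email(entry)
            tag_entry(entry, "WFTag")   # hidden field; the next search skips it
        time.sleep(Y_MINUTES * 60)      # delay, then the next pass
```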

Your search amount could even be dynamic, based on current load, i.e., the current form submission load (assuming your Forms process archives something to the repository).

Let's say your delay is every Y minutes until you run out of entries to notify on. Each new workflow runs a search for new submissions to get a count; the search count maximum, X, is then the inverse of the submission count. I.e., if only 10 submissions have come in within the last 5 minutes, the server is not busy and you can send out 90 more emails. As the count of submissions received in the Y interval increases, fewer and fewer emails go out in that interval. As the count decreases, more emails go out.
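As a sketch of the throttle itself, assuming a ceiling of 100 emails per interval to match the 10-submissions/90-emails example:

```python
MAX_PER_INTERVAL = 100   # assumed ceiling of emails per Y-minute interval

def next_batch_size(submissions_last_interval):
    """Inverse throttle: the busier Forms is, the fewer emails go out."""
    return max(0, MAX_PER_INTERVAL - submissions_last_interval)

# 10 submissions in the last 5 minutes -> send 90 more emails
# 75 submissions in the interval       -> send only 25
```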

1 0
replied on October 9

Hi Laserfiche Experts,

I have recently been hitting an activity limit on some workflows (the 10,000-activity limit). Do you all know if adding a delay like this per batch (e.g., every 500 iterations) can help avoid that?

I am wondering if this would help remedy our issue with the workflow activity limit, because even when calculating the number of records and the maximum possible activities for the records we are touching, we can't get to 10,000 activities...

0 0
replied on October 9

I believe that 10k limit is the loop limitation in the Advanced Server Options under WF Admin > Properties, which can be increased.

However, if you're checking over thousands of documents again and again, you might consider only checking these documents on change; if they have not changed, there is no reason to cycle through all the information again.

0 0
replied on October 9

Hi Chad,

I am on Laserfiche online (Web); would you happen to know if these loop limitations can be edited on that end?

We aren't checking over thousands of documents again and again; we do an annual check for which documents are past retention and approved to be destroyed. We have made a workflow to permanently delete these documents, and that is what is hitting the 10,000-activity limit, even though the For Each Entry loop only iterates ~1430 times (a max of 3 activities act on each entry).

0 0
replied on October 9

Oh, I have this same problem with Cloud when I want to populate lookup tables from a SQL table so that Forms can use the relational data, and I have to break my workflows up into dozens of workflows to get around this limitation. There is no option to increase it. The crazy thing is, how does breaking up our workflows mean less work for the computer? Why not let the computer just do all the work in a single workflow? The total amount of computation remains the same, if not less. I don't understand the reasoning for breaking up workflows like this.

How are you finding all these documents when the search can only return a max of 100 docs?

0 0
replied on October 9

Hi Chad,

They are doing the search using Records Management in the repository, highlighting "Records eligible for disposition"; a Records Management search can return more than 100 docs.

After those documents are found, users generate a report (Excel) that goes through a business process to request destruction approval. After that process goes through, the Excel file loads a lookup table, and we use that loaded lookup table to permanently destroy the documents (basically the same thing as destroy).

It is really sad that LF has this limitation in LF Cloud; the workarounds to get around it always add some extra development time and/or added inconvenience for the end user.

0 0
replied on October 10

Ah, so you're using a lookup table. Yes, in this case what I do is process rows 1 through 9000/A in the first workflow, where A is the number of activities in your loop, then rows 9000/A + 1 through 18000/A in the next workflow, and so on.
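A quick sketch of that partitioning arithmetic in Python; the 9,000 budget leaves headroom under the 10,000-activity cap:

```python
BUDGET = 9000   # activities per workflow, leaving headroom under 10,000

def row_ranges(total_rows, activities_per_row):
    """Yield (first_row, last_row) chunks small enough for one workflow."""
    rows_per_workflow = BUDGET // activities_per_row
    for start in range(1, total_rows + 1, rows_per_workflow):
        yield (start, min(start + rows_per_workflow - 1, total_rows))

# e.g. 3 activities per row -> 3,000 rows per workflow:
# list(row_ranges(7000, 3)) == [(1, 3000), (3001, 6000), (6001, 7000)]
```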

Yeah, when I have to update the workflow, sometimes I have to do 24 times as much work (since larger values of A really reduce the number of rows each workflow can handle). All to save the computer from doing work. More human labor, less computer labor. The really ironic part is that it is not saving the computer any labor, so it is just more labor.

0 0