
Question

Workflow database not shrinking

asked on March 1, 2018

I have the advanced server options set in the WF Admin Console to remove reporting and instance data older than 1 day in hopes of reducing my 26.4GB workflow.mdf file. 


I have to run a mass import (~150,000 documents) for a customer in the near future. I ran the import once as a test run, which ballooned the mdf up to 26GB. I then modified the settings as described above, but the database has not reduced in size after about a week with the new settings. I have also tried to manually shrink the database and files from SQL Server Management Studio, with no change in the workflow mdf file.


Is there a way I can analyze what is occupying so much space and more importantly, how can I get workflow.mdf to properly wipe data to make sure the file doesn't grow any larger after the true mass import?


Please let me know if there's any more information needed, or if anyone has an idea!  Thank you in advance!


Replies

replied on March 1, 2018

This advice typically applies to large transaction logs rather than the mdf, but it may still help: change the recovery model on the WF database to Simple, back it up (not always necessary), and if the file is still very large, run a shrink on the database or on the individual files.
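For reference, the steps above can be sketched in T-SQL. This is a sketch only: the database name [Workflow] and the logical file name 'Workflow' are assumptions here; check sys.database_files for the actual logical names before shrinking.

```sql
-- Assumption: the Workflow database is named [Workflow] and its data file
-- has the logical name 'Workflow' -- verify via sys.database_files first.
USE [master];
ALTER DATABASE [Workflow] SET RECOVERY SIMPLE;

USE [Workflow];
-- List logical file names and current sizes (size is in 8 KB pages)
SELECT name, type_desc, size * 8 / 1024 AS size_mb
FROM sys.database_files;

-- Shrink the data file, leaving 1024 MB allocated
DBCC SHRINKFILE (N'Workflow', 1024);
```

Note that shrinking only releases space SQL Server already considers free; if the tables themselves still hold the data, the shrink will have little effect.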

I can't think how importing just 150,000 documents, which isn't huge by today's standards, would cause such a large increase in size given the maintenance settings you have configured. I'd guess something like having track tokens enabled in the workflow(s) would increase the size, but not to that extent!

replied on March 1, 2018

I appreciate the message, but the log file isn't an issue - only the mdf. The track tokens certainly don't help the workflow mdf size, but as long as the WF Admin Console wipes old data, it shouldn't be an issue.

Still not sure if I'm missing a setting somewhere that would trigger the deletion of old data, or why the maintenance doesn't seem to be occurring.

replied on March 1, 2018

When you attempt to shrink the database in SQL Server Mgmt Studio, it will tell you how much free space it has. You could also run a report for disk usage by (top) table to see where most of the data is.
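The "Disk Usage by Top Tables" report can also be approximated with a query. A minimal sketch, to be run in the context of the Workflow database (no assumptions beyond standard SQL Server system views):

```sql
-- Approximate "Disk Usage by Top Tables": reserved space per table, largest first
SELECT t.name AS table_name,
       -- Count rows only from the heap or clustered index (index_id 0 or 1),
       -- so nonclustered index partitions don't inflate the row count
       SUM(CASE WHEN ps.index_id IN (0, 1) THEN ps.row_count ELSE 0 END) AS row_count,
       SUM(ps.reserved_page_count) * 8 / 1024 AS reserved_mb
FROM sys.dm_db_partition_stats AS ps
JOIN sys.tables AS t
  ON t.object_id = ps.object_id
GROUP BY t.name
ORDER BY reserved_mb DESC;
```

The tables at the top of this list are where the space is going, which also tells you whether the maintenance purge is actually clearing the instance and activity data.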

replied on March 5, 2018

It looks like the "Search_activity_messages_log" and "search_activity_log" are the two largest components of the WF database.

This question was less about how to identify the largest tables and more about why Workflow is not regularly deleting data as configured from the WF Admin Console.  I need to make sure that WF database maintenance is working successfully to avoid running out of space on my subsequent import processes. 


Do you have any ideas how to ensure that Workflow is properly deleting data from the database?  Thank you, as always, for the response, Miruna!  Hope you are well!

replied on March 5, 2018

Check your database maintenance timer in the Workflow Admin Console. By default, it has a fairly small work window so it does not interfere with performance. If there are lots of instances to delete, it may take a while to purge them from the database. You can increase the maintenance window temporarily to handle the large backlog.
