
Question


Workflow log files

asked on March 18, 2014

My Workflow log file is 15 GB. Why might it be so large, and can it be truncated? Is something going on with Workflow that is making it grow?

 

Thanks!


Answer

SELECTED ANSWER
replied on March 18, 2014

Are you talking about the SQL database's .LDF file? If so, then you can shrink it.

 

Workflow log files are limited in size and get periodically compressed into zip files.

replied on March 19, 2014

Miruna, 

I tried going into SQL Mgmt and Shrinking the Log file...but nothing happened.  Is there some other method I need to use?

replied on March 19, 2014

Hi Daryl,

 

The log file won't shrink until it has been truncated, backed up, or set to truncate automatically. Changing your database's recovery model from Full to Simple sets the log to truncate automatically.

 

Setting the Recovery Model:

  1. Log into Microsoft SQL Server Management Studio.
  2. Right-click your database.
  3. Select Properties.
  4. Click Options.
  5. Change the recovery model to Simple.

 

Now try to shrink the log.

 

For those that like to script:

ALTER DATABASE laserfiche_database SET RECOVERY SIMPLE
DBCC SHRINKFILE (laserfiche_database_Log, 1)
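As a side note (not part of Ben's original steps): if the shrink appears to do nothing, as Daryl describes, it can help to check how much of the log is actually in use and what is blocking truncation. A sketch, assuming the same `laserfiche_database` name used in the script above:

```sql
-- Show how full each database's log file is (percentage in use)
DBCC SQLPERF(LOGSPACE);

-- Show what, if anything, is preventing log truncation;
-- LOG_BACKUP means the log is waiting on a log backup
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'laserfiche_database';
```

If `log_reuse_wait_desc` reports `LOG_BACKUP` while the database is in the Full recovery model, the shrink will keep failing until the log is backed up or the model is switched to Simple.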

The Full recovery model is typically used in environments where a daily backup isn't sufficient and restore points throughout the day are required. For example, you might schedule a full SQL backup once per day and then back up the log file every hour. This allows hourly restore points without the performance hit of a full backup every hour.
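The daily full plus hourly log schedule described above might look like this in T-SQL (a sketch only; the backup paths are placeholders, and the database name is taken from the script earlier in this reply):

```sql
-- Once per day: full backup, the base restore point
BACKUP DATABASE laserfiche_database
TO DISK = 'D:\Backups\laserfiche_full.bak';

-- Every hour: log backup, which truncates the log
-- and adds an hourly restore point
BACKUP LOG laserfiche_database
TO DISK = 'D:\Backups\laserfiche_log.trn';
```

In practice you would schedule these as SQL Server Agent jobs rather than run them by hand.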

 

Hope that's helpful.

replied on March 19, 2014

From what I've been reading, everyone suggests NOT changing the recovery model from Full to Simple in a production environment. But it seems the only way to shrink the log file is to make that change. Am I missing something? Is doing the above going to break something else for me?

 

Thanks!

replied on March 19, 2014

Hi Daryl,

 

That advice would depend on the environment.

 

For example, I would make the change on a production system but would wait until after hours to make the change (or truncate) then shrink the log, so as not to impact the users. The system is likely to slow down during intensive database operations.

 

Setting the recovery model from Full to Simple removes your ability to take hourly snapshots of the database (and hourly recovery points). However, unless you're also taking hourly snapshots of the document repository and full-text index files, this is a moot point.

 

I'd be interested to see the advice you're referring to. Perhaps it was made in a different context.

 

-Ben

 

replied on March 19, 2014

Changing the recovery model from Full to Simple is an easy way to shrink the log file when you need to free up disk space. Otherwise, log backups should be part of your backup procedure, and SQL Server will truncate the log at that time. You can change the recovery model back to Full after following Ben's steps above.
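Switching back after the shrink might look like this (a sketch, assuming the same database name as in Ben's script; the backup path is a placeholder):

```sql
-- Return the database to the Full recovery model
ALTER DATABASE laserfiche_database SET RECOVERY FULL;

-- Take a full backup right away: log backups can't resume
-- until a full (or differential) backup restarts the log chain
BACKUP DATABASE laserfiche_database
TO DISK = 'D:\Backups\laserfiche_full.bak';
```

Without that immediate full backup, the database behaves as if it were still in Simple mode until the next full backup runs.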

 

If you are already backing up the LDF file as part of your backup plan, then having it grow a lot between backups could indicate a lot of activity in the database, possibly because of a runaway workflow.

 

replied on March 19, 2014

What's the best way for me to see if I have a runaway workflow?

replied on March 19, 2014

You can run a statistics report and see if any of the instance counts look off from what you'd expect for normal behavior.

replied on March 19, 2014

Thanks Miruna, I've not used that before.

 

Another giveaway sign is high CPU by the workflow services. When mine run away, the whole server suffers...

replied on March 19, 2014

That's interesting. I guess it would depend on what your workflows do. The most common symptoms I've seen are unusually high load on SQL or the LFServer, which in turn might bloat up the SQL log.

 

But like I said above, if you're backing up the LDF, I wouldn't worry about a runaway workflow unless there's unexpected or sudden growth on the LDF.

replied on March 19, 2014

Miruna,

 

My usual issue, when developing and testing, has been forgetting to exclude the workflow account from starting rules. The other common cause of high CPU usage for me has been "for each..." loops that keep going, usually due to a mistake in a wait condition or decision branch. No great mysteries there.

 

I'm also curious about Daryl's workflows, and whether the logs are growing this much every day or just haven't been truncated in a long time.


Replies

replied on March 18, 2014

Yes, the SQL .ldf file
