Planning Analytics and Transaction Log Management
If you are responsible for administering IBM Planning Analytics, you have most likely queried the transaction log regularly to identify data changes (for example, the date and time of a change and the name of the client that made it) – and, most likely, those queries took hours to run, caused intermittent locking, and often yielded no results at all!
If that is the case, then implementing a method of managing Planning Analytics server transaction logs is highly recommended.
The objectives of transaction log management should be to:
- Limit transaction record creation to only specific cubes and only during relevant timespans
- Reduce the volume of the transaction logs by purging irrelevant data and expiring logs
- Consider performing transactional log searching external to Planning Analytics
- Explore configuring the transactional log search behavior
Limit Transaction Record Creation
Transaction logging can be enabled and disabled for specific cubes by setting the LOGGING column in the }CubeProperties control cube. Transaction logging is typically needed only on selected cubes and only during periods of manual user data entry. Logging should never be enabled during a bulk upload or update, and it typically should not be enabled on server control cubes or global support cubes accessed only by an administrator. The goal is to "tightly manage" which cubes have logging enabled and when transaction logging occurs on those cubes. One approach is to disable logging for ALL cubes within the instance and have an administrator's "open (or closed) for user input" routine enable and disable logging on the relevant cubes, as sketched below. When deciding on a logging strategy, if you are not using transactions for recovery, being more aggressive about disabling transaction logging makes sense.
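To make the "open (or closed) for user input" idea concrete, here is a minimal sketch using the open-source TM1py library to flip the LOGGING flag in }CubeProperties from a script. The cube name, connection details, and the notion of wrapping a data-entry window are placeholder assumptions for your own environment, not a prescribed implementation.

```python
# Sketch only: toggling transaction logging around a user data-entry window.
# Assumes the TM1py library; the cube name and connection details below are
# placeholders for your own Planning Analytics instance.
from TM1py.Services import TM1Service

INPUT_CUBES = ["Sales Plan Input"]   # hypothetical cubes open for user entry


def set_logging(tm1: TM1Service, cube: str, enabled: bool) -> None:
    """Write YES/NO into the LOGGING column of the }CubeProperties control cube."""
    tm1.cells.write_value(
        "YES" if enabled else "NO",
        "}CubeProperties",
        (cube, "LOGGING"),
    )


with TM1Service(address="localhost", port=12354, user="admin",
                password="secret", ssl=True) as tm1:
    for cube in INPUT_CUBES:
        set_logging(tm1, cube, True)    # "open for user input": start logging
    # ... user data-entry window runs here ...
    for cube in INPUT_CUBES:
        set_logging(tm1, cube, False)   # bulk load / closed period: stop logging
```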
Reduce the Volume of the Transaction Logs
Invalid transaction records can be generated during normal server activity, and these invalid records slow down and clutter log searches. They should be identified, reviewed, and cleared from the log files.
In addition, if transaction logging enablement is properly managed, the overall number of transactions being logged, and therefore the log volume, will be much lower; if it is not, the logs will fill with transactions for events you have no interest in. Transactions are also created for a variety of events other than individual user data changes, such as the "CubeSerialized" event (written when a SaveData occurs) or when TurboIntegrator processes are run to modify data, adding further volume.
A scheduled process that parses or "filters" out transactions based on need or other specific requirements can vastly improve the efficiency and value of this data. Lastly, a process should be in place to "expire" transaction log files that have passed relevancy; a sketch of such a housekeeping job is shown below.
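As an illustration, the following Python sketch performs the two housekeeping tasks described above: it deletes archived transaction log files older than a retention window and writes filtered copies containing only records that mention cubes of interest. The directory path, the "tm1s*.log" naming pattern, the retention period, and the cube list are all assumptions to adapt to your own environment and logging configuration.

```python
# Sketch only: a scheduled housekeeping job for archived transaction logs.
# Assumes archived logs follow a "tm1s<timestamp>.log" naming pattern in LOG_DIR;
# adjust the directory, pattern, retention window, and keep-list to your site.
import re
import time
from pathlib import Path

LOG_DIR = Path(r"D:\PA\Logs")        # hypothetical logging directory
RETENTION_DAYS = 90                  # hypothetical retention window
KEEP_CUBES = {"Sales Plan Input"}    # only records mentioning these cubes are kept


def expire_old_logs(log_dir: Path, retention_days: int) -> None:
    """Delete archived transaction logs older than the retention window."""
    cutoff = time.time() - retention_days * 86400
    for path in log_dir.glob("tm1s*.log"):
        if path.name != "tm1s.log" and path.stat().st_mtime < cutoff:
            path.unlink()


def filter_log(src_path: Path, dst_path: Path) -> None:
    """Keep only records that mention a cube we care about (crude line-level filter)."""
    pattern = re.compile("|".join(re.escape(cube) for cube in KEEP_CUBES))
    with src_path.open(errors="replace") as src, dst_path.open("w") as dst:
        for line in src:
            if pattern.search(line):
                dst.write(line)


if __name__ == "__main__":
    expire_old_logs(LOG_DIR, RETENTION_DAYS)
    filtered_dir = LOG_DIR / "filtered"
    filtered_dir.mkdir(exist_ok=True)
    for archived in sorted(LOG_DIR.glob("tm1s2*.log")):  # skip the active tm1s.log
        filter_log(archived, filtered_dir / archived.name)
```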
Consider External Transactional Log Searching
Downloading and/or searching transaction logs as an administrator from within Planning Analytics may be slow and cause overall server performance degradation. In addition, the interface isn’t particularly flexible.
Another viable option for searching transaction log data is to use an external log searching tool such as Splunk or Datadog, or a more informal "build your own" approach such as PowerShell or grep; a simple sketch of the latter is shown below. Finally, if you are properly managing your transaction log data, it can be imported into a Planning Analytics cube and reported on using Planning Analytics Workspace.
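For the "build your own" route, a minimal grep-style search in Python might look like the following. It assumes the transaction logs are the quoted, comma-separated text files Planning Analytics writes to its logging directory; the directory, cube, and user names are placeholders for illustration only.

```python
# Sketch only: a "build your own" external search over transaction log files.
# Assumes comma-separated, quoted log records; all names below are placeholders.
import csv
from pathlib import Path

LOG_DIR = Path(r"D:\PA\Logs")       # hypothetical logging directory
TARGET_CUBE = "Sales Plan Input"    # hypothetical cube of interest
TARGET_USER = "jsmith"              # hypothetical client name


def search_logs(log_dir: Path, cube: str, user: str):
    """Yield transaction records that mention both the cube and the user."""
    for path in sorted(log_dir.glob("tm1s*.log")):
        with path.open(newline="", errors="replace") as fh:
            for record in csv.reader(fh):
                if cube in record and user in record:
                    yield path.name, record


for file_name, record in search_logs(LOG_DIR, TARGET_CUBE, TARGET_USER):
    print(file_name, record)
```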
Explore Configuring the Transactional Log Search
In newer versions of Planning Analytics, the transactional log search behavior – such as the number of logs to search and the number of invalid logs allowed before a search stops – can be configured to change the defaults. Some experimentation with log search behavior may be required to arrive at an optimal process, and you will need to involve IBM Support to make these types of changes.
Conclusion
Even if you don’t plan to use transactions to restore previous values, transaction log data – like any data – needs care and feeding. If properly "cared for", transaction log data can be used to identify trends, understand what drives a particular type of data change within an area of a model, satisfy audit or compliance requirements, offer insight into the status of a workflow and, of course, help determine licensing requirements.