Are you a project team lead and want an overview of the most popular Siebel views?
Are you a support engineer and need a list of users running expensive queries?
Are you a Siebel developer and wish to compare system performance before and after your configuration changes?
And you don't want to spend extra money on any of this?
If you answered "Yes" to any of the above questions, you should seriously consider turning on SARM.
I am still surprised at how little attention SARM (Siebel Application Response Measurement; splendid blog post by Chung Wu here) gets in Siebel projects. It is easy to enable, it doesn't noticeably hamper overall performance, and you can control exactly how much disk space the SARM files may occupy. If you object to any of this, please read on ;-)
We can enable SARM in our Siebel environment with srvrmgr commands similar to the following:
change param SARMLevel=2 for comp OURObjMgr_enu
change param SARMPeriod=1 for comp OURObjMgr_enu
change param SARMThreshold=100 for comp OURObjMgr_enu
change param SARMMaxFiles=5 for comp OURObjMgr_enu
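These parameter changes are issued from a srvrmgr session. As a sketch, a one-shot invocation could look like this (the gateway host, enterprise name, server name and credentials below are placeholders for your own environment):

```
rem Placeholder connection details; /c runs a single command and exits
srvrmgr /g gwhost /e SIEBEL /s appsrvr /u SADMIN /p password /c "change param SARMLevel=2 for comp OURObjMgr_enu"
```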
I stated above that SARM doesn't hamper system performance. By setting the SARMThreshold parameter to 100 (milliseconds), you avoid writing events with a total execution time of 100 ms or less to the SARM files. This greatly reduces the amount of data, both when it is written to disk (SARM I/O) and later when the SARM data is analyzed with sarmquery.
By using the SARMMaxFiles parameter, we control how many SARM files each process writes before the oldest file is recycled.
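This also gives you a rough upper bound on disk usage: SARMMaxFiles times the size at which a SARM file rolls over, times the number of SARM-enabled processes. A back-of-the-envelope sketch, where all the numbers are assumptions for illustration only (check the file sizes in your own environment):

```shell
# All values are assumptions; verify SARMMaxFiles and the actual
# roll-over size of your SARM files before sizing the disk.
SARM_MAX_FILES=5        # as set above
SARM_FILE_SIZE_MB=15    # assumed size at which a single SARM file is recycled
PROCESSES=10            # assumed number of SARM-enabled server processes
echo "$((SARM_MAX_FILES * SARM_FILE_SIZE_MB * PROCESSES)) MB worst case"
# prints: 750 MB worst case
```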
So how can we answer the questions above?
After SARM has been enabled, the server processes start producing SARM files. I typically use scripts similar to the following to copy the files to a different machine, produce human-readable output and display it in a browser:
1. Copy SARM files to a central location
We could use the SARMLogDir parameter to write the SARM files directly to a shared directory, for example, but I try to avoid this because of the latency the network adds while the files are being written.
Instead, I use a copy command in a scheduled batch file like the following to transfer all SARM files to a directory on a machine with enough resources to run the sarmquery utility:
xcopy \\serverhost\siebelroot\siebsrvr\log\*.sarm C:\Temp\sarmdata /S /Y
This collects all SARM files in the sarmdata directory.
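With more than one Siebel server, the same idea extends to a loop in the batch file; a sketch, where the host names are hypothetical:

```
rem Hypothetical host names; collect SARM files from several app servers
for %%H in (siebapp1 siebapp2) do (
  xcopy \\%%H\siebelroot\siebsrvr\log\*.sarm C:\Temp\sarmdata\%%H /S /Y
)
```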
2. Produce SARM reports
Now we can run sarmquery against the collected data. This post focuses on a simple SARM monitoring solution; if you wish to learn more about sarmquery, please refer to the documentation. The following sarmquery commands produce pseudo-plots which allow a quick glance at the current state of the system. We use the shell's output redirection to write the results to text files in the web server's directory.
sarmquery -input C:\Temp\sarmdata -agg time=15 > C:\Inetpub\wwwroot\sarm\time.txt
sarmquery -input C:\Temp\sarmdata -agg user > C:\Inetpub\wwwroot\sarm\users.txt
sarmquery -input C:\Temp\sarmdata -agg subarea > C:\Inetpub\wwwroot\sarm\subareas.txt
sarmquery -input C:\Temp\sarmdata -agg instance -sel subarea=swepage_view_build > C:\Inetpub\wwwroot\sarm\views.txt
All scripts can be run from the OS scheduler every 5 minutes, so the files are refreshed at that interval.
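On Windows, for example, the collection and report scripts can be registered with the built-in task scheduler; a sketch, where the task name and script path are placeholders:

```
schtasks /create /tn "SARM Reports" /tr "C:\scripts\sarm_reports.bat" /sc minute /mo 5
```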
3. Display SARM reports in a browser
To enable all team members to easily access the SARM reports, we can create a simple HTML file. Good ol' frames do a world of good here. Set the meta refresh tag to 5 minutes (300 seconds) and you have a self-refreshing system monitor at the lowest possible cost.
<html><head><title>SARM Performance Monitor</title>
<meta http-equiv="refresh" content="300"></head>
<frameset border="1" bordercolor="#CCCCCC" rows="25%,25%,25%,25%">
<frame src="time.txt" name="time">
<frame src="users.txt" name="users">
<frame src="views.txt" name="views">
<frame src="subareas.txt" name="subareas">
</frameset>
</html>
This might not be for the faint of heart, but the result is a plain four-pane system monitor that refreshes itself every five minutes.
4. Go figure
We have set the stage; now you can expand on it. What would you do next? Which reports would you provide? Please use the comments.
Have a nice day.