Monday, March 21, 2011

Poor man's Siebel monitor

Are you a Siebel administrator and need to know which are the busiest hours on your Siebel Servers?

Are you a project team lead and want an overview of the most popular Siebel views?

Are you a support engineer and need a list of users running expensive queries?

Are you a Siebel developer and wish to compare system performance before and after your configuration changes?

And you don't want to spend extra money on any of this?

If you answer any of the above questions with "Yes", then you should really consider turning on SARM.

I am still surprised how little attention SARM (splendid blog post by Chung Wu here) gets in Siebel projects. It is easy to enable, it doesn't noticeably hamper overall performance, and you can precisely control the amount of disk space used by the SARM files. If you doubt any of this, please read on ;-)

By using srvrmgr commands similar to the following, we can enable SARM in our Siebel environment:

change param SARMLevel=2 for comp OURObjMgr_enu
change param SARMPeriod=1 for comp OURObjMgr_enu
change param SARMThreshold=100 for comp OURObjMgr_enu
change param SARMMaxFiles=5 for comp OURObjMgr_enu
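The four changes above can also be applied in a single srvrmgr call; a sketch, assuming placeholder values for the gateway, enterprise, server and credentials (srvrmgr accepts several parameters in one "change param" statement):

```shell
REM Sketch only: gateway, enterprise, server and credentials are placeholders.
srvrmgr /g gatewayhost /e SBA_ENT /s siebsrvr1 /u SADMIN /p secret ^
  /c "change param SARMLevel=2, SARMPeriod=1, SARMThreshold=100, SARMMaxFiles=5 for comp OURObjMgr_enu"
```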

I stated above that SARM doesn't hamper system performance. By setting the SARMThreshold parameter to 100 (milliseconds), you avoid recording events whose total execution time is 100 ms or less. This greatly reduces the amount of data, both at the time of writing to disk (SARM I/O) and later when analyzing the SARM data with sarmquery.

By using the SARMMaxFiles parameter, we can control how many SARM files are kept for each process before the oldest file is recycled.

So how can we answer the questions above?

After SARM has been enabled, the server processes start producing SARM files. I typically use scripts similar to the following to copy the files to a different machine, produce output suitable for human eyes and display it in a browser:

1. Copy SARM files to a central location

We could use the SARMLogDir parameter to write the SARM files directly to a shared directory, but I try to avoid this because of the latency the network adds while the files are being written.
Instead, I use a copy command in a scheduled batch file like the following to transfer all SARM files to a directory on a machine with enough resources to run the sarmquery utility:

xcopy \\serverhost\siebelroot\siebsrvr\log\*.sarm C:\Temp\sarmdata /S /Y

This collects all SARM files in the sarmdata directory.
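To automate the transfer, the copy command can be registered with the Windows task scheduler; a sketch, assuming a 5-minute interval (the task name is an example, adjust paths to your environment):

```shell
REM Sketch: register the copy as a scheduled task running every 5 minutes.
schtasks /create /tn "SARM Copy" /sc minute /mo 5 ^
  /tr "xcopy \\serverhost\siebelroot\siebsrvr\log\*.sarm C:\Temp\sarmdata /S /Y"
```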

2. Produce SARM reports

Now we can run sarmquery against the collected data. This post focuses on a simple SARM monitoring solution; if you wish to learn more about sarmquery, please refer to the documentation. The following sarmquery commands produce pseudo-plots which allow a quick glance at the current state of the system. We use the shell's output redirection to write the results to text files in the web server directory.

sarmquery -input C:\Temp\sarmdata -agg time=15 > C:\Inetpub\wwwroot\sarm\time.txt
sarmquery -input C:\Temp\sarmdata -agg user > C:\Inetpub\wwwroot\sarm\users.txt
sarmquery -input C:\Temp\sarmdata -agg subarea > C:\Inetpub\wwwroot\sarm\subareas.txt
sarmquery -input C:\Temp\sarmdata -agg instance -sel subarea=swepage_view_build > C:\Inetpub\wwwroot\sarm\views.txt

All scripts can be run by the OS scheduler every 5 minutes, so the files are refreshed at that interval.
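For scheduling, the four sarmquery calls fit naturally into one batch file; a sketch using the same paths as above (the file name sarm_reports.bat is an example):

```shell
REM sarm_reports.bat - sketch; regenerates all report files in one run.
@echo off
set SARMDATA=C:\Temp\sarmdata
set WEBROOT=C:\Inetpub\wwwroot\sarm
sarmquery -input %SARMDATA% -agg time=15 > %WEBROOT%\time.txt
sarmquery -input %SARMDATA% -agg user > %WEBROOT%\users.txt
sarmquery -input %SARMDATA% -agg subarea > %WEBROOT%\subareas.txt
sarmquery -input %SARMDATA% -agg instance -sel subarea=swepage_view_build > %WEBROOT%\views.txt
```

A single scheduled task can then run this batch file right after the copy step, so the reports always reflect the freshest SARM data.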

3. Display SARM reports in a browser

To enable all team members to easily access the SARM reports, we can create a simple HTML file. Good ol' frames do a world of good here. Set the meta refresh tag to 5 minutes (300 seconds) and you have a self-refreshing system monitor at the lowest possible cost.

<html><head><title>SARM Performance Monitor</title>
<meta http-equiv="refresh" content="300"></head>
<frameset border="1" bordercolor="#CCCCCC" rows="25%,25%,25%,25%">
<frame src="time.txt" name="time">
<frame src="users.txt" name="users">
<frame src="views.txt" name="views">
<frame src="subareas.txt" name="subareas">
</frameset>
</html>

This might not be for the faint of heart, but this is what it looks like:

[Screenshot: SARM Performance Monitor in the browser]

4. Go figure

We have set the stage. Now you can expand. What would you do next? Which reports will you provide? Please use the comments.

have a nice day



Chung Wu said...


Thanks for your compliment. I posted a response on my blog.


ns said...

This sounds exciting. I am going to implement this right away. Thanks Alex and Chung Wu for sharing this.

Anonymous said...

Hi Alex,
Nice post!
I would create reports to show how all custom workflows/scripts are performing.

-select area=workflow -select instance="ABC*" -aggregate instance

-select area=script -select instance="ABC*" -aggregate instance


OdEd said...

Good job. I've been pushing SARM for a long, long time as a good starting point - and you would be surprised how few Siebel experts have ever paid it any attention. Great article; I have linked to it from our Facebook group.

coco said...

Hi Alex

Do you have an explanation why I’m getting different results for the same user id,
when comparing the view invocations in sarm logfiles and the usage tracking log files for the same period of time?

Thanks & Regards

sai kishor said...

Hi Alex, can you please let me know the usage of the parameter :CreateSupplementalFile:True

Alexander Hansal said...

Hi Sai,

I believe you're referring to the input argument for the InsertRecord method of the Inbound E-mail Database Operations business service.

Documentation on this parameter is non-existent, and while I cannot confirm it, I believe it has to do with controlling attachment creation.

have a nice day