Playing with CAF (OBIEE Content Accelerator Framework)


I am right back after a hectic day-to-day schedule! Now I will try to explore a new OBIEE feature called the Content Accelerator Framework (CAF), version 1. It is a free utility available on OTN and is not yet part of the default OBIEE 10g installation. A possible reason could be that this utility arrived midway through the transition between 10g and the future 11g version. Hopefully Oracle will provide a polished, automated CAF release bundled with 11g to thrash out the legacy web replication method and offer a foolproof, intelligent migration tool.

What is CAF? The Objective and Purpose

It is an RPD and catalog migration utility to replicate/clone and synchronize RPD objects and existing reports from a source environment to a target OBIEE environment. It also helps create duplicate reports and duplicate logical RPD constructs from one OBIEE environment to another, even if the source and target environments have nothing in common beyond a basic logical model at the target. This in turn helps propagate changes made to the RPD/catalog from one environment to the other, and it can even be done by selecting each individual object.

Overcoming Legacy Web Replication

Webcat replication seems an old-school method, while CAF sets out to make a revolutionary change in the migration arena. The tool is simply amazing with its wide variety of features. As we all know, webcat replication is not a very robust method of synchronization; there are lots of glitches when anybody tries to adopt it. Not only that, configuration and network-based setup are very important for successful webcat replication, which can be a headache for a configuration manager. It is, however, quite good for cluster-based synchronization, just not for migration as such. With the CAF approach the migration task becomes much simpler and less cumbersome to handle. As the entire catalog concept has moved away from the legacy .webcat file since Oracle took over the product, it is predictable that the replication strategy needs to change as well. Let's see what OBIEE 11g comes with!!!

Why this Blog Thread

Lots of so-called OBIEE gurus have already blogged about CAF. However, my intention is to blog about its nitty-gritty after a deep dive, so that readers can feel how it works across various proof-of-concept approaches.

Installation Prerequisites

1) OBIEE 10.1.3.4 or later (not OBIEE 11g!)

2) JDK 1.6 or later (Note: make sure the JDK version is correct and that the correct PATH entry exists as an environment variable)

Installation and Configuration

Download CAF and extract it under the path below. Note: do not rename the temporary subdirectory "C:\OracleBI\TEMP_CAFV1_Install", and keep the extracted content under the mentioned path and directory to avoid any issues.

Now make sure OBI Catalog Manager is not running and all its processes are killed, i.e. catalogmanager.exe, java.exe, and javaw.exe. Otherwise an unwanted JAR error will be thrown.

Navigate to the directory ..\OracleBI\web\catalogmanager\plugins.

Create a backup of the file com.siebel.analytics.web.catalogmanager_1.0.0.jar and move the backup file outside of the OBIEE directory structure.
Delete the original com.siebel.analytics.web.catalogmanager_1.0.0.jar file from ..\OracleBI\web\catalogmanager\plugins.
Let's try to open Catalog Manager by selecting Start > All Programs > Oracle Business Intelligence > Catalog Manager. Catalog Manager should not open, and you should receive an error message similar to the following:

Now run the batch file from the extracted directory and continue by pressing "Y".

Now try to open Catalog Manager once again, and you will see two new options added, 'Synchronize' and 'Clone'. This indicates that CAF has been installed and configured successfully.

So let's discuss how it works functionally, from several perspectives, in a step-by-step set of approaches.

Approach 1 – Cloning Catalog Object from Dev to Test Environment

CAF Cloner Framework

I have created an RPD called dev_rpd, duplicated it, and renamed the copy test_rpd. dev_rpd will be used as the source RPD and test_rpd as the target RPD. Both RPDs have the same subject area and the same metadata objects configured, based on the default 'EMP' and 'DEPT' table joins. A report and a prompt have been created in 'dev_catalog'; this will be used as the source for the target catalog 'test_catalog'. The online RPD is dev_rpd, and it has been copied to "\OracleBI\server\Repository\dev" for offline access. The other offline RPD is test_rpd, kept under "\OracleBI\server\Repository\test".

Let's configure instanceconfig.xml to point to "test_catalog" and start Presentation Services. Initially it will be a blank catalog. We will now see how to synchronize dev_catalog items into the online test_catalog.

Open Catalog Manager, open dev_catalog offline, and click Clone. That means I want only the Emp report to be cloned into the target blank web catalog, i.e. test_catalog.

This will pop up the Content Accelerator wizard. If the target catalog is not online, it will throw the error below:

Otherwise, move to the next screen to select the source and target offline RPD locations and the online target catalog URL prefix (the BI service plugin 'analytics' should be excluded).

Now choose the target subject area to which you want the items migrated.

Click Next. This will parse the Emp request/report XML and extract its columns for display in the left pane, while the right pane shows the presentation tables from the target RPD.

Below, I have clicked EMPNO in the left pane and double-clicked EMPNO from the EMP table in the right pane to get the final mapped values. As you can see, hovering the mouse shows the data type of the target mapped column. This data type is extracted and identified from the target RPD automatically.

Any incorrect mapping will throw the error below:

Do all the mappings as below:

Let's change the target request name from the default "/shared/Cloned" to keep it in sync with the source catalog. Check the "Create Dashboard" option to make a new dashboard "emp_dashboard" in the target catalog. (You can skip the dashboard creation option; in that case only the Answers request will be cloned at the target.) Also note that we have a dashboard of the same name, "emp_dashboard", in "dev_catalog", with the same emp report added under it, which we want cloned under the same name in the target catalog "test_catalog".

You can edit the target request name as well. Finally, click Finish. This produces the successful cloning log below.

Now let's log in to the online 'test_catalog': I can see a new dashboard created with the same report, and it is also available in Answers.

So the cloning has succeeded from the source development environment to the target test environment.

You can notice two things: the target RPD test_rpd was renamed to test_rpd_backup before the change, and the target catalog report description was changed, i.e. marked red in the above image. How pretty and careful CAF is!!!

To be Continued ……….

Updated as of 17-Feb-2011 … I am not resuming this thread anymore, as the new OBIEE 11g comes with enhanced migration capabilities without CAF …

OBIEE Magic Trick !!!


Has anybody ever faced the typical enhancement requirement on existing dashboard reports of enabling column sorting for all tabular reports? I'm sure somebody definitely has. I recently faced this kind of new requirement from one of my clients.

Dashboard Column Sorting

Think of it: you have 500 reports already in place, and it is a very tedious job to go through all the reports individually and put a tick against the "Enable column sorting in dashboards" option. Though this is very simple and straightforward, it is too irritating and hectic for developers, as each report has to be modified, and that takes 7-8 clicks from editing through saving the request. So altogether you would be saving roughly 500 × 8 = 4,000 clicks and, overall, a substantial number of minutes 🙂 Don't go for a rapid-fire round!

So how would it be if we had a tweak that frees developers from this boring, hectic, monotonous job? Yes, I know, some of you are already thinking smartly of a change to the stylesheet/msgdb XML templates or some tweak to instanceconfig.xml.

Fortunately, I came to know that it is achievable via "global sorting of dashboards by changing XML", i.e. modifying the answerstemplate.xml file in \\OracleBI\Web\msgdb\messages.

Dashboard Column Sorting2

The only thing you have to do is search for the string below in the answerstemplate.xml file (before the change):

<WebMessage name="kuiCriteriaDefaultViewElements" translate="no">
Dashboard Column Sorting3

And replace it with the below (after the change):

<WebMessage name="kuiCriteriaDefaultViewElements" translate="no"><HTML><view signature="tableView" sortable="true" /></HTML></WebMessage>

Dashboard Column Sorting4

This has been well documented by Oracle; see this link. After making the change and restarting your services, your reports should be automatically sortable, and if you edit a request, the sorting-enable option will be checked by default.

Unfortunately, this is not the end of the story, and that is why it is:

—————— xxxxxx A MAGIC THREAD  xxxxxx ——————-

I implemented the above solution on my OBIEE 10.1.3.4 installation and unfortunately it does not work well. So I raised an Oracle SR, and it revealed that this is a product bug, and hence, I presume, a documentation bug. An Enhancement Request (P3/P4 Bug 6619910) is currently with Oracle Support, to be considered for review by the development team for a future bug-fix release.

Alas! What about sorting my 500 reports in a moment, then?? Yes, there is a way, and you don't need to keep your fingers crossed 🙂

Open your shared folders using Catalog Manager and do an "XML Search and Replace" as below, in offline mode:

Replace name="tableView!1" rptViewVers="200510010" with
name="tableView!1" rptViewVers="200510010" sortable="true" . Then just restart Presentation Services.

Dashboard Column Sorting5

Voila … it's all magic, see below! Yes, all reports have been made sortable dynamically in the dashboard within a couple of seconds. Note that the above method is not documented, so do it at your own risk. My suggestion is to keep a backup of the catalog before performing these steps.

Dashboard Column Sorting6

Now, just think of how much effort you have saved enabling sort for your dashboard reports … like MAGIC, and less tricky so far! 🙂 🙂 🙂

Caching – A Cacophony in OBIEE Tuning !!!


Well! A very interesting and controversial topic indeed, typically for group discussions. OBIEE is not yet very popular in the OLAP caching arena. There is a hell of a lot of debate humming around BI Server caching, and obviously around the performance-tuning aspects that follow from it. There are genuine benefits achievable in the performance arena, though at the cost of a couple of trade-offs. At the end of the day the basic rule is: do enable caching, but do it judiciously, and make sure it will not get you into trouble in the future. So caching needs to be configured by weighing the substantial benefits against the trade-offs. Here I am focused only on OBI Server (SAS)-level caching rather than the OBAW server (SAW).

Introduction

OBI has the ability to cache query results, such that previously processed requests do not pass through to the database but are instead served from the filesystem.

Caching Framework

OBIEE Caching Concept

Benefits

It provides significant performance improvements, as it frees up database resources to perform other tasks, so the DB can entertain other sets of user queries.

The native filesystem should process queries and retrieve results faster than to-and-fro communication with the database over a network channel.

Moreover, it conserves network resources by avoiding connections to the database server, so noticeably less network traffic is engaged.

It also reduces the BI Server engine's processing overhead across queues of user requests.

However, all the above is as difficult to manage as it is nice to comprehend: providing the flexibility of faster responses can leave users with an inconsistent state of data if it is not properly managed and planned.

This management and planning have several glitches and lots of dependencies; this discipline is called Cache Purging, as opposed to its counterpart, Cache Seeding.

Cache Management Techniques

Change the cache configuration parameters first. Note that if the disk space for cache files exceeds its limit, the least recently used (LRU) entries are discarded and replaced automatically to make room for new entries. The techniques mentioned below can be broadly distinguished as manual, automatic, and programmatic ways of managing caching.
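
For reference, here is a minimal sketch of the [ CACHE ] section of NQSConfig.ini that governs these parameters. The parameter names are the standard OBIEE 10g ones; the path and size values are purely illustrative assumptions, not recommendations:

[ CACHE ]
ENABLE = YES;
// One or more <directory size> pairs; spreading cache storage across
// multiple mount points helps I/O (the path and cap below are illustrative).
DATA_STORAGE_PATHS = "C:\OracleBIData\cache" 500 MB;
MAX_ROWS_PER_CACHE_ENTRY = 100000;  // 0 means no row limit
MAX_CACHE_ENTRY_SIZE = 1 MB;
MAX_CACHE_ENTRIES = 1000;
POPULATE_AGGREGATE_ROLLUP_HITS = NO;
USE_ADVANCED_HIT_DETECTION = NO;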

Cache Seeding

There are several means to do this business:

1) Set the global cache parameter on – this will cache queries running against any physical table. By default, every table in the repository is cacheable.

2) Switch on the Cacheable property – this provides table-level benefit and extra customization of which physical tables should participate in generating the query cache. E.g., sometimes users are more interested in caching a giant fact table rather than tiny dimension tables.

3) Scheduling iBots – an iBot can be properly configured and used for cache-seeding purposes. This will silently build the cache without any manual intervention. It can be triggered in a time window after the daily ETL load finishes, or further customized and automated based on the result retrieved from another iBot in a chained request. The second iBot pings the DB to identify whether a database update has been done (after the ETL finishes) before triggering its chained counterpart. This builds the data and query cache for a dashboard request, not for an entire set of tables.

4) Running the nQCmd utility:

Another fantastic way to handle query caching, which has no dependency on the ETL load. The overhead is to accumulate the superset of the actual logical queries fired for the requests/reports and put them in a single file to pass as a parameter to nQCmd.exe. This necessarily needs to be invoked after the ETL run, ideally by the ETL job itself. It can be done by logging in to the BI server remotely and triggering nQCmd automatically; thus the iBot scheduling time dependency can be avoided.
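
A minimal sketch of how this can look: the seed file contains the logical SQL of the reports you want to warm up, and nQCmd replays it against the BI Server DSN. The subject area, column names, and file names below are illustrative assumptions; the nqcmd switches (-d DSN, -u user, -p password, -s script, -o output) are the standard ones:

-- seed_queries.sql : logical SQL of the reports to pre-warm the cache
-- (the subject area and columns below are illustrative).
SELECT Emp.ENAME saw_0, Emp.SAL saw_1 FROM "Emp Subject Area" ORDER BY saw_0;

-- Replay the file after the ETL run, e.g. from the ETL job itself:
-- nqcmd -d AnalyticsWeb -u Administrator -p <password> -s seed_queries.sql -o seed_output.log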

Cache Purging

A very important and crucial mechanism, which must prove good and perfect to make caching a success story:

1) Manual purging – usually a dedicated, dumb administration job, kind of an overhead for a company. It can be done simply by deleting the existing cache TBL files, or by firing a purge from the BI Admin Tool. The purge can be done by category (repository, subject area, user, or physical table) from the Admin Tool in online mode.

2) Calling ODBC extensions – bundled ODBC extension functions like SAPurgeCacheByDatabase(), SAPurgeCacheByQuery(), SAPurgeCacheByTable(), SAPurgeAllCache(), etc. can be called to free the cache for specific queries, tables, or databases, or for everything. See the Oracle documentation for details. These should be called using the nQCmd utility, just after the ETL load and before the cache seed, to ensure there is no time-window gap or dependency.
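
As a sketch, such a purge script can be replayed through nQCmd just after the ETL load and before re-seeding. SAPurgeAllCache() takes no arguments; for SAPurgeCacheByTable() the database/catalog/schema/table names shown are illustrative assumptions and must match the physical layer of your repository:

-- purge_cache.sql : replayed via nQCmd before the cache seed.
Call SAPurgeAllCache();
-- Or, more selectively, purge only the entries built on one physical table:
Call SAPurgeCacheByTable( 'ORCL', '', 'SCOTT', 'EMP' );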

3) Event polling table – a nice and robust concept, but not so nice to manage, and it carries extra overhead. It is really a good technique for making the BI server aware that a DB update has been done so it can carry on with its purging business. The cache polling frequency is an important setting; it should be predefined and robust to make this a success. The polling table is populated by an automatic insert from a DB trigger each time the target DB tables are updated. The Analytics server polls that table at the specified interval and invalidates cache entries corresponding to the updated tables.
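
For illustration: the polling table itself is the standard S_NQ_EPT structure shipped with OBIEE (via the SAEPT.Oracle.sql schema script), while the trigger below is a hypothetical example of the "automatic insert DB trigger" mentioned above. The 'ORCL'/'SCOTT'/'EMP' names are assumptions and must match your repository's physical layer:

-- Standard event polling table structure (per the SAEPT.Oracle.sql script).
CREATE TABLE S_NQ_EPT (
  UPDATE_TYPE    NUMBER(10)    DEFAULT 1 NOT NULL,
  UPDATE_TS      DATE          DEFAULT SYSDATE NOT NULL,
  DATABASE_NAME  VARCHAR2(120) NULL,
  CATALOG_NAME   VARCHAR2(120) NULL,
  SCHEMA_NAME    VARCHAR2(120) NULL,
  TABLE_NAME     VARCHAR2(120) NOT NULL,
  OTHER_RESERVED VARCHAR2(120) NULL
);

-- Hypothetical trigger: each ETL change to EMP leaves one row for the
-- BI server to poll, so it can invalidate cache entries built on EMP.
CREATE OR REPLACE TRIGGER EMP_EPT_TRG
AFTER INSERT OR UPDATE OR DELETE ON EMP
BEGIN
  INSERT INTO S_NQ_EPT (UPDATE_TYPE, UPDATE_TS, DATABASE_NAME,
                        CATALOG_NAME, SCHEMA_NAME, TABLE_NAME)
  VALUES (1, SYSDATE, 'ORCL', NULL, 'SCOTT', 'EMP');
END;
/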

4) Calling iBots to purge the cache – this can be done by calling a custom script, which in turn calls nQCmd and the ODBC extensions to free the cache. However, the catch with this feature is again that the iBots need to be scheduled just after the ETL run and before the cache seed, so you cannot be sure about stale data if the ETL does not meet its SLA. Again, this can be handled by setting up chained iBots to trigger the purging activity at the proper time. So the ground rule is: never rely on the iBot schedule alone; let it pick the status from the DB to trigger it.

Trade-off

Not purging outdated caches, known as stale caches, can potentially return inaccurate results over time. Think of a situation where your cache is not purged in time after the ETL load finishes. In this scenario, though the database has been updated, the change is not reflected in your cached data, as the seeded cache holds outdated data on your filesystem; the result is stale data throwing inconsistent and wrong results. This can cause huge confusion in users' minds, so cache retention and refresh times are important.

Not only that: on a large platform, caching should be separated and spread across multiple folders/mount points for better utilization of I/O across the filesystems. The query cache storage should be on local, high-performance, highly reliable storage devices. The space consumed by the cache files can become a bottleneck for disk usage; it should be managed with a proper cache replacement algorithm and a policy on the maximum number of cache entries, defined in the NQSConfig.ini file.

A potential cache-related problem is found in clustered environments, where the cache files built on one native host are not sharable with the other; this leads to the same cache being built across cluster participants, as it cannot be ensured which BI server will handle which request. Until the request gets processed, a cluster participant cannot determine that there was already a cache hit for the generated query and that the request did not need to be processed by the other clustered BI server. Purging, likewise, needs to be done on both clustered servers. Cluster-aware cache is propagated across clustered servers only when the cache is seeded via iBots, not via general dashboard requests or Answers-based queries.

So finally, if you understand your business and requirements, you can achieve success with cache management by availing yourself of any of the above techniques. But again, be cautious: caching is not such a reliable candidate, at least in my experience. Hopefully Oracle will look into making it robust and non-debatable. Wish you the best, and good luck with your cache implementation 🙂 🙂


Presentation Catalog/Services Caching


As we know, there are two kinds of caching available in OBIEE. One of them is the most popular and familiar, i.e. BI Server caching; the other is BI web server caching. Presentation catalog caching is rarely used, and it rarely attracts any attention for performance. However, its most significant use is to bypass the default report-level caching in a real-time BI application, i.e. a typical transactional application where the back-end data changes moment to moment and needs to be reflected in the Analytics report (though BI is not built for OLTP queries). In such a scenario it is meaningful for developers to forcibly bypass the web query cache and let Analytics hit the database once again to fetch updated data, if any. This can be done manually by pressing the Refresh button at the bottom left of the browser window.

However, if there are multiple dashboards and, say, user1 wants dashboard1's data updated in real time after the back-end ETL populates, then asking the user to click the Refresh button each time is not the best option, as a fickle-minded user1 could see stale data and unnecessarily raise a hue and cry over old data. That can easily be avoided using the 'Bypass Web Server Cache' feature in the Advanced tab of the corresponding reports.

Bypass NQQuery Cache

This feature not only gives users the flexibility to customize the set of reports where real-time data reflection is needed, but also avoids a global change to the cache config parameters in instanceconfig.xml, which would apply to the entire platform and might well not be the right choice at the right time!

If you are running multiple instances of Oracle BI Presentation Services (through either replication or clustering), you need to add the following entries to the Oracle BI Presentation Services configuration file (instanceconfig.xml) to manage when the presentation catalog cache is updated from disk:
<Catalog>
<AccountIndexRefreshSecs>120</AccountIndexRefreshSecs>
<AccountCacheTimeoutSecs>180</AccountCacheTimeoutSecs>
<PrivilegeCacheTimeoutSecs>180</PrivilegeCacheTimeoutSecs>
<CacheTimeoutSecs>120</CacheTimeoutSecs>
<CacheCleanupSecs>600</CacheCleanupSecs>
</Catalog>

OBIEE “Act As Proxy” Feature


Right! I am back with an article on an OBIEE 10g feature called "Proxy User", also known as "Act As Proxy".

So here we will learn about the usage of the web catalog privileges option below:

Act As Proxy0

The background of this utility is that, using this feature, one user can act as a proxy on behalf of another user, here called the target user, in such a way that the proxy user can edit and manipulate the requests created by the target user. The Administrator has to assign the proxy permission to the group of people who will act as proxy users; thus the Administrator delegates the permissions. Which proxy user can access which target user's reports is determined by a mapping defined in a back-end table. So whatever reports and dashboards the target user has access to can be visualized by the proxy user.

Sometimes a global administrator wants to see target users' reports for debugging purposes. In this scenario he/she might be interested in acting as a proxy user and assigning 'Act As Proxy' to himself/herself. This is a typical scenario where this feature is suitable.

The steps below walk through the proxy feature:

1) For my case, I have built a repository using the SCOTT schema 'EMP' and 'DEPT' tables. Apart from that, I have created a PROXY_TABLE to maintain the relationship between the proxy and target user lists, as below, and added that table to my repository.

Act As Proxy1

So here ACCESS_TYPE is a column which takes two values: full or restricted. If full access is provided, the proxy user has edit/read privileges on the target user's dashboards/pages/reports (N.B.: the proxy user himself should already have those read/edit privileges). If restricted access is provided, the proxy user has only read privileges (again, provided the proxy user himself has those privileges). A sketch of this table is given below.
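
For illustration, here is a minimal hypothetical sketch of how PROXY_TABLE could be created and populated; the column sizes are assumptions, and the sample rows match the proxyuser1/targetuser1 (full) and proxyuser2/targetuser2 (restricted) scenario walked through in the later steps:

-- Hypothetical DDL for the proxy-to-target mapping table referenced by the
-- session init blocks and the logon template later in this post.
CREATE TABLE PROXY_TABLE (
  PROXY_USER  VARCHAR2(50) NOT NULL,  -- user who acts as proxy
  TARGET_USER VARCHAR2(50) NOT NULL,  -- user being impersonated
  ACCESS_TYPE VARCHAR2(10) NOT NULL   -- 'full' or 'restricted'
);

INSERT INTO PROXY_TABLE VALUES ('proxyuser1', 'targetuser1', 'full');
INSERT INTO PROXY_TABLE VALUES ('proxyuser2', 'targetuser2', 'restricted');
COMMIT;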

2) My RPD configuration is as below. I have also created the six users in the RPD that appear above, among which 'proxyuser1', 'proxyuser2', and 'proxyuser3' have been added under the user group 'Proxy_User_Groups'. This user group has to be created in the web catalog as well, so that Act As Proxy can be granted only to that group.

Act As Proxy2

3) Now we have to create the session initialization blocks and variables that fetch the target user list based on the logged-in proxy user.

Act As Proxy3

Hence I create the init block 'Set_RUNAS', which populates a non-system session variable RUNAS; this variable is referenced later in the custom logon template file (see the XML code in point 7) and acts as a parameter to that template. Since we will pass this variable to the next init block, let's populate it using a dummy SQL:

SELECT 'SOMETHING' FROM DUAL

Act As Proxy4

4) Let's initialize a system session variable 'PROXY' using the query below, and set the proper execution precedence.

SELECT TARGET_USER FROM PROXY_TABLE
WHERE TARGET_USER = 'VALUEOF(NQ_SESSION.RUNAS)'
AND PROXY_USER = ':USER'

Act As Proxy5

5) Initialize the proxy-level variable as a non-system session variable to define the access type, using the SQL below:

SELECT ACCESS_TYPE FROM PROXY_TABLE
WHERE TARGET_USER = 'VALUEOF(NQ_SESSION.RUNAS)' AND PROXY_USER = ':USER'

Act As Proxy6

6) Now let's put the instanceconfig.xml entry below just before the </ServerInstance> tag:

<LogonParam>
<TemplateMessageName>ProxyLogonParamTemplate</TemplateMessageName>
<MaxValues>200</MaxValues>
</LogonParam>

This ensures that a new message template can be called, where the template name is "ProxyLogonParamTemplate", and MaxValues specifies the maximum number of users that can be offered under "Act As".

Act As Proxy7

7) We need to define the custom message template as below and save the file as "ProxyUser.xml" under "..\OracleBI\web\msgdb\customMessages".

This custom message template for the proxy functionality contains the SQL to get the list of target users that a proxy user can Act As. This list appears in the User box in the Act As dialog box.

Act As Proxy8

<?xml version="1.0" encoding="utf-8" ?>
<WebMessageTables xmlns:sawm="com.siebel.analytics.web.messageSystem">
<WebMessageTable system="SecurityTemplates" table="Messages" translate="no">
<WebMessage name="ProxyLogonParamTemplate">
<XML>
<logonParam name="RUNAS">
<getValues>EXECUTE PHYSICAL CONNECTION POOL oradeba.Proxy_Conn
select TARGET_USER from PROXY_TABLE where PROXY_USER='@{USERID}'</getValues>
<verifyValue>EXECUTE PHYSICAL CONNECTION POOL oradeba.Proxy_Conn
select TARGET_USER from PROXY_TABLE where PROXY_USER='@{USERID}' and TARGET_USER='@{VALUE}'</verifyValue>
<getDelegateUsers>EXECUTE PHYSICAL CONNECTION POOL oradeba.Proxy_Conn
select PROXY_USER, ACCESS_TYPE from PROXY_TABLE where TARGET_USER='@{USERID}'
</getDelegateUsers>
</logonParam>
</XML>
</WebMessage>
</WebMessageTable>
</WebMessageTables>

8) Now restart the BI Server and Presentation Services.

9) Using targetuser1 and targetuser2, I have developed two reports, so that when proxyuser1 logs in, he/she is able to act as targetuser1 and can see the reports developed by targetuser1. Similarly, targetuser2's report is delegated to proxyuser2. proxyuser1 can edit/change targetuser1's reports, but proxyuser2 is not able to modify targetuser2's reports.

10) Using the Administrator login, go to Manage Privileges and, under "Act As Proxy", set the proxy user group. For my case, I have granted the privilege to "Proxy_User_Groups".

Act As Proxy9

11) I have logged in as proxyuser1 and landed on the reports below, which are proxyuser1's own reports, not targetuser1's. Now click "Act As" under the Settings link and you will get a small window, populated with 'targetuser1'. As many target users as are defined for this proxy will be populated in this drop-down list.

Act As Proxy10

Act As Proxy11

12) Now click "OK", and you land on the report developed by 'targetuser1' in his/her own dashboard. As the 'Edit Dashboard' option is enabled, you can make any modification here and save the report. The next time 'targetuser1' logs into the system, he/she can easily spot the changes made by someone: here, by the proxy!

Act As Proxy12

13) Now log out from 'proxyuser1' and log in as 'proxyuser2'. Go to Settings and click "Act As"; then select 'targetuser2' and click "OK". Now you can see 'targetuser2's dashboard reports. Also, the 'Edit' option is disabled, since 'proxyuser2' has the 'restricted' access level defined in PROXY_TABLE.

Act As Proxy13

14) So that's all about the OBIEE proxy user feature! So small, but so robust … 🙂