Wednesday, June 27, 2012

OBIEE Default security groups for BI Publisher
If you are using OBIEE and BI Publisher side by side with the shared security model (XDO), you will need to create some default user groups:


These groups are used by the XMLP server:
    XMLP_ADMIN
    XMLP_DEVELOPER
    XMLP_SCHEDULER
    XMLP_ANALYZER_EXCEL
    XMLP_ANALYZER_ONLINE
    XMLP_TEMPLATE_DESIGNER
If you are working with multi-user repository development, remember to assign the groups to at least one project, or the group will be lost at check-in.


Tuesday, June 26, 2012

OBIEE Grand Totals with Calculated Columns (repost)

This is a bit of a specific problem, but I could see it being something that those new to OBIEE could run into. It deals with incorrect Grand Total values being displayed for a Calculated Column when using report filters.
Consider the following report:
[Image: unfiltered_unagg]
In this report, Amount A and Amount B are standard columns in a database table, and Variance is a calculated column in the repository. The calculation for Variance is 100 * (Amount B - Amount A) / Amount A. So, going by the Grand Total amounts in the report above, 100 * (696 - 550) / 550 = 26.55.
Suppose you want to filter the results of this request so that it only shows items having a Variance greater than 30:
[Image: filtered_unagg]
Note that the Grand Totals for Amount A and Amount B are summed correctly for the filter, but the Variance still shows 26.55. Obviously, this is incorrect, so what’s going on here? The filtered amount should be reported as 100 * (220 - 130) / 130 = 69.23.
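The arithmetic above can be reproduced in a few lines; this is just a sketch of the variance formula from the post, not OBIEE code:

```python
# The repository calculation for Variance: 100 * (Amount B - Amount A) / Amount A.
# The point of the post: the grand-total variance must be recomputed from the
# *filtered* totals, not carried over from the unfiltered ones.
def variance_pct(amount_a, amount_b):
    return round(100 * (amount_b - amount_a) / amount_a, 2)

unfiltered = variance_pct(550, 696)  # 26.55 - correct for the full result set
filtered = variance_pct(130, 220)    # 69.23 - what the filtered Grand Total should show
```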
If you want to show the correct variance for the filtered Grand Total amounts, you need to enable an option in your instanceconfig.xml file. Open the file $OracleBIData/web/config/instanceconfig.xml, and add the following line somewhere inside your <ServerInstance> and </ServerInstance> section:

    <ReportAggregateEnabled>true</ReportAggregateEnabled>

    Make sure you don’t accidentally put this inside an element nested inside <ServerInstance>, or it won’t work.
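To make the placement concrete, here is a sketch of the file; only ServerInstance and ReportAggregateEnabled come from the steps above, the rest is an illustrative placeholder:

```xml
<ServerInstance>
   <!-- ... other existing settings ... -->
   <!-- The new element goes directly under ServerInstance, not nested deeper -->
   <ReportAggregateEnabled>true</ReportAggregateEnabled>
</ServerInstance>
```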

Save instanceconfig.xml, and restart your BI Server and Presentation service.
Now, the correct amount should be shown for the filtered Grand Total variance:
[Image: filtered_agg]
This was previously a bug in OBIEE that was addressed by adding the ReportAggregateEnabled option. It’s not really documented anywhere other than on Metalink, so hopefully this will be helpful to someone.
This article was originally posted on the Kevin C. oraclebi blog. See: http://obiee101.blogspot.com/2009/09/obiee-blog-lost.html

Monday, June 11, 2012

Setting up usage tracking

OBIEE has a very good usage tracking system with low resource costs. Here is how to set up the Oracle variant.
Step 1: Create an Oracle schema:

CREATE USER obiee_usage_tracking IDENTIFIED BY "OBIEE_USAGE_TRACKING"
DEFAULT TABLESPACE "USERS"
TEMPORARY TABLESPACE "TEMP"
PROFILE DEFAULT
QUOTA UNLIMITED ON "USERS";
GRANT "CONNECT" TO obiee_usage_tracking;
GRANT "RESOURCE" TO obiee_usage_tracking;
GRANT CREATE SESSION TO obiee_usage_tracking;
GRANT CREATE TABLE TO obiee_usage_tracking;
GRANT CREATE VIEW TO obiee_usage_tracking;
ALTER USER obiee_usage_tracking DEFAULT ROLE NONE;

Step 2: Create the tracking table.

The tracking table script can be found in ...\OracleBI\server\Schema.
Use SAACCT.Oracle.sql for Oracle and run it in the OBIEE_USAGE_TRACKING schema created above.

Step 3: Make it available for the public
GRANT SELECT ON S_NQ_ACCT TO PUBLIC;

Step 4: Setting up the additional tables
You can find the scripts in ...\OBIEE\OracleBI\server\Sample\usagetracking\SQL_Server_Time

Run the following scripts:
Oracle_create_nQ_Calendar.sql
Oracle_create_nQ_Clock.sql
Oracle_nQ_Calendar.sql (warning: this calendar only goes up to 2016!)
Oracle_nQ_Clock.sql

Step 5: Make it available for the public

GRANT SELECT ON S_ETL_DAY TO PUBLIC;
GRANT SELECT ON S_ETL_TIME_DAY TO PUBLIC;


Step 5b: Create an extra view
(They forgot this one in the documentation)

CREATE OR REPLACE VIEW nq_login_group AS SELECT DISTINCT user_name AS login, user_name AS resp FROM s_nq_acct;
grant select on nq_login_group to public;

Step 6: Merge the usage tracking RPD into your master RPD.

Open your master RPD offline.

Save as "master_temp.rpd"

From the file menu select merge.

Select your original master.RPD.
Select the modified repository from ...\OracleBI\server\Sample\usagetracking\ (the password is empty).
Click MERGE.
If you get a warning at this point, don't worry about it; just click OK.
Save the merged RPD as your master.rpd.
Step 7: Move the usage tracking tables to a native connection pool. If you omit this and use ODBC, you can get "strange" errors.
Import 1 table from the OBIEE_USAGE_TRACKING schema.

Delete only the table and copy the connection pool

Drag and drop the table part from the imported Usage tracking rpd.

Rename the database and connection pools.

Check the connection pool data
Be sure to set both connection pools!



Step 8: Altering the NQSConfig.ini

The file can be found in: …\OracleBI\server\Config

Locate and Alter the following:
###################
# Usage Tracking Section
# Collect usage statistics on each logical query submitted to the
# server.
###################
[ USAGE_TRACKING ]
ENABLE = YES;
DIRECT_INSERT = YES;
PHYSICAL_TABLE_NAME = "OBI Usage Tracking"."Catalog"."dbo"."S_NQ_ACCT";
CONNECTION_POOL = "OBI Usage Tracking"."Usage Tracking Writer Connection Pool" ;
BUFFER_SIZE = 10 MB ;
BUFFER_TIME_LIMIT_SECONDS = 5 ;
NUM_INSERT_THREADS = 5 ;
MAX_INSERTS_PER_TRANSACTION = 1 ;

Step 9:
Restart the BI server. Check the server log for errors.

Step 10: Import the Usage Tracking Presentation Catalog into the existing Presentation Catalog. Extract the example catalog from the ZIP file found in ...\OracleBI\server\Sample\usagetracking\

Open one instance of the Oracle Business Intelligence Catalog Manager. Open the source Presentation Catalog from where we can copy the Usage Tracking content.

Open this Presentation Catalog in offline Mode.

Once you have opened the presentation catalog, click on the shared folder. In this folder you can find the folder Usage Tracking, which holds all related reports, filters and the dashboard page.


Open another instance of the Oracle Business Intelligence Catalog Manager.
Open the destination Presentation Catalog in online mode to which you want to copy the Usage Tracking content to. Open the shared folder. As you can see in the example below, this presentation catalog doesn’t have the Usage Tracking folder yet.

Go back to the source Presentation Catalog for the Usage tracking. Click on the Usage Tracking folder and click on Copy button in the upper left of the screen.


Go to the destination presentation catalog. Be sure the shared folder is opened. Right click in the folder window and click on Paste. Close both catalog managers.

Note: Some people worry when they don't see entries in the reports immediately. This is because OBIEE batches the usage tracking inserts. The batch size is influenced by the BUFFER_SIZE parameter, which specifies the amount of memory used to temporarily store insert statements. The buffer allows the insert statements to be issued to the usage tracking table independently of the query that produced the statistics. When the buffer fills up, subsequent queries' statistics are discarded until the insert threads service the buffer entries. When you stop the BI Server, it will try to insert all remaining entries.
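The buffering behaviour described above can be sketched as a toy model; this is my own illustration of the semantics (full buffer drops statistics, a flush writes what remains), not Oracle's implementation:

```python
from collections import deque

class UsageTrackingBuffer:
    """Toy model of the usage-tracking insert buffer: entries that arrive
    while the buffer is full are discarded, and a flush (insert threads
    servicing the buffer, or server shutdown) writes whatever is buffered."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = deque()
        self.written = []   # rows that reached S_NQ_ACCT
        self.discarded = 0  # statistics lost because the buffer was full

    def record(self, row):
        if len(self.buffer) >= self.capacity:
            self.discarded += 1       # query stats are silently dropped
        else:
            self.buffer.append(row)

    def flush(self):
        while self.buffer:
            self.written.append(self.buffer.popleft())

# Five queries arrive before any flush; the buffer only holds three.
buf = UsageTrackingBuffer(capacity=3)
for q in ["q1", "q2", "q3", "q4", "q5"]:
    buf.record(q)
buf.flush()
```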

Sunday, June 10, 2012

Rank Function

We will look at creating a rank measure in this post. We will use the sales_fact_1997 table for this example. Join the sales_fact_1997 table with the product, store, time_by_day and promotion tables in the physical layer. Drag the sales_fact_1997 and product tables to the logical layer and create the appropriate joins.
We will create a SalesRank logical column in the sales_fact_1997 logical table. Follow these steps to create the column.
1. Right-click on the logical table and select New Logical Column. Enter the name of the column as SalesRank, select the check box that says 'use existing logical column as source', and use the expression builder to build the rank expression as shown.


Create a Sales table in the presentation catalog and add the SalesRank, Unit_Sales, and Product Name columns.

Open Answers and create a new request. Drag the columns from the sales table to the request.

This creates a report of products with ranked sales. To select the top 10 use a filter.
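Conceptually, the rank measure and the top-10 filter behave like the following sketch (the product names and sales figures are made up for illustration; the real values come from sales_fact_1997):

```python
# What a RANK(unit_sales) measure computes: rank 1 goes to the highest value.
products = {"Pencils": 120, "Brushes": 340, "Rollers": 95, "Tape": 210}

# Rank products by unit sales, highest first.
ranked = sorted(products.items(), key=lambda kv: kv[1], reverse=True)
sales_rank = {name: i + 1 for i, (name, _) in enumerate(ranked)}

# A "top N" filter in Answers is then simply SalesRank <= N.
top_2 = [name for name, rank in sales_rank.items() if rank <= 2]
```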

Logging Level

The logging level is a parameter that controls the amount of information you will retrieve in the query log file, from nothing at all (level 0) to a great deal of information (level 5).

You can set a logging level for individual users; you cannot configure a logging level for a group.

In normal operation:

    users have a logging level set to 0
    administrators have a logging level set to 2

Logging levels and the information that is logged:

Level 0: No logging.

Level 1: Logs the SQL statement issued from the client application; elapsed times for query compilation, query execution, query cache processing, and back-end database processing; the query status (success, failure, termination, or timeout); and the user ID, session ID, and request ID for each query.

Level 2: Logs everything logged in Level 1. Additionally, for each query, logs the repository name, business model name, presentation catalog (called Subject Area in Answers) name, SQL for the queries issued against physical databases, queries issued against the cache, the number of rows returned from each query against a physical database and from queries issued against the cache, and the number of rows returned to the client application.

Level 3: Logs everything logged in Level 2. Additionally, adds a log entry for the logical query plan, when a query that was supposed to seed the cache was not inserted into the cache, when existing cache entries are purged to make room for the current query, and when the attempt to update the exact-match hit detector fails.

Level 4: Logs everything logged in Level 3. Additionally, logs the query execution plan.

Level 5: Logs everything logged in Level 4. Additionally, logs intermediate row counts at various points in the execution plan.




Event Polling Table

Event Polling Table to Purge the cache in OBIEE


An event polling table is a way to notify the Oracle BI Server that one or more physical tables have been updated. Each row that is added to an event table describes a single update event. The cache system reads rows from, or polls, the event table, extracts the physical table information from the rows, and purges cache entries that reference those physical tables.
Generally we purge the cache in one of two ways:
    From the Administration Tool: Manage > Cache
    From the dashboard: Settings > Issue SQL > Call SAPurgeAllCache()
Instead of doing this manually at regular intervals, we can let an event table purge the cache for us.
Procedure :
CREATE TABLE S_NQ_EPT
(
  UPDATE_TYPE   INTEGER      DEFAULT 1       NOT NULL,
  UPDATE_TIME   DATE         DEFAULT SYSDATE NOT NULL,
  DB_NAME       VARCHAR2(40),
  CATALOG_NAME  VARCHAR2(40),
  SCHEMA_NAME   VARCHAR2(40),
  TABLE_NAME    VARCHAR2(40)                 NOT NULL,
  OTHER         VARCHAR2(80) DEFAULT NULL
);
Once the table is created, we have to import it into the physical layer.
Go to Tools > Utilities.
Click Oracle BI Event Tables and Execute.
You will find the list of tables. Select S_NQ_EPT as an event table and also set the polling frequency.
Once we have selected the event table, we can see it in the physical layer as below.
Now create a sample report based on the Dept dimension to seed the cache.
Check the cache folder.
Once this is done, we need to insert data into the S_NQ_EPT table.
After the polling interval we specified, check the cache folder again; it will have been purged.
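As a sketch, the row you insert might look like the following; the database, schema and table names here ("orcl", SH, SALES) are examples, not from this post, and DB_NAME must match the physical database name in your repository:

```sql
-- Signal that physical table SH.SALES in database "orcl" was updated.
-- On the next poll the BI Server purges cache entries that reference it
-- and then removes this row from the event table.
INSERT INTO S_NQ_EPT
  (UPDATE_TYPE, UPDATE_TIME, DB_NAME, CATALOG_NAME, SCHEMA_NAME, TABLE_NAME)
VALUES
  (1, SYSDATE, 'orcl', NULL, 'SH', 'SALES');
COMMIT;
```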

Tuesday, June 5, 2012

OBIEE Variables

Variables in the Oracle BI Server can be either repository or session variables, and are defined using the Variable Manager within the Oracle BI Administrator (Manage > Variables).
[Image: Variable1]
This brings up the Variable Manager, which divides variables into repository and session variables, with a further subdivision into static and dynamic repository variables, and system and non-system session variables. You can also define variables at the Presentation Server level; these are scoped to individual dashboards and are used, among other things, for setting dashboard prompt values and integrating the dashboard with BI Publisher parameters. Here’s a screenshot of a typical Variable Manager screen.
[Image: Variable2]
So what are server (repository) variables used for? Well, if you’ve done any programming, either in languages like Visual Basic or on the web, the variables these platforms use work the same way as the ones the BI Server uses. As with web development, you can either have server variables that hold the same value across all sessions (in BI Server terms, a “repository” variable), or variables that hold a value specific to each user session (“session” variables in the BI Server). Furthermore, repository variables can either be constants (“static repository” variables) or can have values that vary over time (“dynamic repository” variables), with the dynamic ones refreshed on a schedule via SQL calls and session variables usually set at the time of logon by reference to an LDAP server or an SQL statement. So now that’s all clear, in what sort of situation would they be used? Here’s a scenario that makes use of all these types of BI Server variable.
We have a reporting requirement where salespeople are responsible for a number of states, and these states change on a constant basis. Some of these states are shared with other salespeople and it’s not really practical to define set groupings of these states, hence we have a lookup table on our database that lists out the states each salesperson is responsible for, together with a marker against one particular state that is their home state. We want to use these lists of states as dynamic filters on the reports our salespeople run, and have reports default to their home state when they view their data in a dashboard. We also want to only show data for the current month in their reports, and hold the state in which our headquarters is based in a constant so that we can highlight the sales that are made near to it. In other words, we’ve got a good example of where variables in various shapes and forms can be used to make this process a whole lot easier to implement.
To start off, I use the Variable Manager to define a new static repository variable that I call HEAD_OFFICE_STATE and set to the value ‘NY’. This initial value, that stays constant for static repository variables, is called the “default initializer” and can either be typed in or set via an expression. The only way this variable value can change is if I go back into the Variable Manager and change it there.
[Image: Variable3]
Next I create the second repository variable, this time a dynamic one, that is used to return the current month name and year to any query that requires it. Unlike static repository variables this variable’s values change over time, using an SQL statement executed to a schedule to update its values.
[Image: Variable4]
As well as having a default initializer, dynamic variables get their values from initialization blocks that provide the mechanism to execute the SQL via a schedule. To define the initialization block I press “New” and enter the SQL, refresh frequency and variable target for the block. Notice in this example that I’m running the SQL against an Oracle database connection pool, and the truncated month and year name is generated through some Oracle function names and a reference to sysdate.
[Image: Variable5]
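As an example of the kind of initialization SQL described above (my own sketch, matching the description of Oracle functions and sysdate rather than the exact statement in the screenshot):

```sql
-- Returns something like 'JUN-2012'; re-executed on the refresh schedule
-- defined in the initialization block.
SELECT TO_CHAR(SYSDATE, 'MON-YYYY') FROM DUAL;
```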
So now I’ve got my two repository variables defined. Before I create my session variable, I first have to locate a table in my database that lists out the selection of states that each salesperson is working with. Notice how there’s sometimes more than one row per salesperson.
[Image: Variable6]
I now move on to creating the session variable. To use the data from the table above to provide the values for my dynamic filter, I first of all make sure that this table is accessible through a connection pool in the BI Server physical layer, and then go back to the Variable Manager to define my session variable. Now if this variable were going to hold a single, scalar value, I could define it as a regular non-system session variable, but as it needs to hold data from more than one database row, I instead define it using an Initialization Block and row-wise variable initialization, which defines one or more variables based on a SELECT statement, dynamically creating the variables as needed and concatenating the results of multiple rows into the variable.
To perform this task I first define the new Initialization Block and call it STATE_SECURITY. I then define my SELECT statement as the following:
select 'STATE_FILTER',state from variable_example.salesperson
where salesperson_name = ':USER'
This will return one or more rows with STATE_FILTER, the new dynamically-created variable name, as the first column, and the set of allowed salesman states as the second value, filtered on the logged-in user ID. My initialization block now looks like this:
[Image: Variable7]
To assign the results of this SELECT statement to a variable, the STATE_FILTER variable, I then click on the “Edit Data Target” button and select Row-Wise Initialization, rather than try and assign the results to individual variables. The initialization block will then take all the rows that have STATE_FILTER as the first column and create a single concatenated, comma-separated list out of the second column, so that I can then apply this value to a filter.
[Image: Variable8]
Now that the session variable definition is complete, I move over to the Security Manager application, create a group for all of my report users and then define a filter for that group against the SALES table, the one I want to restrict access to. The filter references the session variable I just created, using an equality (“=”) operator rather than the “in” operator you’d have expected; this is a requirement for row-wise variables, and OBIEE handles the translation properly in the background.
[Image: Variable9]
Now if one of the affected users logs in and runs a report against that table, the results are filtered down without any intervention on their part.
[Image: Variable10]
Conditionally formatting the State column based on the value each row contains is also fairly straightforward. When you create a conditional format condition you can specify that the value tested against is a variable; to access one of the repository variables you wrap the variable name in biServer.variables[''] so that, in my example, the reference becomes biServer.variables['HEAD_OFFICE_STATE'].
[Image: Variable11]
Displaying the report now shows all instances of “NY” highlighted in red, based on my conditional formatting rule and the value of the HEAD_OFFICE_STATE static repository variable.
[Image: Variable12]
So there you have it. Moving on from here, filtering the report again based on the dynamic repository variable is just a case of referencing the CURRENT_MONTH variable in my filter, and adding another session variable to hold the salesperson’s home state involves creating another initialization block that this time provides a value for a regular (i.e. not row-wise) HOME_STATE session variable.

Monday, June 4, 2012

Hiding Stuff in OBIEE Based on User Profiles


Having seen so many ways on how you can secure your presentation layer based on logged in user profile, I thought this would be a nice place to consolidate the list together and give you step by step guides on these implementations.
The ways of hiding stuff, as I call it, are:
1. Column Level Security
2. Row Level Security
3. Hiding certain sections of  the dashboard
4. Hiding certain tabs of the dashboard
How and when to implement each of these techniques is an interesting question. Let's delve in further.
For demo purposes, I am using the famous free Paint dashboard and creating two users in the RPD.
One user is adminTest, who can see all the pieces of the dashboard and is a member of the Administrator group. The other is executiveTest, who can see only certain pieces of the dashboard, with each piece hidden using one of the scenarios explained above.
Open the RPD and go to Manage –> Security; the Security Manager pops up as below. To add a new user in Security Manager, go to Action –> New –> User and type in the new user name and password.
[Image: SecurityManager]
Column Level Security:
Let’s say you want to hide all dollar amounts in the Paint subject area from the “executiveTest” user. Open up your RPD, go to the presentation layer, choose the “Paint” subject area, then “Sales Measures”, and the columns corresponding to $ as shown below (all selected).
[Image: SelectedMeasures]
Let’s start with the “Dollars” column. Right-click on it, choose Properties, and then click on the “Permissions” button as shown below.
[Image: ColumnPermissions]
When you click “Permissions”, a new window pops up. Check the “All Users” button at the top and uncheck the “Read” access for “Everyone”.
[Image: ColumnSecurity_Checked]
See the picture above. Everyone except the “executiveTest” user and its group has access rights on this column. Click “OK”. Remember that if there is no red cross mark or the checkbox is unchecked, access is disabled.
Let’s repeat the same for all the columns that we want to hide from this user. Now press Ctrl+K to run a consistency check, or press Ctrl+E to open the consistency checker window and click “Check All Objects”. Save the repository once the consistency checker has come up with no errors.
Restart your services once the repository is saved. Log in to the presentation layer as “Administrator” and create an Answers query using the “Paint” subject area and the dollars columns as shown below.
[Image: AdminAccount]
[Image: AnswersScreen]
Save the report. See the results as shown below.
[Image: adminTestAnswersAllColumns]
Now log in using the “adminTest” account and you will see the same report with all columns visible. When you select the “Paint” subject area, you will see all columns in the left pane (including the columns referring to $).
Now log in as “executiveTest” and look at the left pane of the “Paint” subject area. All the columns referring to $ are invisible.
[Image: executiveTestAnswersInvisible]
Also, when you try to open the previously saved report as the “executiveTest” user, Answers throws an error.
[Image: executiveTestAnswersErrors]
Now to fix this error, open the c:\Oracle BI\Server\Config\NQSConfig.ini file.
Look for the word “POPULATE_AGGREGATE_ROLLUP_HITS” inside this file. The default value for this parameter is “NO”. Change that to “YES”, save the file, and restart your services.
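The change itself is a one-line edit; a sketch of the relevant line in NQSConfig.ini (surrounding lines omitted):

```ini
# Was: POPULATE_AGGREGATE_ROLLUP_HITS = NO;
POPULATE_AGGREGATE_ROLLUP_HITS = YES;
```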
Go back, log in as “executiveTest”, and open the same Answers report we saved before. This time, the report does not show any errors.
[Image: executiveTestAnswersVisibleAfterNQConfigChanges]
Also, note that all the $ columns we had before are invisible and the Answers report works perfectly.
Now, let’s see what happens if “executiveTest” is given exclusive access even though the “Executive” group’s access has been disabled, as follows, or vice versa.
[Image: DifferingSecurity1]
[Image: DifferingSecurity2]
In both these cases, the security has no effect and the Answers report works fine, just as if any other user had logged in.
In essence, to implement column-level security, both the user and its group should be restricted from accessing that column. Let’s think about this in terms of a real implementation: all the projects that I have worked on have users as part of some group rather than as individuals.