Oracle BI EE 10.1.3.4 – Calling BI EE WSDL from BI Publisher

Another very important and useful feature of BI Publisher 10.1.3.4 is the support for complex Web Services. Support for Web Services in BI Publisher, I believe, was introduced in 10.1.3.2. But the major problem then was that the support was limited to very simple web services like Yahoo stock quotes. Now, with the support for complex web services, one can achieve more sophisticated integration with BI EE as well as other reporting toolsets. One direct consequence of this support is the ability of BI Publisher to use the BI EE WSDL as a data source. Yes, BI Publisher can now call BI EE WSDLs directly. Until the previous release, the only way to achieve this was by converting the complex WSDL to a simple one through servlets. Let's try this new feature out today by means of a simple report. We shall start with a new report with Web Service as the data source.
     
When we choose Web Service as our data source, we can now see that there are two types of Web Services: Simple and Complex.
     
Let's choose Complex, and in the WSDL URL let's enter the BI EE WSDL.
http://localhost:9704/analytics/saw.dll?WSDL
As soon as we enter the above WSDL, we should be able to choose any one of the web services. Remember, the BI EE WSDL is a multi-binding web service; that is, a single WSDL can provide a combination of multiple web services. For now, let's choose the SAWSessionService.
     
Let's choose the logon method and enter the username and password parameters.
     
Now, let's see what happens when we look at the report output.
     
Strange. I was expecting the web service to return the SessionID, but unfortunately it does not look to be that straightforward. So far, I have not been able to get this to work (even when I try to pass the parameters from BI Publisher parameters using ${Parameter_name}). Now, if you look at the OC4J command window, you will see the actual SOAP envelope that has been fired by BI Publisher.
     
Now, let's copy the above SOAP envelope and fire it from SoapUI to test whether this envelope actually works.
     
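For reference, when the logon call succeeds in SoapUI, the session ID comes back in the response body. The sketch below is only indicative (I am reconstructing the shape of the response; the exact element names and, of course, the session ID value will differ in your environment):

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
    <soap:Body>
        <sawsoap:logonResult xmlns:sawsoap="com.siebel.analytics.web/soap/v5">
            <!-- the sessionID is what a BI Publisher data set would ideally surface -->
            <sawsoap:sessionID>[session id value]</sawsoap:sessionID>
        </sawsoap:logonResult>
    </soap:Body>
</soap:Envelope>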
Well, it does work. So, I am not sure what else is required to get the logon method to work from within BI Publisher. I believe the above behavior is probably due to the fact that BI Publisher first fires the SOAP envelope using dummy parameters and then fires it again with the actual parameters, i.e. your OC4J window would actually show you the two SOAP envelopes being fired from BI Publisher.
08/08/12 20:04:45 WebServiceCall::callComplexClient SOAP Message = <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
    <soap:Body xmlns:ns1="com.siebel.analytics.web/soap/v5">
        <ns1:logon>
            <ns1:name>%LABEL_1%</ns1:name>
            <ns1:password>%LABEL_2%</ns1:password>
        </ns1:logon>
    </soap:Body>
</soap:Envelope>
08/08/12 20:04:45 After WSS soapMessage = <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
    <soap:Body xmlns:ns1="com.siebel.analytics.web/soap/v5">
        <ns1:logon>
            <ns1:name>Administrator</ns1:name>
            <ns1:password>Administrator</ns1:password>
        </ns1:logon>
    </soap:Body>
</soap:Envelope>
The authentication failure message that we obtained is probably due to the first envelope. I am not sure what more is required to get this to work. A very interesting feature, but something I have not been able to test successfully so far (at least with BI EE alone).

Oracle BI EE 10.1.3.4 – Switching between Production and Development Data Sources using same Reports and Dashboards

I was recently involved in a client project wherein the client was implementing BI EE on top of an Oracle database. Since they were on a shoestring budget, they could not afford a couple of servers for BI EE development and production. Though not an ideal situation, they did have two instances of the database (one for development and the other for production). These databases were populated using Informatica from an external source. Since the client had only one instance of BI EE, they wanted to know whether it was possible to switch between the production and development databases by means of a single prompt (for administrators alone, to test the data validity across the database instances). Though an unusual request, it made sense due to the frequency of development happening on the dev database. Also, they wanted a capability wherein one set of users would be looking at a set of reports against the dev database while another set of users would be running the same reports against the production database. The diagram below provides the details of what is required.
        
Let's look at an approach today with which we can achieve the above requirement. In effect, our aim is to have a prompt across all the reports which will switch between the development and production databases (for administrators). For demonstration purposes, I will be using the Global schema. Let's consider the Global schema as the actual development database. Also, let's consider another schema called GLOBAL_PROD which holds the production database tables.
        
As you can see, we have the same set of tables in both the prod and dev databases. Typically, any administrator would like to run the reports on both dev and prod to determine any possible issues. So, let's start with designing the BMM layer for these two schemas. The idea is to create a single BMM layer which will fetch data from either prod or dev depending on the fragmentation set in the repository. To start with, let's create a simple select view in both Dev and Prod to return the actual values that we will be choosing in the prompts for Dev and Prod.
SELECT 'Dev' FROM DUAL
SELECT 'Prod' FROM DUAL
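One small note: when these queries are brought in as select (opaque) views in the physical layer, giving the constant an explicit alias makes it easier to map a named physical column to the logical Source column. A minimal variant (the alias name SOURCE is just an assumption on my part):

-- select view in the Dev connection pool
SELECT 'Dev' AS SOURCE FROM DUAL
-- select view in the Prod connection pool
SELECT 'Prod' AS SOURCE FROM DUAL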
        
Now, for each of these new sources, create a complex join with the fact tables as shown below.
        
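Since each select view returns just one constant row, the complex join does not need a real key; a dummy expression along the following lines is typically sufficient (this is my assumption of the join condition, not necessarily the exact expression used in the screenshot above):

-- physical complex join expression between the select view and the fact table
1 = 1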
Now, in the BMM that we are building, add a custom dimension with a single column called Source, using both the above custom select views as Logical Table Sources. Also, set the fragmentation in such a way that one LTS is chosen when Dev is selected in the prompt and the other LTS is chosen when Prod is selected, as shown below.
        
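As a sketch, the fragmentation content on the two Logical Table Sources would read something like this (the business model name Global is an assumption; substitute your own qualification):

-- fragmentation content on the Dev LTS
"Global"."Source"."Source" = 'Dev'
-- fragmentation content on the Prod LTS
"Global"."Source"."Source" = 'Prod'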
Now, for all the dimensions and the fact tables, include both the Prod and Dev sources. Ensure that fragmentation is set for each of them to Prod or Dev depending on the LTS. For example, the fact table's Dev LTS is shown below.
        
I have not shown the dimension build and other aspects of the RPD BMM, as everything else remains the same. Instead of one LTS for each dimension, use two: one for prod and the other for dev. Of course, as your RPD becomes complex, adding more and more LTSs will only make the RPD grow bigger. That is something one has to consider. But this approach offers very good flexibility and can be adopted by anyone who wants to implement a similar requirement.
Now let's log into Answers and check how this works. Create a very simple report as shown below and set the dimension's Source column to Is Prompted in your report. Also create a prompt to show the Source values of Prod and Dev. Include the report and the prompt in the dashboard. Let's choose Dev first and run the report.
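For reference, once the prompt passes Dev, the logical SQL behind such a report would look roughly like the following (the subject area and column names here are purely illustrative):

SELECT "Source"."Source" saw_0, "Products"."Product" saw_1, "Sales Facts"."Units" saw_2
FROM "Global"
WHERE "Source"."Source" = 'Dev'
ORDER BY saw_0, saw_1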
        
If you look at the query, you will notice that the Dev database is used for generating the above report.
        
The same would be the case for Prod as well. But one drawback is that if we do not include this filter, we get a UNION ALL across both the LTSs, as shown below.
        
        
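To make the duplication clearer, the physical SQL in that case contains one branch per LTS, broadly along the lines of the sketch below (the schema and table names are only illustrative; the SQL the BI Server actually generates will differ):

-- one branch hits the dev schema, the other the prod schema
SELECT 'Dev' AS source, p.product, SUM(f.units) AS units
FROM global.products p, global.units_fact f
WHERE p.prod_key = f.prod_key
GROUP BY p.product
UNION ALL
SELECT 'Prod' AS source, p.product, SUM(f.units) AS units
FROM global_prod.products p, global_prod.units_fact f
WHERE p.prod_key = f.prod_key
GROUP BY p.product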
One cannot expect every user to create a prompt like the one above in each and every report. So, in order to avoid the duplicate-rows issue and to ensure that a group of users automatically has a Dev or Prod filter assigned to them, just add a custom security group and add the filter shown below to each and every logical table in the BMM. One security group would have a filter on Dev and the other on Prod. So, any time a user from one of these groups creates a report, the Dev or Prod filter is automatically applied, and this ensures that the fragmentation happens properly.
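A minimal sketch of such a group filter, as it would be applied on each logical table for the Dev security group (the Prod group gets the same expression with 'Prod'); the qualification again assumes the Source column built earlier:

"Global"."Source"."Source" = 'Dev'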
        
        

Renaming Dashboards in OBIEE

There are two ways to rename shared dashboards:

  1. Using Catalog Manager
  2. From the Answers admin tab

Using Catalog Manager:

  1. Log in to Catalog Manager, navigate to the shared folders, and select the group folder under which you created the dashboard.
  2. Navigate until you find the _portal folder under the shared folders.
  3. Select the dashboard that you want to rename in the right-hand pane, right-click and select the Smart Rename (fix refs) option.
  4. Here you give the desired name for the selected dashboard.

I am renaming it to Rename Test.

  5. Press Enter to submit the new name to the server.
  6. Now log in to Presentation Services and go to Dashboards. You will see that your dashboard name has been changed.

Renaming a dashboard from the Answers admin tab:

  1. Log in to Presentation Services as Administrator and go to Settings > Administration > Manage Presentation Catalog.
  2. This opens a window in which you will find the option Show Hidden Items. Select this check box.
  3. Navigate to the Shared Folders and the group folder under which you created the dashboard.
  4. Select the _portal folder (this appears only when Show Hidden Items is selected).
  5. It displays the list of dashboards under this folder. Now select the Rename option corresponding to the dashboard that you want to rename.
  6. Here you enter the desired name and click Update.
  7. Now go to Dashboards and check whether the name has been changed.

Setting LOGLEVEL from Answers in OBIEE 11g

Normally, we check the query in the NQQuery.log file; we can also check the query directly from Answers.

The difference is that NQQuery.log contains all the queries, and you need to identify them by the time the report was run.

If you check directly from Answers (go to Settings > Administration > Manage Sessions), the log is available for every report run until the query is changed.

For every report query one cursor is created; by accessing the cursor you can see the physical query generated by the BI Server.

If we are unable to see the log, or we find an error message saying "No Log Found" while trying to view the query, the first thing to check is the LOGLEVEL. When an RPD is created, the log level defaults to 0 for the Administrator user.

If it is set to 0, we change it to 2 or 3 accordingly, so that the query appears in the log.

Suppose you are accessing the Presentation Services of another system and are unable to see the query. You do not need to set LOGLEVEL=2 in the RPD on that other system.

I am assuming that all cursors are closed and that LOGLEVEL is zero for the particular user.

Log into Answers with a user who has LOGLEVEL=0, run any report, then go to Settings > Administration > Manage Sessions and click on the View Log option.

You will get an error message, or you may be able to open the view log but not find the query run at this moment, because of log level 0. (To test this error message, you can delete the contents of NQQuery.log completely; before doing this, stop the BI Server service, clear the contents and save the file, and then start the BI Server service again.)

Close the window as well as the cursors.

Now go to the Advanced tab of the particular report.

Scroll down to see the Prefix option.

Here, write SET VARIABLE LOGLEVEL=2; (the semicolon at the end is a must).

 

Now click on the Results tab to re-run the report.

Now go to Settings > Administration > Manage Sessions.

Click on View Log to view the query. Now you are able to see the query, logged according to LOGLEVEL 2.

Go back to the Advanced tab and observe the logical query.

Before the SQL, you will find the text you wrote in the Prefix field.

Because of the semicolon, the BI Server executes these statements one after another.
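As a rough illustration, the logical query shown on the Advanced tab would then start with the prefix, something like the following (the subject area and columns are made up for the example):

SET VARIABLE LOGLEVEL=2;
SELECT "Time"."Year" saw_0, "Facts"."Revenue" saw_1
FROM "Sample Sales"
ORDER BY saw_0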

This post explains how to set the log level from Answers itself, without going to the RPD.

This process overrides the log level set in the RPD.