Searching for Doc Id 1428008.1 on MOS

If you are searching online for information about multiple Authentication Providers in Oracle BI 11g, you could find references (even on my own blog) to Doc Id 1428008.1 on My Oracle Support (MOS). Via this note you were able to obtain the following: TechNote_LDAP_Auth_DB_Groups_V3.pdf.

If you try to find this note now, you won’t be able to; it seems the note has been closed. You should be able to find the information you are looking for here (Configuring LDAP as the Authentication Provider and Storing Groups In a Database).


Set up https (SSL) for Weblogic and OBIEE

I have been blogging about the integration between Oracle eBS and Oracle BI EE. Apart from the integration, there are a few assumptions:

  • Oracle eBS is installed
  • Oracle BI is installed
  • Oracle eBS and Oracle BI are compatible with each other (http vs. https)
  • All necessary Oracle eBS patches are installed
    • R11: check whether they are installed
    • R12: included by default
  • The Web Browser should be able to accept cookies
  • The ICX session cookie name is case-sensitive
  • Oracle eBS and Oracle BI should be installed into the same domain (machine1.domain.ext = machine2.domain.ext)

At one of our clients we were confronted with the fact that in a new environment, Oracle eBS runs on https while our Oracle BI environment was still http. This conflicts with one of the assumptions above.

This blog entry, by guest authors Menno Harzing and Rob Chou, is inspired by Debashis Paul. We have added cluster configuration and changed the numbering.

These steps protect your data transport from/to OBIEE over the internet.
There are two parts to accomplish this:

Part One – Configuration under Weblogic Console
Part Two – Configuration under OFMW Enterprise Manager

Part One – Configuration under Weblogic Console

  1. Login to Weblogic Administration Console.
  2. Click on Environment -> Servers -> AdminServer (admin) -> General tab
  3. Click Lock and Edit from the left pane.
  4. Check ‘SSL Listen Port Enabled’ and set the SSL Listen Port, e.g. to 7022
    (this is not the default SSL port, so please check yours and modify accordingly)
    This ensures that you will be able to access the URL via https:// on port 7022
  5. Also check ‘Listen Port Enabled’ if you want to keep accessing the BI URL via http://
  6. Save the configuration
    The resulting configuration is written to /u01/app/oracle/product/fmw/user_projects/domains/<DOMAIN_NAME>/config/config.xml
  7. Activate the changes from left pane
  8. Change BIEE_MANAGER_URL and ADMIN_URL in the relevant configuration files
    from t3://…:PORT (e.g. 7001) to https://…:SSL-PORT (e.g. 7002)
  9. Restart the Weblogic Servers (Admin/Managed) and the BI Server components
  10. Accept the exception in the browser when prompted and continue accessing the BI URL over the secure HTTPS protocol. (Note that once this has been made https://, you have to access the OFMW EM Control page and the Weblogic Console page via https:// as well going forward.)
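
Once the changes are activated, the SSL settings from step 4 should show up in the config.xml mentioned in step 6. As a quick sanity check you can parse that file; a minimal sketch, assuming the standard WebLogic domain XML namespace (the helper name and sample snippet are illustrative):

```python
# Sketch: check which servers in config.xml have SSL enabled, and on which
# port. The namespace and sample snippet are illustrative; adjust them to
# your own domain's config.xml.
import xml.etree.ElementTree as ET

NS = {"d": "http://xmlns.oracle.com/weblogic/domain"}

def ssl_listen_ports(config_xml: str) -> dict:
    """Return {server_name: ssl_port} for every server with SSL enabled."""
    root = ET.fromstring(config_xml)
    ports = {}
    for server in root.findall("d:server", NS):
        name = server.findtext("d:name", default="", namespaces=NS)
        ssl = server.find("d:ssl", NS)
        if ssl is not None and ssl.findtext(
                "d:enabled", default="false", namespaces=NS) == "true":
            ports[name] = int(ssl.findtext(
                "d:listen-port", default="7002", namespaces=NS))
    return ports

SAMPLE = """<domain xmlns="http://xmlns.oracle.com/weblogic/domain">
  <server>
    <name>AdminServer</name>
    <ssl><enabled>true</enabled><listen-port>7022</listen-port></ssl>
  </server>
</domain>"""

print(ssl_listen_ports(SAMPLE))  # {'AdminServer': 7022}
```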

Part Two – Configuration under OFMW Enterprise Manager

  11. Navigate to “<OFMW Home>/user_projects/domains/bifoundation_domain/bin” and take a backup of startManagedWebLogic.sh (startManagedWebLogic.cmd on Windows)
  12. Edit it and locate the line that points the JVM trust store at the JDK cacerts (on one line):
    JAVA_OPTIONS="-Djavax.net.ssl.trustStore=/u01/app/oracle/product/fmw/wlserver_10.3/server/lib/cacerts ${JAVA_OPTIONS}"
  13. Replace the trust store path with DemoTrust.jks (kindly note that you have to change the OFMW Home path as applicable to your environment):
    JAVA_OPTIONS="-Djavax.net.ssl.trustStore=/u01/app/oracle/product/fmw/wlserver_10.3/server/lib/DemoTrust.jks ${JAVA_OPTIONS}"
  14. Restart all the services of Weblogic (Admin/Managed/opmnctl/Node Manager/Process Manager)
  15. Log in to OFMW Enterprise Manager. In the next steps, SSL will be configured across all BI components via the System MBean Browser.
  16. Open System MBean Browser
  17. Invoke the ‘lock’ operation of the BIDomain MBean.
  18. Now we have to generate the certificates required as a prerequisite for enabling SSL,
    using the specified passphrase to protect both certificate stores and private keys.
    This enables internal https calls to the web server.
    The certificate type (pem or der) must be explicitly stated.
    Navigate to oracle.biee.admin –> bifoundation_domain –> BIDomain.BIInstance.SecurityConfiguration
    and click on the BIDomain.BIInstance.SecurityConfiguration MBean.
    On the Operations tab, click “generateSSLCertificates”.
  19. Enter the requested details. In my case:
    Passphrase: <change_password>
    webServerCACertificatePath: /wlserver_10.3/server/lib/CertGenCA.der
    certificateEncoding: der
  20. Now click on Invoke
  21. Return to the BIDomain MBean from step 17
  22. Click on simpleCommit (two items below lock).
  23. Repeat step 17 to lock
  24. Enable SSL for BI_SERVER1 on Weblogic Console (the same way as part 1, step 5)
  25. Perform step 22 (simpleCommit).
  26. Restart all the services of Weblogic (Admin/Managed/opmnctl/Node Manager/Process Manager)
  27. Go to Domain Structure – Environment – Clusters
  28. Click on Lock & Edit in top left pane
  29. Enable “Secure Replication Enabled” for the cluster
  30. Click on Save at top or bottom
  31. Click on Activate Changes in top left pane
  32. Repeat step 17 to lock
  33. Click on the Attributes tab of the MBean from step 18.
    Click on ‘SSLEnabled’.
    Change the value to True.
    Click on Apply.
  34. Perform step 22 (simpleCommit).
  35. Restart all the services of Weblogic (Admin/Managed/opmnctl/Node Manager/Process Manager)
  36. Return to the MBean from step 18, click on “runSSLReport” and
    invoke it; verify the output to ensure correct SSL communication across all BI components:
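
The edit in steps 12 and 13 can also be scripted. A minimal sketch, assuming the JAVA_OPTIONS line sets the JVM trust store via -Djavax.net.ssl.trustStore; verify the exact line in your own startManagedWebLogic script before editing:

```python
# Sketch of steps 12-13: point the JVM trust store from the JDK cacerts to
# WebLogic's DemoTrust.jks. The flag (-Djavax.net.ssl.trustStore) and paths
# are assumptions; check the actual line in your script first.
WL_LIB = "/u01/app/oracle/product/fmw/wlserver_10.3/server/lib"

def point_to_demo_trust(script_text: str) -> str:
    """Rewrite the trust store path inside the start script's text."""
    return script_text.replace(f"{WL_LIB}/cacerts", f"{WL_LIB}/DemoTrust.jks")

before = ('JAVA_OPTIONS="-Djavax.net.ssl.trustStore='
          f'{WL_LIB}/cacerts ${{JAVA_OPTIONS}}"')
after = point_to_demo_trust(before)
print(after)
```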

Thanks Menno Harzing and Rob Chou for this blogpost.

Big Data and Analytics Top Ten Trends for 2014

Oracle recently published their view on the Top Ten Trends in “Big Data” & “Analytics” for 2014. Find the list below:

1. Business Users Get Hooked on Mobile Analytics –> Oracle Business Intelligence Mobile App Designer

2. Analytics Take to the Cloud –> Oracle Applications Cloud

3. Hadoop-Based Data Reservoirs Unite with Data Warehouses –> Your Data Warehouse and Hadoop – Better Together Featuring Cloudera Webcast

4. New Skills Bolster Big Data Investments –> In search of Insight and Foresight – Getting more out of Big Data

5. Big Data Discovery is the Secret to Workforce Success for HCM –> Harnessing the Power of Employee Sentiment

6. Predictive Analytics Lend Fresh Insight into Big Data Strategies –> Oracle Advanced Analytics Option

7. Predictive Analytics Bring New Insight to Old Business Processes –> Big Data @ work - Real-Time Fraud Detection and Prevention

  • White Paper Big Data Analytics: Advanced Analytics in Oracle Database
  • White Paper Bringing R to the Enterprise
  • White Paper Oracle Engineered Systems – Engineered for Extreme Performance

8. Decision Optimization Technologies Enhance Human Intuition –> Oracle Real-Time Decisions Review by James Taylor

9. Business Leaders Embrace Packaged Analytics –> Packaged Analytic Applications: Accelerating Time and Value

10. New Skills Launch New Horizons of Analysis –> Oracle Training

If you are interested in Succeeding with Enterprise Performance Management in 2014, please check here.

Check it out yourself

Check here for the Pre-Built Developer VMs (for Oracle VM VirtualBox). Pay special attention to the ‘SampleApp V309’ (Oracle BI Foundation) and the ‘BigDataLite 2.5’. If you are interested in Endeca, you might want to download an Endeca Virtual Machine here.

There are also some hosted environments online for Oracle BI (prodney / Admin123) and Endeca ( / Admin123)

If you would like some guidance, you might want to check out the Oracle Learning Library (OLL).

Global Currencies in Oracle BIA – The Sequel

I have blogged about Global Currencies in Oracle BIA recently. Global Currencies are used to report in different Currencies like USD, GBP and EUR. If you use the Global Currencies to report the same Currency (e.g. EUR) against different Exchange Rate Types, you might end up with a challenge, because the Out-of-the-Box Oracle BIA ETL expects different Currency Codes.

If you take e.g. the ‘MPLT_CURCY_CONVERSION_RATES_ToGlobalCurrenciesOnly’-Mapplet, you will notice the ‘DOC_TO_GLOBAL2_EXCH_RATE_OUT’-Port. This Port is based on the ‘DOC_TO_GLOBAL2_EXCH_RATE_VAR’-Port.

Informatica - Mapplet - MPLT_CURCY_CONVERSION_RATES_ToGlobalCurrenciesOnly - DOC_TO_GLOBAL2_EXCH_RATE_VAR

If you further analyze the expression in the ‘DOC_TO_GLOBAL2_EXCH_RATE_VAR’-Port, you can see that it’s made up of a few different checks.



2. If previous condition(s) not satisfied: if GLOBAL2_CURR_CODE = GLOBAL1_CURR_CODE, then populate GLOBAL2_EXCH_RATE = GLOBAL1_EXCH_RATE

3. If previous condition(s) not satisfied: if GLOBAL2_CURR_CODE = DOC_CURR_CODE, then populate GLOBAL2_EXCH_RATE = 1.0

4. If previous condition(s) not satisfied: if DOC_CURR_CODE = ‘STAT’, then populate GLOBAL2_EXCH_RATE = 1.0

5. If previous condition(s) not satisfied: if GLOBAL2_CURR_CODE = LOC_CURR_CODE, then populate GLOBAL2_EXCH_RATE = DOC_TO_LOC_EXCHANGE_RATE_VAR; else perform a lookup on W_EXCH_RATE_G with the condition (DOC_CURR_CODE, GLOBAL2_CURR_CODE, EXCH_DT, GLOBAL2_RATE_TYPE_VAR, DATASOURCE_NUM_ID). The output of this Lookup is used to populate GLOBAL2_EXCH_RATE.

In the case of identical Global Currencies (e.g. EUR), the execution of the above expression stops at condition 2 and populates the GLOBAL2_EXCH_RATE-column with the value of GLOBAL1_EXCH_RATE. Although this behavior seems logical from an Oracle BIA perspective, it might not be satisfying, as you cannot compare GLOBAL1_EXCH_RATE with GLOBAL2_EXCH_RATE.
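
The cascade above can be sketched roughly in Python; the W_EXCH_RATE_G lookup is mimicked with a plain dict, and all names and sample values are illustrative, not actual Informatica syntax:

```python
# Rough sketch of conditions 2-5 of the DOC_TO_GLOBAL2_EXCH_RATE_VAR
# expression. Names mirror the ports; the lookup is a dict keyed like the
# lookup condition. Illustration only, not the OOTB implementation.
def global2_exch_rate(doc_curr, loc_curr, global1_curr, global2_curr,
                      global1_rate, doc_to_loc_rate, rate_lookup,
                      exch_dt, global2_rate_type, datasource_num_id):
    if global2_curr == global1_curr:        # condition 2
        return global1_rate
    if global2_curr == doc_curr:            # condition 3
        return 1.0
    if doc_curr == "STAT":                  # condition 4
        return 1.0
    if global2_curr == loc_curr:            # condition 5
        return doc_to_loc_rate
    # else: lookup on W_EXCH_RATE_G
    return rate_lookup.get(
        (doc_curr, global2_curr, exch_dt, global2_rate_type,
         datasource_num_id))

# With GLOBAL1 and GLOBAL2 both set to EUR, condition 2 short-circuits:
rate = global2_exch_rate("GBP", "GBP", "EUR", "EUR",
                         1.17, 1.0, {}, "2014-01-31", "Corporate", 4)
print(rate)  # 1.17 (the GLOBAL1 rate is copied, never a GLOBAL2 lookup)
```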

In this case you will end up with a customization of the ‘MPLT_CURCY_CONVERSION_RATES_ToGlobalCurrenciesOnly’-Mapplet. The easiest way is to copy the logic in the ‘DOC_TO_GLOBAL1_EXCH_RATE_VAR’-Port and apply it to the ‘DOC_TO_GLOBAL2_EXCH_RATE_VAR’-Port. Of course you have to replace the references to ‘GLOBAL1’ with ‘GLOBAL2’.



Global Currencies in Oracle BIA

If you read the Oracle Documentation for Oracle BIA, you will find the following: “Currency conversions are required because your business might have transactions involving multiple currencies. To create a meaningful report, you have to use a common currency.”

Oracle BIA stores amounts in the following currencies:

- Document Currency –> Currency of the actual Transaction (Can vary in a multinational organization)
- Local Currency –> Currency defined in the Ledger
- Global Currency –> Defined in the DAC

You can use a Global Currency if you want to report all the different Currencies in one Global Currency (e.g. ‘EUR’ or ‘USD’). Oracle BIA for Oracle eBS is able to report in three different Global Currencies. If you use Oracle BIA for CRM, you are able to store an additional two Global Currencies. You define the Global Currencies in the DAC.


You will have to define a Global Rate Type for each Global Currency as well.

- $$GLOBAL1_RATE_TYPE (for the first global currency)
- $$GLOBAL2_RATE_TYPE (for the second global currency)
- $$GLOBAL3_RATE_TYPE (for the third global currency)

DAC - Currency - Rate - Parameters

You would use the above DAC Parameters for a Document Currency to Global Currency Conversion ($$GLOBALn_CURR_CODE & $$GLOBALn_RATE_TYPE). The $$DEFAULT_LOC_RATE_TYPE-Parameter is used for the Document Currency to Local Currency Conversion.

To get a better understanding of how these Conversions work, you should try to understand the ‘MPLT_CURCY_CONVERSION_RATES’-Mapplet. The purpose of this Mapplet is to “Find the Conversion Rate for a given Currency Code, Rate Type and Exchange Date.” You should find this Mapplet in the Out-of-the-Box SILOS-Folder.

Informatica - Mapplet - MPLT_CURCY_CONVERSION_RATES - Diagram

In the ‘MPLT_CURCY_CONVERSION_RATES’-Mapplet you will find an Expression: ‘EXPT_CALC_EXCH_RATES’. This ‘EXPT_CALC_EXCH_RATES’-Expression is responsible for the actual Conversion (Calculation).


There are a few other Mapplets with similar names and functionality compared to the ‘MPLT_CURCY_CONVERSION_RATES’-Mapplet.

The Conversion for e.g. the GLOBAL1_EXCHANGE_RATE basically consists of three components:

- Input from a Mapping (e.g. SIL_APInvoiceDistributionFact - W_AP_INV_DIST_F)
- Lookup to the W_GLOBAL_CURR_G-Table
- Lookup to the W_EXCH_RATE_G-Table

Informatica - Mapplet - MPLT_CURCY_CONVERSION_RATES - EXPT_CALC_EXCH_RATES - Calculate Rate

Input from a Mapping

The ‘MPLT_CURCY_CONVERSION_RATES’-Mapplet is part of a Mapping. Let’s take the ‘SIL_APInvoiceDistributionFact’-Mapping which populates the ‘W_AP_INV_DIST_F’-Table as an example.

Informatica Mapping - SIL_APInvoiceDistributionFact

Lookup to the W_GLOBAL_CURR_G-Table

The ‘SILGlobalCurrencyGeneral_Update’-Mapping in the ‘SILOS’-Folder stores the values of the $$GLOBALn_CURR_CODE- and $$GLOBALn_RATE_TYPE-DAC-Parameters in the W_GLOBAL_CURR_G-Table.

W_GLOBAL_CURR_G - Currency - Rate

As you can see, the values in the W_GLOBAL_CURR_G-Table match the values of the DAC-Parameters. Of course you can set up these Parameters any way you like.

Lookup to the W_EXCH_RATE_G-Table

The W_EXCH_RATE_G-Table resembles the GL_DAILY_RATES-Table in Oracle eBS. If the contents of this Table are incomplete or incorrect, the whole setup of the different Currencies is useless. The ‘LKP_W_EXCH_RATE’-Lookup is used to find the Conversion Rate for a given Currency Code (‘From’ and ‘To’) and Exchange Date.
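
The ‘LKP_W_EXCH_RATE’-Lookup behaves roughly like a keyed lookup on (From Currency, To Currency, Rate Type, Exchange Date). A minimal sketch with made-up rates; when a combination is missing from the table, the lookup returns nothing and the target rate column stays empty:

```python
# Sketch of the LKP_W_EXCH_RATE lookup: resolve a conversion rate for a
# given From/To currency, rate type and exchange date. Rates are made up.
w_exch_rate_g = {
    ("GBP", "EUR", "Corporate", "2014-01-31"): 1.21,
    ("USD", "EUR", "Corporate", "2014-01-31"): 0.74,
}

def lkp_w_exch_rate(from_curr, to_curr, rate_type, exch_dt):
    if from_curr == to_curr:
        return 1.0  # no conversion needed
    # Missing combinations return None, i.e. an empty rate column.
    return w_exch_rate_g.get((from_curr, to_curr, rate_type, exch_dt))

print(lkp_w_exch_rate("GBP", "EUR", "Corporate", "2014-01-31"))  # 1.21
print(lkp_w_exch_rate("CHF", "EUR", "Corporate", "2014-01-31"))  # None
```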

Informatica - Mapplet - MPLT_CURCY_CONVERSION_RATES - LKP_W_EXCH_RATE - Complete


If you query the ‘W_AP_INV_DIST_F’-Table and you find the ‘GLOBAL1_EXCHANGE_RATE’-Column empty for a certain Transaction, the above should help you find out the reason.

Good Luck.

Oracle BIA – Making the Full / Incremental Load work

It’s possible to configure either a Full or an Incremental Load in Oracle BIA. If you look at the Informatica version of Oracle BIA, there are a few areas you will have to configure.

First you start with the Informatica Mapping. There will be just one Mapping; it does not matter whether you run it Full or Incremental.

Let’s take the ‘SDE_ORA_GLJournals’-Mapping as an example. In the Source Qualifier of the Mapping (or Mapplet), you will see a reference to the $$LAST_EXTRACT_DATE. If you run the Mapping with these settings, you will run an Incremental Mapping. This means that you only select the data which has been created / updated since the last ETL run.

Informatica - Source Qualifier - $$LAST_EXTRACT_DATE

The $$LAST_EXTRACT_DATE is a Parameter which you configure in the Datawarehouse Administration Console (DAC) and reference in Informatica.


In the DAC, the $$LAST_EXTRACT_DATE is based on @DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP. According to the Oracle documentation, it “Returns the minimum of the task’s primary or auxiliary source tables last refresh timestamp, minus the prune minutes.”
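
Conceptually, that computation looks like this; a sketch with made-up timestamps:

```python
# Sketch of @DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP as described above:
# the minimum of the source tables' last refresh timestamps, minus the
# prune minutes. Timestamps and prune value are made up.
from datetime import datetime, timedelta

def pruned_refresh_timestamp(refresh_timestamps, prune_minutes):
    return min(refresh_timestamps) - timedelta(minutes=prune_minutes)

stamps = [datetime(2014, 1, 31, 2, 0), datetime(2014, 1, 31, 3, 30)]
print(pruned_refresh_timestamp(stamps, 30))  # 2014-01-31 01:30:00
```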

Make sure this Parameter is available in both the DAC (see above) as well as in the Mapping (or Mapplet).

Informatica - Variables and Parameters - $$LAST_EXTRACT_DATE

This way the Parameter can be used in the Extraction Mapping. If you reference a Parameter in the Extraction Mapping Query which isn’t declared, the Workflow will return an error and won’t complete.

So the steps are easy;

1. Declare the $$LAST_EXTRACT_DATE-Parameter in the DAC
2. Declare the $$LAST_EXTRACT_DATE-Parameter in Informatica
3. Reference the $$LAST_EXTRACT_DATE-Parameter in the Source Qualifier
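
Once declared, the DAC substitutes the parameter’s value into the extraction query at runtime. A rough sketch of that substitution; the query text is illustrative, not the actual Out-of-the-Box SQL:

```python
# Sketch: how a declared DAC parameter ends up in the extraction query.
# The template below is illustrative; the real Source Qualifier SQL differs.
TEMPLATE = ("SELECT * FROM GL_JE_LINES "
            "WHERE LAST_UPDATE_DATE > TO_DATE('$$LAST_EXTRACT_DATE', "
            "'MM/DD/YYYY HH24:MI:SS')")

def resolve(query: str, params: dict) -> str:
    """Replace each $$PARAMETER placeholder with its runtime value."""
    for name, value in params.items():
        query = query.replace(name, value)
    return query

sql = resolve(TEMPLATE, {"$$LAST_EXTRACT_DATE": "01/31/2014 00:00:00"})
print(sql)
```

If a placeholder in the query has no matching declared parameter, it survives the substitution untouched, which is exactly why an undeclared parameter makes the Workflow fail.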

As I said before, the same Mapping is used for the Incremental as well as the Full Load. If you want to run the two different loads, make sure there are two different Workflows which run the same Mapping. The difference is in the parameters of the Workflow: the Full Workflow uses the $$INITIAL_EXTRACT_DATE, whereas the Incremental Workflow uses the $$LAST_EXTRACT_DATE.

Informatica - Workflow - SDE_ORA_GLJournals

If you edit the task which belongs to the Incremental-Workflow (‘SDE_ORA_GLJournals’), you will find the Source Qualifier with the extraction query and a reference to the $$LAST_EXTRACT_DATE-Parameter.

As you can see, the LAST_UPDATE_DATE is compared to the $$LAST_EXTRACT_DATE-Parameter.

After each ETL run, the LAST_EXTRACT_DATE (Refresh Date) per table is stored. You can check, update or delete these values as required (see picture below). If you decide to delete the Refresh Date, a Full Load will be performed the next time.

DAC - Refresh Dates
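
The decision between the two loads can be summarized in a small sketch; the workflow names are taken from this example and the logic is simplified:

```python
# Sketch of the DAC's decision: no recorded Refresh Date means the Full
# workflow runs with $$INITIAL_EXTRACT_DATE; otherwise the incremental
# workflow runs with $$LAST_EXTRACT_DATE. Names/dates are illustrative.
def pick_workflow(refresh_date, initial_extract_date):
    if refresh_date is None:  # Refresh Date deleted -> Full Load
        return ("SDE_ORA_GLJournals_Full", "$$INITIAL_EXTRACT_DATE",
                initial_extract_date)
    return ("SDE_ORA_GLJournals", "$$LAST_EXTRACT_DATE", refresh_date)

print(pick_workflow(None, "01/01/1970"))
print(pick_workflow("01/31/2014 02:00:00", "01/01/1970"))
```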

As stated earlier, the Full Workflow is almost identical. The only difference is the reference to the $$INITIAL_EXTRACT_DATE. The $$INITIAL_EXTRACT_DATE-Parameter is defined in the DAC. You define a date in the past; just make sure that this date captures all the data you need.


Make sure this Parameter is available in both the DAC (see above) as well as in the Mapping (or Mapplet).

Informatica - Variables and Parameters - $$INITIAL_EXTRACT_DATE

This way the Parameter can be used in the Extraction Mapping. If you reference a parameter in the Extraction Mapping Query which isn’t declared, the Workflow will return an error and won’t complete.

How do you make sure that the $$INITIAL_EXTRACT_DATE-Parameter will be used when running a Full-Load?

Informatica - Workflow - SDE_ORA_GLJournals_Full

If you edit the task which belongs to the Full Workflow (‘SDE_ORA_GLJournals_Full’), you will find the Source Qualifier with the extraction query and a reference to the $$INITIAL_EXTRACT_DATE-Parameter.

As you can see, the LAST_UPDATE_DATE is compared to the $$INITIAL_EXTRACT_DATE-Parameter.

At this point everything is in place to either run a Full-, or an Incremental Load.

Informatica - Workflows

You just have to tell the DAC to run either the ‘SDE_ORA_GLJournals_Full’-Workflow (full) or the ‘SDE_ORA_GLJournals’-Workflow (incremental).

DAC - Task - SDE_ORA_GL_Journals

Check the Informatica Session Log when the ETL has a different result than expected. It could be that the Workflows are incorrectly defined. The Session Log shows which Parameter is used and what its value is.

Good Luck.

Learn a lot about Oracle BI

I recently read a very interesting blogpost from Cary Millsap. Everybody who wants to do a good job should find some time to read it. I think his recommendations do not only apply to the Oracle database; if you want to succeed in the field of Oracle BI, you could follow the same recommendations.

Educate yourself 

Read, study, experiment. There is a lot to read about Oracle BI, both online and offline. Check the Addidici Blogger Database powered by Adrian Ward for a list of Oracle BI related blogs. I maintain a list on Twitter with Oracle BI related profiles. If you would like some offline material, check the following books:

Oracle is very open about their software. There is a lot of software which you can download, install and evaluate. You can check OTN or the ‘Oracle Software Delivery Cloud’ (eDelivery). Oracle has made it very easy for you, because you can download completely installed and configured Virtual Machines. Check here for the Pre-Built Developer VMs (for Oracle VM VirtualBox). Pay special attention to the ‘SampleApp V309’ (Oracle BI Foundation) and the ‘BigDataLite 2.4.1’. If you are interested in Endeca, you might want to download an Endeca Virtual Machine here. The ORA Training Blog provides Virtual Machines as well.

There are also some hosted environments online for Oracle BI (prodney / Admin123) and Endeca ( / Admin123)

If you would like some guidance, you might want to check out the Oracle Learning Library (OLL).


The (online) Oracle BI Community is very active. There are various (online) forums where you can meet and exchange knowledge. The easiest way to start building your network is online. Just start by posting and/or answering questions in the various online forums. I used to frequent the Oracle forums on OTN and Oracle Support Communities. These days I would rather go to; OBIEE Enterprise Methodology Group. There are also various groups on LinkedIn, which you can visit.

You could also try to start a blog e.g. with WordPress, like this one. Or drop your presentations on Slideshare.

For me it’s better to see people face to face. If you want to meet like-minded people in person, you should try to visit seminars and forums. My favorite is the yearly RittmanMead BI Forum. Check here for the 2014 edition.

Immerse yourself into some real problems

No further comments. The best way to learn is by doing.

Needless to say, the above is by no means complete, but it could give you a head start. Please feel free to comment.

