Managing BI Services using ODI


Oracle Data Integrator (ODI) is one of the most widely used tools for loading data into an Oracle Business Intelligence data warehouse. Because the source systems are updated frequently, the warehouse often needs to be refreshed during the day, while users may still be trying to run their reports. This calls for a mechanism that restricts user access to the data during the refresh activity, so that nobody ends up with dirty reads.


We will use the WebLogic Scripting Tool (WLST) to achieve this. At the start of the load plan that loads the data from source to target, we will add a step that brings the BI services down so that users cannot access BI. At the end of the load plan we will add a step that brings the services back up. This also ensures that if the load fails, the services stay down.

To Stop:

We will create an ODI OS command with the call below, which stops the BI services:

run.sh stop

run.sh calls wlst.sh, passing the value stop through to the WLST script. The content of run.sh is a single line that simply forwards its first argument:

$MW_HOME/oracle_common/common/bin/wlst.sh mydeploy.py "$1"

The parameters required by wlst.sh can be passed and set via other supporting (usually .py) files; the WebLogic Scripting Tool Guide covers how to configure and customize these parameters in your script.
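As a rough sketch of what such a supporting file could contain, a minimal mydeploy.py might read the start/stop argument and act on the BI managed server. The admin URL, the credentials and the bi_server1 server name below are placeholder assumptions, and on a full OBIEE domain the BI system components may need separate handling (for example via Node Manager or opmnctl):

# mydeploy.py - minimal WLST (Jython) sketch; connection details and the
# managed-server name are placeholder assumptions, not a definitive setup.
import sys

ADMIN_URL  = 't3://biadminhost:7001'   # assumed Admin Server URL
ADMIN_USER = 'weblogic'                # assumed credentials - in practice,
ADMIN_PASS = 'welcome1'                # read these from a secure store
BI_SERVER  = 'bi_server1'              # assumed BI managed server name

# wlst.sh mydeploy.py <action>  ->  the action arrives as sys.argv[1]
action = 'status'
if len(sys.argv) > 1:
    action = sys.argv[1]

connect(ADMIN_USER, ADMIN_PASS, ADMIN_URL)

if action == 'stop':
    # Take the BI managed server down so users cannot run reports mid-load
    shutdown(BI_SERVER, 'Server', force='true')
elif action == 'start':
    # Bring the BI managed server back up (Node Manager must be running)
    start(BI_SERVER, 'Server')
else:
    print 'Usage: wlst.sh mydeploy.py start|stop'

disconnect()
exit()

The same mydeploy.py serves both the stop and the start ODI OS commands, since the action is simply whatever run.sh forwards as its first argument.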

To Start:

We will create an ODI OS command with the call below, which starts the BI services:

run.sh start

This reuses the same run.sh, now passing the value start through to wlst.sh. Invoked with start, it effectively executes:

$MW_HOME/oracle_common/common/bin/wlst.sh mydeploy.py start

The values start and stop are passed over from run.sh. Driving the action (start or stop) and the Middleware home through variables gives us added flexibility: these variables can be refreshed via ODI variables in the package from a database control table. The code then runs unchanged in every environment, because the instance-specific values are already held in the control table.
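As a rough illustration of that idea, the lookup against such a control table might look like the sketch below. The ETL_CONTROL table, its columns and the connection details are hypothetical, and inside ODI this would normally be a refreshing variable backed by the same query rather than a standalone script:

# control_table_lookup.py - illustrative only; table, columns and connection
# details are hypothetical placeholders for an environment-specific lookup.
import cx_Oracle

def get_param(cursor, name):
    # Fetch one parameter value for the current environment
    cursor.execute(
        "SELECT param_value FROM etl_control WHERE param_name = :n", n=name)
    row = cursor.fetchone()
    return row[0] if row else None

conn = cx_Oracle.connect("odi_user", "odi_password", "dbhost:1521/ORCLPDB")
cur = conn.cursor()

mw_home = get_param(cur, "MW_HOME")      # e.g. /u01/app/oracle/middleware
action  = get_param(cur, "BI_ACTION")    # 'start' or 'stop'

print("%s/oracle_common/common/bin/wlst.sh mydeploy.py %s" % (mw_home, action))
conn.close()

In the package itself, the refreshed ODI variables (referenced as #VARIABLE_NAME in the OS command) are substituted at run time, so neither run.sh nor mydeploy.py needs to change between environments.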

Another advantage of running the start and stop mechanism through ODI is that the daily execution statistics can be used to predict the completion time of the job. Execution time tends to vary over time and often shows a pattern: the job might take longer on specific days of the month, or grow by a few minutes every week. These patterns can feed an ETC (estimated time of completion) procedure that calculates the expected end time of the job using a custom algorithm. This information is extremely handy in crunch situations and puts otherwise static history data to good use. There goes some small AI stuff again!
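A minimal sketch of such an ETC calculator is shown below. It assumes the ODI work-repository table SNP_SESSION with SESS_NAME, SESS_BEG, SESS_DUR (seconds) and SESS_STATUS columns, and a hypothetical load plan session name of LP_DWH_DAILY_LOAD; verify the repository column names and status codes against your ODI release before relying on anything like this:

# etc_calculator.py - rough ETC (estimated time of completion) sketch.
# Repository table/column names and status codes are assumptions to verify.
import datetime
import cx_Oracle

JOB_NAME = "LP_DWH_DAILY_LOAD"   # hypothetical session name of the daily load

conn = cx_Oracle.connect("odi_repo", "odi_repo_password", "dbhost:1521/ORCLPDB")
cur = conn.cursor()

# Average duration of the last 10 successful runs on the same weekday,
# which captures the "slower on certain days" pattern described above.
cur.execute("""
    SELECT AVG(sess_dur)
      FROM (SELECT sess_dur
              FROM snp_session
             WHERE sess_name = :name
               AND sess_status = 'D'
               AND TO_CHAR(sess_beg, 'DY') = TO_CHAR(SYSDATE, 'DY')
             ORDER BY sess_beg DESC)
     WHERE ROWNUM <= 10
""", name=JOB_NAME)
avg_seconds = cur.fetchone()[0] or 0

# Start time of the run currently in progress ('R' = running)
cur.execute("""
    SELECT MAX(sess_beg) FROM snp_session
     WHERE sess_name = :name AND sess_status = 'R'
""", name=JOB_NAME)
started = cur.fetchone()[0]

if started:
    etc = started + datetime.timedelta(seconds=avg_seconds)
    print("Estimated completion for %s: %s" % (JOB_NAME, etc))
else:
    print("No running session found for %s" % JOB_NAME)

conn.close()

Richer models are possible (weekday plus month-end flags, a rolling trend, and so on), but even this simple average is usually enough to answer "when will the data be back?" during a daytime refresh.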

How do you automate your BI service-related activities and downtime maintenance?

Author: Rahul Bhattacharya

Rahul is a journalist with expertise in researching a variety of topics and writing engaging content. He is also a data analyst and an expert in visualizing business scenarios using data science. Rahul is skilled in a number of programming languages and data analysis tools. When he is not busy writing, Rahul can be found somewhere on the Appalachian Trail or in an ethnic restaurant in Chicago.
