ODI Executions Parametrization – Pros and Cons


Oracle Data Integrator scenarios and load plans provide considerable flexibility at execution time. We can select the Context and log level and supply values for input parameters as required, all of which adds to the versatility of the tool. But this flexibility comes with its own advantages and disadvantages. Used the right way, these options work strongly in our favor; used carelessly, they can lead us down a challenging path that we must learn to tread with care.


Pros of Parametrization

Flexibility - At the heart of ODI's power is its adaptability, and the execution of scenarios and load plans lets us make maximum use of it. We can use variables in packages and procedures and pass values for these declared variables when executing the resulting scenarios. When those scenarios are used in load plans, the same variables appear as prompts for input values. The same code can therefore be executed with different variable values, giving us a range of options and a dynamic way of running it.
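As a sketch of this idea, the snippet below assembles a command line for ODI's standalone startscen tool, which is one common way to run a scenario while passing values to declared project variables. The scenario, project, and variable names here are hypothetical examples, and the exact tool invocation may differ by ODI version and agent setup.

```python
def build_startscen_cmd(scenario, version, context, log_level=5, variables=None):
    """Assemble an argument list for ODI's standalone startscen tool.

    Variables are passed as -PROJECT.VARIABLE=value pairs appended to
    the positional scenario / version / context / log-level arguments.
    """
    cmd = ["./startscen.sh", scenario, str(version), context, str(log_level)]
    for name, value in (variables or {}).items():
        cmd.append(f"-{name}={value}")
    return cmd

# Hypothetical scenario and project-variable names:
cmd = build_startscen_cmd(
    "LOAD_SALES", "001", "GLOBAL",
    variables={"DWH_PROJ.P_LOAD_DATE": "2023-01-31", "DWH_PROJ.P_REGION": "EMEA"},
)
print(" ".join(cmd))
```

Changing only the variable values in the dictionary re-uses the same scenario for a different run, which is exactly the flexibility described above.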

Durability - Code that can adapt to different scenarios has a better chance of meeting changing business requirements in the long run, making it more durable than a static version built for a single scenario. As business needs evolve, it is usually preferable to keep things flexible so that the maximum number of known cases can be covered by a single piece of code that takes its inputs from the user.

Integrability - Code that is parameterized to handle multiple scenarios can often be integrated as part of another executable scenario or load plan, or even combined with itself: the same scenario can appear multiple times in one load plan, each instance with a different set of parameters. A single component, re-used more than once, thereby accommodates multiple requirements. When the load plan completes successfully, it produces the outputs (say, files or data loaded to different tables) for every set of parameters used.
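The pattern can be sketched as follows: one parameterized step, invoked twice in the same plan with different parameter sets, yields two distinct outputs. The function and parameter names are purely illustrative stand-ins for a real ODI scenario and its variables.

```python
def extract_to_file(region, target_file):
    """Stand-in for one parameterized scenario run (hypothetical names)."""
    return f"{region} -> {target_file}"

# One "load plan" re-using the same step with two parameter sets:
load_plan = [
    {"region": "EMEA", "target_file": "emea_sales.csv"},
    {"region": "APAC", "target_file": "apac_sales.csv"},
]

outputs = [extract_to_file(**params) for params in load_plan]
print(outputs)
```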

Maintainability - A crucial factor in favor of parameterized code is its maintainability. The code may involve a few more variables than a static version would have needed, but in the bigger picture it is definitely worth it. Every scenario the code can handle is a static code we did not need to build, saving a bulk of development effort and keeping the Production instance less cluttered. It also provides a single trigger point for all the different scenarios it covers.

Flow Control - The parameters we pass determine the path a job takes while executing. Each different set of parameters can send the job down a different path, giving us finer control and the ability to choose the route we want as the situation and requirement of the moment demand. This level of control would not be possible without parameters, and it proves extremely handy when we suddenly need the code to do something that is rarely requested.
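A minimal sketch of parameter-driven flow control, analogous to an ODI variable steering an evaluate-variable step in a package. The LOAD_TYPE values are hypothetical, not ODI built-ins.

```python
def run_load(load_type):
    """Pick an execution path based on a single parameter value."""
    if load_type == "FULL":
        return "truncate + full reload"
    elif load_type == "INCREMENTAL":
        return "merge changed rows only"
    elif load_type == "RESTATEMENT":
        # The rarely requested path: available on demand via one parameter
        return "reload a specific historical window"
    else:
        raise ValueError(f"Unknown LOAD_TYPE: {load_type}")

print(run_load("INCREMENTAL"))
```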

Cons of Parametrization

With great power comes great responsibility. With such highly flexible code in our hands, we need to make very sure it does not fail, and that calls for extensive rounds of testing, even for test cases that might feel irrelevant.

Branch Testing - When a single code version accepts multiple parameters, there is an array of paths it might traverse, one for each combination of parameter values. To be sure the code is fully correct, exhaustive testing is needed to confirm that every path behaves as expected. This is a must, even at the cost of extra testing effort, to be confident the code will act correctly even for the less common parameter combinations.
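Enumerating every combination can be automated; the sketch below exercises all paths of a hypothetical two-parameter job with itertools.product, so no combination is skipped by accident.

```python
import itertools

def job_path(load_type, validate):
    """Return the steps a hypothetical job takes for one parameter set."""
    path = ["start", "full" if load_type == "FULL" else "incr"]
    if validate:
        path.append("validate")
    path.append("end")
    return path

# Exercise every combination of both parameters exactly once:
load_types = ["FULL", "INCREMENTAL"]
validate_flags = [True, False]

for lt, v in itertools.product(load_types, validate_flags):
    path = job_path(lt, v)
    # Every path must at least start and finish cleanly
    assert path[0] == "start" and path[-1] == "end"

print("all", len(load_types) * len(validate_flags), "branches exercised")
```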

Boundary Value Testing - A common pattern with ODI parameters is passing a range of values, perhaps a start_value and end_value along some dimension, say time. In such cases it becomes crucial to check that the end points (in our case, months) behave as expected: the first and last months of the range should work just like the months in between. And if a user mistakenly enters the values reversed (start_val in place of end_val and vice versa), that should work as well.
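A small sketch of such a range parameter, with months encoded as integers 1-12, including the reversed-input case. The function names are illustrative, not ODI built-ins.

```python
def normalize_month_range(start_val, end_val):
    """Return (start, end), silently swapping reversed inputs."""
    if start_val > end_val:
        start_val, end_val = end_val, start_val
    return start_val, end_val

def months_in_range(start_val, end_val):
    """List every month covered by the parameter pair, boundaries included."""
    start_val, end_val = normalize_month_range(start_val, end_val)
    return list(range(start_val, end_val + 1))

# Boundary checks: first month, last month, single-month range, reversed inputs
print(months_in_range(1, 3))    # includes both endpoints
print(months_in_range(12, 12))  # degenerate one-month range
print(months_in_range(3, 1))    # reversed inputs behave like (1, 3)
```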

Decision Coverage Testing - Parameters in an Oracle Data Integrator package mean variables, and usually a lot of them, to make sure all our scenarios are handled. That creates several true/false paths, each of which needs thorough testing so that we do not inadvertently fall into a trap we never foresaw. Worse, an untested code path can end up doing something nasty (say, deleting data, which might force the loading processes to be re-run) that can take hours to fix.
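One way to gain confidence is to record each decision's outcomes during a test run and then verify that every condition was evaluated both true and false. The sketch below does this for a hypothetical job with a risky purge step; the names are illustrative.

```python
outcomes = {}

def record(decision, result):
    """Remember which outcomes each decision has produced so far."""
    outcomes.setdefault(decision, set()).add(result)
    return result

def job(purge_target, archive_first):
    steps = []
    if record("purge_target", purge_target):
        if record("archive_first", archive_first):
            steps.append("archive")
        steps.append("delete")  # the risky path: verify before trusting it
    steps.append("load")
    return steps

# Drive the job through all parameter combinations:
for p in (True, False):
    for a in (True, False):
        job(p, a)

# Decision coverage achieved: both outcomes observed for every decision
assert outcomes["purge_target"] == {True, False}
assert outcomes["archive_first"] == {True, False}
print("decision coverage complete:", outcomes)
```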

Fuzz Testing - Even after thorough Branch Testing and Decision Coverage Testing, it is important to take the user's perspective and run the parameterized code with random combinations of values, checking that every such combination falls within the scenarios already tested and does nothing unexpected. It is also necessary to document every scenario captured; you do not want to wake up at night wondering whether something was tested or not.
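A seeded fuzz loop makes such random runs reproducible, so any failing combination can be replayed and documented instead of haunting you later. The job here is the same hypothetical month-range example used above.

```python
import random

random.seed(42)  # fixed seed: failing cases can be replayed exactly

def job(start_month, end_month):
    """Hypothetical job: count months processed, tolerating reversed inputs."""
    if start_month > end_month:
        start_month, end_month = end_month, start_month
    return end_month - start_month + 1

# Hammer the parameter space with random combinations:
for _ in range(1000):
    s, e = random.randint(1, 12), random.randint(1, 12)
    count = job(s, e)
    # Any month pair must yield between 1 and 12 processed months
    assert 1 <= count <= 12, f"unexpected result for ({s}, {e})"

print("1000 fuzz cases passed")
```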

Thus we can create something really useful, while staying cautious that we do not hurt ourselves with something so powerful that it starts overshadowing us. Machines and AI may well be the future, but we want to make them flawless, to avoid anything like what we saw in Isaac Asimov's fiction!

Author: Rahul Bhattacharya

Rahul is a journalist with expertise in researching a variety of topics and writing engaging content. He is also a data analyst and an expert in visualizing business scenarios using data science. Rahul is skilled in a number of programming languages and data analysis tools. When he is not busy writing, Rahul can be found somewhere on the Appalachian Trail or in an ethnic restaurant in Chicago.
