One of the best approaches to load data into an Oracle data mart using SSIS

I tried all the drivers, MSDAORA, ORAOLEDB and the .NET providers, to load from SQL Server to Oracle, but the load times didn't differ much. Performance suffered because we had no control over increasing the commit size on the Oracle side. I understood from Scott Barrett's and Sutha Thiru's blogs that Persistent's SSIS Oracle Bulk Load connector gives outstanding performance, but the client organization's standards didn't allow buying a third-party tool just for this.

So finally I read about SQL*Loader and, Eureka! How about dumping the required SQL Server data into flat files and then bulk-importing these files into Oracle using SQL*Loader? Since SQL*Loader has robust options for loading either full or incremental data, it worked wonders: nearly 80% of the load time was saved. In the SSIS package I used a Flat File Destination set to overwrite the file on every SQL Server export, and then imported it with SQL*Loader by running a batch file via an Execute Process Task; the batch file invokes sqlldr with a control file that describes how to bulk-import the flat file. (You may want to create a branch here for incremental and full loads, using precedence constraints whose expressions evaluate a Boolean variable. That variable can be preset through package configurations to choose between incremental and full load.)
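The two precedence-constraint expressions for that branch might look like the following (the variable name IsFullLoad is an assumption; use whatever Boolean package variable you actually configure):

```
@[User::IsFullLoad] == True
@[User::IsFullLoad] == False
```

The first expression gates the full-load path and the second gates the incremental path; set the precedence constraint's evaluation operation to "Expression" so the Boolean is checked at run time.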


The batch file (Full_Load.bat) script is as follows:

sqlldr userid=user_ro/user_ro00@ora_datamart control="F:\APPLICATION_JOBS\MY_ASSET\SQL_LOADER_JOBS\FULL_LOAD.ctl" log="F:\APPLICATION_LOGS\MY_ASSET\FULL_LOAD.txt"

/** sqlldr is the SQL*Loader utility in the Oracle bin folder. Make sure this folder is added to PATH (environment variable). **/
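sqlldr sets the process exit code when it finishes (0 for success, nonzero for warnings such as rejected rows or for errors), so the batch file can propagate failures back to the Execute Process Task. A minimal sketch, reusing the same paths as above:

```
@echo off
sqlldr userid=user_ro/user_ro00@ora_datamart control="F:\APPLICATION_JOBS\MY_ASSET\SQL_LOADER_JOBS\FULL_LOAD.ctl" log="F:\APPLICATION_LOGS\MY_ASSET\FULL_LOAD.txt"

rem Pass sqlldr's exit code through so the Execute Process Task
rem can fail the package when the load did not fully succeed
rem (leave the task's SuccessValue at its default of 0).
exit /b %ERRORLEVEL%
```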

The Control File (Full_Load.ctl) looks like the one below:
LOAD DATA
INFILE 'F:\APPLICATION_DEPENDENTFILES\MY_ASSET\TABLEEXPORTS\DUMP_FILE.txt'
BADFILE 'F:\APPLICATION_DEPENDENTFILES\MY_ASSET\TABLEEXPORTS\BADFILE.txt'
REPLACE
INTO TABLE SCHEMA.LOAD_TABLE
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
COL1,
COL2,
COL3 DATE(19) "YYYY-MM-DD HH24:MI:SS"
)

Note: * COL3 is cast to DATE on the fly, with the precision and format specified.
* Specifying a bad file helps in tracking rejected/inconsistent rows.
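For the incremental branch, the control file can use APPEND instead of REPLACE so existing rows are preserved; a sketch (the file name Incr_Load.ctl is an assumption, and the columns are the same as above):

```
LOAD DATA
INFILE 'F:\APPLICATION_DEPENDENTFILES\MY_ASSET\TABLEEXPORTS\DUMP_FILE.txt'
BADFILE 'F:\APPLICATION_DEPENDENTFILES\MY_ASSET\TABLEEXPORTS\BADFILE.txt'
APPEND
INTO TABLE SCHEMA.LOAD_TABLE
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
(
COL1,
COL2,
COL3 DATE(19) "YYYY-MM-DD HH24:MI:SS"
)
```

With APPEND, the SSIS export for this branch should dump only the new or changed rows to the flat file, since SQL*Loader will not deduplicate against what is already in the table.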

Comments

Jenice said…
Yes, I always use SQL*Loader for bulk operations as it is extremely fast. One of my friends told me this trick to load data at a faster rate, and since then I do it this way whenever I have to perform such an operation. Thanks for sharing the script and information; anyone who is not aware of this method will definitely like it.
