Preprocessing Input Files and External Tables

November 20th, 2008

Just a quick note to point to Greg Rahn’s posting on external table preprocessors in Oracle 11g (11.1.0.7). Prior to this, if a file that you wanted to read via an external table needed uncompressing, you had two options: uncompress it first and then read it through the external table, potentially ending up with an uncompressed file several times larger than the compressed one, or use SQL*Loader and feed data into it through an uncompression utility and named pipes.
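For contrast, the pre-11g named-pipe workaround can be sketched like this (the file names are illustrative, and the actual SQL*Loader invocation is shown as a comment since it needs a real database and control file):

```shell
#!/bin/sh
set -e
workdir=$(mktemp -d)

# Create a small compressed sample file standing in for the real data.
printf '1,widget\n2,gadget\n' | gzip > "$workdir/sales.csv.gz"

# Named pipe that will carry the decompressed stream; nothing
# uncompressed ever lands on disk.
mkfifo "$workdir/load_pipe"
gunzip -c "$workdir/sales.csv.gz" > "$workdir/load_pipe" &

# In real use SQL*Loader would read the pipe as an ordinary data file:
#   sqlldr userid=scott control=sales.ctl data="$workdir/load_pipe"
# Here we just cat it to show the decompressed rows flowing through.
cat "$workdir/load_pipe"

rm -rf "$workdir"
```

The pipe blocks the `gunzip` writer until a reader opens it, so decompression proceeds only as fast as the loader consumes rows.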

Now in 11g you can add a PREPROCESSOR clause to your external table DDL that points to an executable which will preprocess and pipe your data, allowing the external table to read the file through the uncompression utility without having to uncompress it all first. It’s one of those things that makes you think “why wasn’t that there before?”. As Greg says, it’s now in, albeit sketchily documented, so if you’re planning to move large files around as part of your ETL process and you’re on the latest version of the database, it’s worth taking a look at Greg’s article.
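A minimal sketch of what the clause looks like (the directory names, table name, file name and column list are all illustrative; the database also needs execute permission on the directory object holding the preprocessor binary):

```sql
-- Hypothetical external table reading a gzipped file in place.
-- data_dir holds sales.csv.gz; exec_dir holds the zcat binary.
CREATE TABLE sales_ext (
  sale_id  NUMBER,
  product  VARCHAR2(50)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    PREPROCESSOR exec_dir:'zcat'  -- decompresses to stdout as rows are read
    FIELDS TERMINATED BY ','
  )
  LOCATION ('sales.csv.gz')
)
REJECT LIMIT UNLIMITED;
```

The preprocessor’s standard output is what the access driver parses, so any filter that writes to stdout (zcat, gunzip -c and the like) can sit in that slot.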


  1. Kimmo Says:


    Very nice summary!

    Being at the beginning of the project, we have a lot of discussions going on about monitoring the system and about the operations model in general. Especially with BI Apps, the difficulty is that it contains so many different technology components. We are therefore also discussing whether we can live with the combination of functionality available in the different tools, or whether we should have one external monitoring tool that connects to the different BI Apps components.

    Anyhow, I really like the possibility of storing the history of performance stats in the database and then creating trends to see if, for example, system performance is gradually worsening.


  2. Kimmo Says:

    Oops, comment went into wrong post.
