about ernie ostic

Currently I am the SVP of Products at MANTA Software (www.getmanta.com), a key player in the metadata integration and lineage space. My passions in our industry revolve around everything related to data, especially its value to businesses as a true asset, and the care and feeding (the “curation”) of that data for every organization’s success.

Prior to MANTA, I was immersed in the world of data integration at IBM as a Product Specialist, supporting customers and colleagues as part of the Unified Governance and Integration team. In that role I spent most of my recent years in the area of information governance and IBM’s related offerings, providing consultative services for all aspects of governance and helping architect custom solutions for data lineage. The role also included support for DataStage and everything “real time” concerning Information Server. My twenty-plus wonderful years at IBM reflect experience with heritage Ascential, Ardent, and VMARK Software (DataStage), wearing various hats in product management, professional services, and technical sales; before that I spent many years with Information Builders, also in the data integration and decision support space.

I enjoy helping colleagues, customers, and partners get more from their experience with the solutions I support, whether it is for parsing their complex metadata extracts or figuring out ways to enhance their decision making with better governance and insight to their information assets.

45 Responses to “about ernie ostic”

  1. Stewart Says:

    Hey shouldn’t it say…”I am a Product Specialist at IBM, spending most of my time with QualityStage, DataStage and everything “real time” concerning Information Server and its related offerings…..”

  2. dsrealtime Says:

    Thank you Stewart, it should, particularly because real-time QualityStage transformations are proving to be a real “killer app” for Info Server. But when it comes to really detailed fine tuning for high-percentage matches, I turn to all of you probabilistic matching gurus….. 🙂 Let’s keep putting our heads together.


  3. Sudhindra P S Says:


    You are doing a great job. Your guidance and support on IIS via the DSXchange portal has always been very helpful. If you could share more information on this site about IIS features and best practices, that would be really great. And if you could also give us an opportunity to share the best practices we have found all this while, that would be great too, as this portal will then be even more helpful to people trying to find new solutions for DataStage.

    Thanks & regards
    Sudhindra p S

  4. Felix Says:


    Great site, mate. There is definitely no one better or more qualified to blog about this particular area, and I look forward to reading more tips and hints from this site!

    I have been learning and blogging about another interest of mine: Search! You can check it out at



  5. dsrealtime Says:

    Thank you for the kind words. I hope we can continue to feed this site with ideas on using IIS in “real time” ways… I haven’t posted in a while; still working on drafts of other real-time subjects that I am frequently asked about. Not sure of the best way to share other things here, as I’m still new to this whole blogging idea myself, and I also want to “stick to the knitting” of real time and not take attention away from the fine DSXchange and IBM DeveloperWorks sites, or my other favorite DataStage bloggers! — Ernie

  6. dsrealtime Says:

    …and thanks also, Felix. Learned a few new goodies about searching. Great tips! We’d probably all reduce Google traffic and improve everyone’s performance if we followed these! (http://www.search1x.com)

  7. C Malone Says:

    Did you swim at Boston College?

  8. dsrealtime Says:

    Absolutely! Go Eagles!

  9. bdpvdad Says:

    And also swim for the Breakers?

  10. dsrealtime Says:

    Well…. when I was there we called them the Tigers….. 😉 (Always bothered me, that name change.)

  11. jlabrie Says:

    Need to start a swimming blog too 😉

  12. Tariq Says:

    Hi Ernie,
    I really like your blog. I have a problem in DataStage 7.5.2: when I try to run any simple parallel job (Row Generator to a Peek) I keep getting an error that duplicate operator symbols are registered: splitvect APT_SplitVectorOperator.
    Can you help me, please, or direct me to the right place where I can find help with this problem?


    • dsrealtime Says:

      Thanks for the note, Tariq….. hmm…. not sure on that one. When I run into basic issues with Parallel jobs on new installs, they are usually related to compiler issues, so they only happen with jobs that have a Transformer. Row Generator and Peek are very basic, so I wonder if it’s a config or install issue. Is this a new install? DSXchange may be a great place to look…… http://www.dsxchange.com …..

  13. Tariq Says:

    Thanks very much, Ernie, for your response. Sorry for the late reply :). I will try DSXchange.

  14. Tariq Says:

    Hi Ernie. I reinstalled DataStage and everything is well now. Thanks very much.
    I have a question: do you know how to connect to jBASE with DataStage?

  15. adhall Says:

    Can we expose a DataStage job as a web service so that we can call it from another web application? Do we need a separate license for it?

    We have bought DataStage / QualityStage / Information Analyzer / Director.

    We have version 8 here. I can see all the real-time stages: Web Service client, WISD, etc.

    • dsrealtime Says:

      Check to see if you have a license for Information Services Director. There are ways you can tell, in addition to seeing the plugins on the canvas. Go to your Job Properties and look under the check box for “enable Job for Multi-Instancing”; you should see another greyed-out check box for “enabling for services”. It becomes accessible when you click multi-instance. Also check your Information Server Console (not the “Web” Console). When you launch it and log in, you will see a project window; if you select “New Project” and see “Information Services Project” as one of your options, things are probably installed and licensed.

      At any rate, in order to publish DS jobs “as” web services, you need Information Services Director, also known as WISD or ISD. It is the 8.x edition of what was also called RTI in the past.


  16. adhall Says:

    Thanks, Ernie, for your help.
    I can see the WISD Input/Output stages in Designer, and “Enabled for Information Services” is available if I check “Allow multiple instance”. But the New Project option is disabled in the IBM Information Server local client interface, even though I can see Information Services Director version 8.0.1 under Help –> About IBM Information Server Console. What are your thoughts on it? I think we have the license but it needs to be configured.

    • dsrealtime Says:

      It’s clear that you at least have “some” of the pieces licensed and indicated as such. I’d contact your support provider and check whether you should just run the install again with your up-to-date license file. It seems that something was either missing or forgotten during the initial install. You will need to be able to build new Information Services Projects with the Info Server Console, including establishing DataStage Servers and/or Federation/DB2/Oracle Servers as your Information Providers.


      Ernie Ostic
      Product Specialist
      Cell: (617) 331 8238

  17. gary nackenson Says:

    Hi Ernie, I imagine you remember me. Have you been to any IBI reunions?

  18. Pavan Marpaka Says:


    Why have you been hiding this site from me? 😀

    Great Work!!!


  19. Guillermo Says:

    Hi Ernie,

    I was facing an issue with exposing a reference match job as a web service. The job would run once and then fail because it would not re-read the data source file. I looked at some of your posts, and you recommend driving the re-read by doing a lookup after the WISD Input. Since I have three passes in my match specification, I am doing three lookups, one for each blocking criterion, and then deleting any duplicates that may occur from having three separate lookups into the same data source (a dataset file). Is this the best approach to solve the web service issue? And is it possible to do a single ‘or’ lookup (e.g. look up rows based on name or date of birth) so that I do not need to funnel together the rows of three different lookup stages?

    Thanks for your time…

    • dsrealtime Says:

      Hi Guillermo……. ah, I’ve been meaning to make a post about guidelines for building match jobs as a web service. There are some techniques to consider here, techniques that impact any type of job that has some sort of blocking stage and also has the potential for multiple input paths….. Let me create a new post on this topic tonight. I won’t have time to include screen shots, but it should be helpful to you and anyone else in the future who needs to set up a job like this……


  20. Swapnil Desai Says:

    Hi Ernie,
    I’m Swapnil Desai from the AXP technologies team; we had a telephone conference this morning.
    It’s really great to know you and your simple solutions for complex problems.

    I have one more question:
    In Metadata Workbench, database assets display an IA report summary. We are facing difficulty publishing reports for a shared database
    (IBM DB2 database and DB2 Connector stage).
    Do we need to use the shared metadata database while generating IA reports?

    Thank you, and thanks for all your help!!

    – Swapnil Desai

    • dsrealtime Says:

      Thanks for the kind words. I’m not sure I exactly understand the question… independent of DataStage, tables that you have imported via IA, and profiled, will themselves be “shared tables” (the terminology for a Database Table within the Metadata Workbench). The details for “those” tables (the ones that you can see and touch within IA) need to be published. Once they are published, you should be able to view the profiling results in both WB and BG. The tables could/should also be available at that point within the DS Designer.

  21. Petey_Blade Says:

    Hey Ernie, what do you think of the stonesoup methodology for DI? Vapor or real?

    • dsrealtime Says:

      It’s been awhile since I spent time with the stonesoup team, so it’s hard to comment on their recent efforts…I was certainly impressed with what they were doing early on…

  22. Swapnil Desai Says:

    Thanks, Ernie! It’s really helpful.

    – Swapnil

  23. Sri Says:

    Hi Ernie, can you help me with creating a Java class for a SOAP 1.2 WSDL file? Let me know if I can contact you.

    • dsrealtime Says:

      Yes, if needed… but right now we don’t know enough detail. Let’s keep discussing on DSXchange; the dialog will be useful for many other users also. You might not even need Java, and if you do, there are lots of resources to get you started here and elsewhere. Hopefully the issue you are having is simple. Have you checked out the entries on web services that I have on this blog? Please look at the Table of Contents for posts on getting started with web services. Have you successfully gotten a service to work from xmethods.net? There is a Temperature Conversion service there that is a good candidate if your DS Server can see beyond the firewall.
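Whatever client stack ends up generating the class, it helps to know what is actually on the wire. Here is a minimal sketch, using only Python’s standard library, of the SOAP 1.2 envelope such a call produces; the service namespace, operation name, and parameter are hypothetical, loosely echoing the temperature-conversion example mentioned above.

```python
import xml.etree.ElementTree as ET

# Official SOAP 1.2 envelope namespace (SOAP 1.1 uses a different URI,
# which is a common source of "version mismatch" faults).
SOAP12_NS = "http://www.w3.org/2003/05/soap-envelope"

def build_envelope(operation, service_ns, params):
    """Wrap a single operation call in a SOAP 1.2 Envelope/Body."""
    ET.register_namespace("soap", SOAP12_NS)
    envelope = ET.Element("{%s}Envelope" % SOAP12_NS)
    body = ET.SubElement(envelope, "{%s}Body" % SOAP12_NS)
    op = ET.SubElement(body, "{%s}%s" % (service_ns, operation))
    for name, value in params.items():
        child = ET.SubElement(op, "{%s}%s" % (service_ns, name))
        child.text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# Hypothetical call; the namespace and names below are made up for
# illustration and are not a real service definition:
xml_text = build_envelope("FahrenheitToCelsius",
                          "http://example.com/tempconvert",
                          {"Fahrenheit": "98.6"})
```

The envelope would then be POSTed to the endpoint given in the WSDL, with the Content-Type `application/soap+xml` that SOAP 1.2 requires.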


  24. Paul Says:

    Hi Ernie,

    I’m a bit confused about the role of DataStage when it comes to real-time uses, and I am struggling to find decent guidelines on this.

    For example, the IBM reference architecture puts DataStage in the ETL box, whereas the SOA DataStage Redbooks position it directly in line with a legacy system…

    Where is the best place to have this discussion?

    To me, if you talk SOA and ESB, there should be no difference, architecturally speaking, when you are developing your own enterprise application from scratch, whether a web service is developed in Java or as a DataStage job deployed as a web service. That is, a “legacy type” transaction can be handled by DataStage and should not be relegated to bulk processing only… Granted, DataStage licensing might be more expensive than a traditional Java application, but there will be enough other benefits to going with DataStage, especially when you are able to read ILOG business rules.

    Am I completely misunderstanding real time?

    • dsrealtime Says:

      Hi Paul…

      As data integration tooling has matured, the lines have become greyer. There isn’t an “ETL” tool (if you think about their classic roots in batch data migration with a graphical design paradigm) on the market that can’t do “some” degree of real-time processing, where “real time” includes anything related to MQ Series, web services, open socket support, JMS, etc.

      Does that mean someone would select such tooling for 100% of their from-the-ground-up real time processing and SOA infrastructure activities? Not likely, but it’s more complex than that…it depends on what “other” things the tooling is being used for in the organization, and the re-use of the skill sets that already know how that tool is used, re-use of the assets (existing transformations, standardizations, address validations [and 100’s of others] that are built using said tooling [DataStage and QualityStage in the IBM example]), re-use of the operations mgmt functions (same servers, back-up, etc.) that might already be set up for HA….etc. etc… Consider a batch job that runs weekly, processing 1/2 Terabyte of data in parallel — and the value of re-using the “lookup” and standardizing guts of that Job for a new .NET application that is being built elsewhere in the organization.

      Another VERY strong influence is metadata, governance, and the degree to which some automated tooling provides that inherently. A DataStage Job, for example, could be published as a Web Service and immediately be participating in an impact analysis graph via Metadata Workbench that shows how various tables are used not only for deployment of data to partners via SOAP, but also for loading to a datamart or warehouse.

      It’s not a “which one” choice. All aspects of the application and the investment have to be considered, both functional and non-functional.


      • Paul Says:

        Thanks for the quick response Ernie.

        Currently I’m still at the point of asking them to consider the fact that the DataStage team and the Java team might bid on the same piece of work. They are not even entertaining that thought, because the IBM reference architecture they are referring to places DS in the ETL box…

        Then, considering that they will be using ILOG and Process Server extensively for their design, it only makes sense to complete the loop by using IIS and get all the benefit of the Business Glossary and metadata repository.

        If I look at some of our designs and the number of unnecessary steps and resources we are using, those designs could quite easily be simplified by using DS instead of traditional Java web services. As you said, if DS manages the transaction it could also update the ODS and DW at the same time. Having a real-time ODS and DW, without needing a nightly load window, should alone make someone sit up and listen.

        I’m convinced that if you make IIS an integral part of your design you will save on hardware requirements, as much of your batch workload would go away.

        Then considering that we are consolidating many systems, the quickest way to provide a consolidated view is to build the front-end only and use DS to manage the integration of the back-end systems. This then buys you time to develop the back-end of the new enterprise app with much less pressure. The final migration should then also be a piece of cake and completely seamless for the end-user and customer.

        I’m just afraid that even if we prove it with a POC, we are still going to be shot down. Are there any case studies where a real-time system was built mostly on DS? Not simply integrating ERP, HR, and CRM systems, but where DS is the system?

        I have designed and developed a lead distribution system that is 80% rules-based in DS (Server). The other 20% can be achieved with a little more design time. New lead types from different sources can be catered for by simply loading a new set of rules for that lead type, i.e. no code/job changes. Granted, I can’t compare it to a transactional system, as we take 6 minutes to get a lead to the call centre and/or a field agent’s mobile, but that is only because we do not have the real-time modules, have too many (clumsy) handovers/wait periods between systems, and have too much batch thinking in my scheduling design. (The initial spec only required batch processing of large batches.) Nothing that cannot be addressed with some fine-tuning. When complete, it should prove that DS can be used to design an operational system and not just for ETL.

        However, it will probably come too late to be used for the POC.



      • dsrealtime Says:

        Hi Paul…

        Sorry for the delayed response. Using DataStage and Information Server for ALL of one’s real-time requirements is tempting… but in reality, it’s better as “one of the significant participants” in the overall architecture. Focus DataStage’s attention on the data access and data integration issues where it works best: lookups, transformations, and data quality (via QualityStage) matching and address validation, and where the key users and developers of such real-time processes are the people who know the data and the current processes the best. So often, for many services, it’s the teams that have for years been closest to the data warehousing and decision support efforts. They already know where the cleanest data is, the most validated data, the proven processes that are supplying executives, and therefore the best data for supplying external catalogs, partners, suppliers, etc.

        You are still going to need more detailed Java or other lower-level ESB tooling (or home-grown code) when you have complex hierarchical payloads, especially when those payloads are interlaced with complex unit-of-work requirements, when you need “reliable” SOAP-based processing, or when you have extremely high-volume traffic requirements.


      • dsrealtime Says:

        One more thought: the hybrid approach allows you to let the Java folks focus on the really nasty transactional stuff and leave the critical fine-grained data services to Information Server. It makes no sense to waste valuable Java development time on embedded (and difficult to track) SQL that is buried in a whole lot of source code [whether auto-generated or not], when it can far more simply [and perhaps already] be developed and deployed as a service using some ETL tool, in this case DataStage. — ernie

  25. Mobashshar Says:

    Hi Ernie,
    Please help me.
    I am using the DTS stage to insert/delete in DB2. All the settings are correct in DTS, such as Reject Failing Units = Yes and Prepend Rejects = Yes.
    When a row is rejected because of a DB2 SQL error like -803, DTS is not writing the full message to the RejectQ; it’s truncating the SQL error message.

    The message in the RejectQ arrives with a space between every character, like this: SQLExecute reported: SQLSTATE=23505: Native Error Code=-803: Msg=[IBM][CLI Driver][DB2/AIX64] SQL0803N One or more values in the INSERT statement, UPDATE statement, or foreign key update caused by a DELETE statement are not valid because

    Am I missing something? I also increased the RejectQ length but get the same result. Is there an APT parameter I have to set to get the whole SQL error message written to the MQ? Also, DTS writes a message of length 615 even after I increased the MQ length to 1 MB.

    Please advise.
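An editorial aside: a gap after every character is the classic fingerprint of UTF-16 (UCS-2) text being viewed as single-byte data, since every other byte is a NUL. The Python sketch below only illustrates that symptom; it makes no claim about how DTS actually encodes its reject messages, and the sample string is just the start of the error above.

```python
# Illustration: UTF-16LE text mis-decoded as single-byte characters shows
# a NUL byte after every character, which many viewers render as a blank,
# producing "S Q L E x e c u t e ..." like the RejectQ message above.
original = "SQLExecute reported: SQLSTATE=23505: Native Error Code=-803"

# Encode as UTF-16LE (no BOM), then *mis*-decode the bytes as Latin-1:
garbled = original.encode("utf-16-le").decode("latin-1")
assert garbled[0] == "S" and garbled[1] == "\x00"  # char, NUL, char, NUL…

# Decoding the same bytes correctly recovers the text intact:
recovered = garbled.encode("latin-1").decode("utf-16-le")
```

If the queue message really is UTF-16, a byte-based length limit would also cut the visible text at half the expected character count, which might be related to the truncation being reported.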

  26. Aravind Says:

    I think this issue is a foreign key violation.

  27. Andy Vaidya Says:

    Hey Ernie.

    Great to see you having a blog and helping others.

    Let me know if I can be of any assistance.


    Andy Vaidya
    AVS SYSTEMS, Inc (IBM Premier Business Partner)

  28. Ron Zurawski Says:

    Hi Ernie,

    Ron Davis recommended I reach out to you regarding the REST API in DataStage and using it to load a term into IGC/Glossary. Is this something you are familiar with?


    Ron Zurawski

  29. IGC_Beginner Says:

    I’m new to the IBM IGC tool. I’m learning the REST API to programmatically import metadata into the IGC catalog, and Python scripts to cleanse data and then load it into IGC. I’m trying to use the built-in REST API console to create terms, update them, etc., but it doesn’t seem to work.

    POST https://ABC1234:1234/ibm/iis/igc-rest/v1/assets HTTP/1.1
    Accept: application/json
    Host: localhost:1234
    Accept-Encoding: identity

    "_type" : "Category",
    "short_description" : "This category is great"
    "name" : "test_testify"

    • dsrealtime Says:

      Could be that you are missing a comma after “This category is great”… JSON needs that comma. The payload below works for me.

      "_type" : "category",
      "short_description" : "new category",
      "name" : "myNewCat"
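For anyone hitting the same error, the difference between the two payloads can be checked with any JSON parser before sending the request. A quick Python sketch (note that a complete JSON document also needs enclosing braces, which the console snippets above leave out):

```python
import json

# The payload from the question, wrapped in braces, with the comma still
# missing after the "short_description" value:
broken = '''{
  "_type" : "Category",
  "short_description" : "This category is great"
  "name" : "test_testify"
}'''

# The corrected payload from the reply:
fixed = '''{
  "_type" : "category",
  "short_description" : "new category",
  "name" : "myNewCat"
}'''

try:
    json.loads(broken)
    broken_parses = True
except json.JSONDecodeError:
    broken_parses = False  # the missing comma makes this invalid JSON

payload = json.loads(fixed)  # parses cleanly into a dict
```

Also worth noting: WordPress tends to turn straight quotes into “smart” quotes, which are themselves invalid JSON if pasted back into a console.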
