Ours is one of the best online software training institutes. All of our trainers are highly skilled, certified professionals with many years of real-time experience in the IT industry. We guide each and every student carefully to deliver the best DataStage online training experience.

We provide students with many services to make them more confident in their careers. Some of these services are as follows:

  • Fully designed course study material with real-time scenarios
  • All sessions at times flexible for you
  • 24x7 remote server access and support
  • Career-oriented training with interview-based FAQs
  • Normal-track, weekly, and fast-track sessions
  • Practice sessions
  • Project support
  • Help with resume preparation
  • Support after course completion
  • Customized curriculum
  • Help from our trainer with software installation
  • Recorded session videos, available after the course

DataStage Course Overview

DataStage is an ETL tool used to extract, transform, and load data into a data warehouse. ETL (Extraction, Transformation and Loading) tools extract data from multiple non-coherent source systems; cleanse, merge, and load the data into one or more target systems; and can connect to major databases such as Teradata, Oracle, SQL Server, and DB2, as well as enterprise applications such as SAP and Siebel.
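The extract–cleanse–merge–load flow described above can be sketched in plain Python. This is a toy illustration of the ETL pattern, not DataStage code; the source records and field names are invented for the example:

```python
# Toy ETL flow: extract from two "non-coherent" sources, cleanse them,
# merge on a key, and load into a single target list (a stand-in for a
# warehouse table). All data here is hypothetical.

# Extract: two sources with inconsistent schemas and formats
source_a = [{"id": 1, "name": " Alice "}, {"id": 2, "name": "BOB"}]
source_b = [{"ID": "2", "NAME": "bob"}, {"ID": "3", "NAME": "Carol"}]

def cleanse_a(row):
    # Trim whitespace and normalize capitalization
    return {"id": row["id"], "name": row["name"].strip().title()}

def cleanse_b(row):
    # Convert the string key to an integer and normalize the name
    return {"id": int(row["ID"]), "name": row["NAME"].strip().title()}

# Transform: bring both feeds into one common schema
cleaned = [cleanse_a(r) for r in source_a] + [cleanse_b(r) for r in source_b]

# Merge: de-duplicate on the key (last record wins)
merged = {row["id"]: row for row in cleaned}

# Load: write ordered rows to the target
warehouse = [merged[k] for k in sorted(merged)]
print(warehouse)
```

In DataStage the same steps would be drawn graphically as stages in a job rather than written by hand.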

DataStage interfaces are defined as jobs, configured so that they can run on a single server or across multiple servers in a grid architecture. The DataStage Engine stores all the metadata and runtime information, as well as job execution details.

DataStage supports both UNIX and Windows servers. The server is where the actual jobs reside and run; earlier versions of DataStage were supported only on UNIX servers. The server is accessed through the DataStage client, a Windows-based application that provides the tools to build a DataStage job.

Work in DataStage is organized into one or more work areas called “projects”. Every project has its own local repository where its designs and its technical and process metadata are stored. By default, projects are created in the projects subfolder of the dshome directory.

There are three DataStage components:

  • DataStage Administrator
  • DataStage Director
  • DataStage Designer

DataStage Administrator

The Administrator performs the following functions:

  • Adding, deleting, and moving projects
  • Managing user permissions for projects
  • Purging job log files
  • Controlling the timeout interval on the engine
  • Tracing engine activities
  • Issuing DataStage engine commands from the Administrator client
  • Configuring settings for parallel processing jobs
  • Creating and setting environment variables
  • Setting job parameter defaults
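The last two items above boil down to one idea: the Administrator sets project-level defaults that individual job runs can override. A minimal sketch of that resolution rule, with entirely hypothetical variable names:

```python
# Sketch: project-level defaults (set in the Administrator) overridden by
# values supplied when a job is run. The names here are hypothetical.
project_defaults = {"DB_USER": "etl_user", "COMMIT_SIZE": "1000"}

def resolve_parameters(runtime_overrides):
    """A job sees the project defaults unless the run supplies an override."""
    params = dict(project_defaults)
    params.update(runtime_overrides)
    return params

# A run that overrides only the commit size keeps the default DB_USER
print(resolve_parameters({"COMMIT_SIZE": "5000"}))
```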

DataStage Designer

  • Jobs are designed and developed using the graphical design tool
  • Different stages, such as General, File, Processing, and Database stages, are used while developing jobs
  • Table definitions are imported directly from the data source or data warehouse tables
  • Jobs are compiled by the Designer, which checks for errors in key expressions, transforms, primary inputs, reference outputs, etc.
  • Jobs or projects can be imported from, or exported to, various environments
  • Parallel jobs, mainframe jobs, and server jobs can be created using the Designer
  • Custom routines can be created in the Designer
  • Many jobs can be compiled together, with a report provided after compilation completes
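A job built in the Designer is essentially a pipeline of linked stages. As a rough analogy only (plain Python, not Designer output, with invented data), the stage chain can be modeled as functions that stream rows from one stage to the next:

```python
# Rough analogy: a job as a chain of stages. Each "stage" is a function
# that consumes rows from the previous stage; generators keep it streaming,
# loosely mirroring how rows flow between linked stages in a job.

def sequential_file_stage():
    # Source stage: hypothetical input rows
    yield from [{"qty": 2, "price": 10.0}, {"qty": 5, "price": 3.0}]

def transformer_stage(rows):
    # Processing stage: derive a new column from existing ones
    for row in rows:
        yield {**row, "total": row["qty"] * row["price"]}

def target_stage(rows):
    # Target stage: collect rows into the "database"
    return list(rows)

job_output = target_stage(transformer_stage(sequential_file_stage()))
print(job_output)
```

In the real tool, these stages are dragged onto a canvas and linked, and the compiler generates the executable job from that design.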

DataStage Director

  • Running, monitoring, scheduling, and validating jobs are done in the DataStage Director
  • The job log view displays the log for the selected job
  • The Director resets a job whose status is aborted or stopped before it can run again
  • Shows the execution time of jobs
  • The Director can clean up job resources
  • Displays the current job status, such as running, compiled, finished, aborted, or not compiled
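The reset rule above (an aborted or stopped job must be reset before it runs again) can be summarized in a small sketch. The status names mirror the list above; the action strings are invented for illustration:

```python
# Sketch of the Director's pre-run decision: a job that aborted or was
# stopped must be reset before the next run. Action names are hypothetical.

def prepare_run(status):
    """Return the action the Director would take for a given job status."""
    if status in ("aborted", "stopped"):
        return "reset"            # clean up before the job can run again
    if status == "not compiled":
        return "compile first"    # the Designer must compile it
    if status == "running":
        return "wait"             # already executing
    return "run"                  # compiled or finished: ready to go

print([prepare_run(s) for s in ("finished", "aborted", "not compiled")])
```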