Radical Technologies
Call: +91 8055223360


In challenging times, good decision-making becomes critical. The best decisions are made when all the relevant available data is taken into consideration, and the best source for that data is a well-designed data warehouse. Data warehousing is therefore central to making new decisions and introducing new plans.


Best Informatica 10.x Training Classes in Pune

Duration of Training: 40 hrs

Number of Training Sessions: 20

Batch Type: Weekdays / Weekends

Mode of Training: Classroom / Online / Corporate Training


Data warehouses are widely used within the largest and most complex businesses in the world. Adoption within moderately large organisations, even those with more than 1,000 employees, remains surprisingly low at the moment, but we are confident that use of this technology will grow dramatically in the next few years.

ETL is one of the main processes in data warehousing. ETL stands for Extract, Transform and Load: data is extracted from source systems, transformed, and loaded into the data warehouse. Informatica is an ETL tool; it is very flexible and cheaper compared to other ETL tools.
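The Extract, Transform and Load steps described above can be sketched in plain Python. This is a minimal, hypothetical illustration of the pattern; real ETL tools such as Informatica do this at scale through visual mappings rather than hand-written code, and the field names below are made up for the example:

```python
def extract():
    # Pretend source system: raw order records.
    # In practice these would come from an RDBMS, flat files, or an API.
    return [
        {"order_id": 1, "amount": "120.50", "region": " north "},
        {"order_id": 2, "amount": "80.00",  "region": "SOUTH"},
    ]

def transform(rows):
    # Cleanse and standardize before loading into the warehouse:
    # convert amounts to numbers, trim and upper-case region codes.
    return [
        {
            "order_id": r["order_id"],
            "amount": float(r["amount"]),
            "region": r["region"].strip().upper(),
        }
        for r in rows
    ]

def load(rows, warehouse):
    # Append the conformed rows to the target table.
    warehouse.extend(rows)

warehouse_table = []
load(transform(extract()), warehouse_table)
print(warehouse_table[0]["region"])  # NORTH
```

In a real mapping, each of these three functions corresponds to a stage of the pipeline: source qualifiers extract, transformations cleanse, and the session loads the target.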

Today the following IT companies use Informatica as their ETL tool:

1) IBM

2) Accenture

3) Amdocs

4) CTS


…and many more.

Introduction to Informatica

Informatica is a tool supporting all the steps of the Extract, Transform and Load process. Nowadays Informatica is also used as an integration tool. It is easy to use, with a simple visual interface similar to forms in Visual Basic: you drag and drop objects (known as transformations) and design the process flow for data extraction, transformation and load. These process flow diagrams are known as mappings. Once a mapping is made, it can be scheduled to run as and when required; in the background, the Informatica server takes care of fetching data from the source, transforming it, and loading it into the target systems/databases. Informatica can communicate with all major data sources (mainframe, RDBMS, flat files, XML, VSAM, SAP, etc.) and can move and transform data between them. It can move huge volumes of data very effectively, often better than bespoke programs written for one specific data movement. It can throttle transactions (performing big updates in small chunks to avoid long-held locks and a bloated transaction log), and it can effectively join data from two distinct data sources (even an XML file can be joined with a relational table). In all, Informatica has the ability to integrate heterogeneous data sources effectively and convert raw data into useful information.
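The "throttling" idea mentioned above (doing big updates in small chunks so one huge transaction does not hold locks or bloat the transaction log) can be sketched with SQLite; the table name, chunk size, and SQL below are illustrative and are not part of any Informatica API:

```python
import sqlite3

# Set up a toy table with 1,000 open orders.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO orders (id, status) VALUES (?, ?)",
                 [(i, "OPEN") for i in range(1, 1001)])
conn.commit()

CHUNK = 100  # rows per transaction (the "throttle")

ids = [row[0] for row in
       conn.execute("SELECT id FROM orders WHERE status = 'OPEN'")]

for start in range(0, len(ids), CHUNK):
    chunk = ids[start:start + CHUNK]
    conn.executemany("UPDATE orders SET status = 'CLOSED' WHERE id = ?",
                     [(i,) for i in chunk])
    conn.commit()  # commit each chunk, releasing locks early

closed = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE status = 'CLOSED'").fetchone()[0]
print(closed)  # 1000
```

Committing every `CHUNK` rows trades a little throughput for much shorter lock durations and a smaller transaction log, which is exactly the trade-off the tool makes when it throttles a large load.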

Informatica is not only an ETL tool but also a data integrator. It is one of the best tools in the current industry, providing versatile solutions for any kind of data. At Radical we provide scenario-based training with real-time examples, project explanations, and answers to project-related queries.
Anyone with basic knowledge of SQL queries can start learning the Informatica tool.
As it is a GUI-based tool, coding knowledge is not necessary. At least 1-3 years of experience is good enough to get interview calls.


Administrator Module

  • Understanding Domains
  1. Nodes
  2. Application Services
  • Using Administration Console
  • Managing the Domain
  1. Managing Alerts
  2. Managing Folders
  3. Managing Permissions
  4. Managing Application Services
  5. Managing the Nodes
  • Managing Users and Groups
  • Managing Privileges and Roles
  1. Domain Privileges
  2. Repository Services Privileges
  3. Reporting Service Privileges
  4. Managing Roles – Assigning Privileges and Roles to Users and Groups
  • Creating and Configuring the Repository Services
  • Managing the Repository
  • Creating and Configuring Integration Services
  1. Enabling and Disabling the Integration Services
  2. Running in Normal and Safe Mode
  3. Configuring the Integration Services Processes
  • Integration Services Architecture
  • Creating the Reporting Services
  1. Managing the Reporting Services
  2. Configuring the Reporting Services
  • Managing License

Advanced Workflow Module

  • Understanding Pipeline Partitioning
  1. Partitioning Attributes
  2. Dynamic Partitioning
  3. Partitioning Rules
  4. Configuring Partitioning
  • Partitioning Points
  1. Adding and Deleting Partitioning points
  2. Partitioning Relational Sources
  3. Partitioning File Targets
  4. Partitioning transformations
  • Partitioning Types
  • Real Time Processing
  • Commit Points
  • Workflow Recovery
  • Stopping and Aborting
  1. Error Handling
  2. Stopping and Aborting Workflows
  • Concurrent Workflows
  • Load Balancer
  • Workflow Variables
  1. Predefined Workflow Variables
  2. User-Defined Workflow Variables
  3. Using Worklet Variables
  4. Assigning Variable Values in a Worklet
  • Parameter and variables in Sessions
  1. Working with Session Parameters
  2. Assigning Parameter and Variables in a Session
  • Parameter File
  • Session Caches
  • Incremental Aggregation
  • Session Log Interface
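The incremental aggregation topic listed above can be sketched in plain Python: instead of re-aggregating all history on every run, the previous run's aggregates are kept (PowerCenter keeps them in a cache) and only the newly loaded rows are folded in. Names and values below are illustrative:

```python
# Aggregate persisted from earlier runs (the "incremental aggregation cache").
running_totals = {"NORTH": 100.0}

# Today's incremental load: only the new rows, not the full history.
new_rows = [("NORTH", 20.0), ("SOUTH", 50.0)]

# Fold the new rows into the existing aggregates.
for region, amount in new_rows:
    running_totals[region] = running_totals.get(region, 0.0) + amount

print(running_totals)  # {'NORTH': 120.0, 'SOUTH': 50.0}
```

The saving comes from never re-reading historical rows: each run's cost is proportional to the size of the new load, not the size of the warehouse.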

Command Reference:

  • Using Command Line Programs
  1. Infacmd
  2. Infasetup
  3. Pmcmd
  4. pmrep

Designer Module

  • Using the Designer
  1. Configuring Designer Options
  2. Using Toolbars
  3. Navigating the Workspace
  4. Designer Tasks
  5. Viewing Mapping and Mapplet Reports
  • Working with Sources
  1. Working with Relational Sources
  2. Working with COBOL Sources
  3. Working with COBOL Source Files
  • Working with Flat Files
  1. Importing Flat Files
  2. Editing Flat File Definitions
  3. Formatting Flat File Columns
  • Working with Targets
  1. Importing Target Definitions
  2. Creating Target Definition from Source Definition
  3. Creating Target Definition from Transformations
  4. Creating Target tables
  • Mappings
  1. Working with Mappings
  2. Connecting Mapping Objects
  3. Linking Ports
  4. Propagating Port Attributes
  5. Working with Targets in a Mapping
  6. Working with Relational Targets in a Mapping
  7. Validating a Mapping
  8. Using Workflow Generation Wizard
  • Mapplets
  1. Understanding Mapplets Input and Output
  2. Using Mapplet Designer
  3. Using Mapplets in Mapping
  • Mapping Parameters and Variables
  • Working with User-Defined Functions
  • Using the Debugger
  1. Creating Breakpoints
  2. Configuring the Debugger
  3. Monitoring the Debugger
  4. Evaluating Expression
  • Creating Cubes and Dimensions
  • Using Mapping Wizard
  • Naming Conventions

Performance Tuning Module

Performance Tuning Overview

  • Bottlenecks
  1. Using Thread Statistics
  2. Target Bottlenecks
  3. Source Bottlenecks
  4. Mapping Bottlenecks
  5. Session Bottlenecks
  6. System Bottlenecks
  • Optimizing the Targets
  • Optimizing the Source
  • Optimizing the Mapping
  • Optimizing the Transformations
  • Optimizing the Sessions
  • Optimizing the PowerCenter Components
  • Optimizing the System
  • Using Pipeline Partitions
  • Performance Counters

Repository Module

  • Understanding the Repository
  • Using Repository Manager
  • Folders
  • Managing Object Permissions
  • Working with Versioned Objects
  • Exporting and Importing Objects
  • Copying Objects

Transformation Module

  • Working with Transformations
  1. Configuring Transformations
  2. Working with Ports
  3. Working with Expressions
  4. Reusable Transformations
  • Aggregator Transformation
  • Custom Transformation
  • Expression Transformation
  • External Transformation
  • Filter Transformation
  • Joiner Transformation
  • Java Transformation
  • Lookup Transformation
  • Lookup Caches
  • Normalizer Transformation
  • Rank Transformation
  • Router Transformation
  • Sequence Generator Transformation
  • Sorter Transformation
  • Source Qualifier Transformation
  • SQL Transformation
  • Stored Procedure Transformation
  • Transaction Control Transformation
  • Union Transformation
  • Update Strategy Transformation
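As one concrete example from the transformations above, a cached Lookup transformation reads the lookup table once into memory and then enriches each source row with a fast in-memory probe instead of a per-row database query. A hedged Python sketch, with made-up table and column names:

```python
# Lookup source: (dept_id, dept_name) pairs, read once into a cache.
lookup_table = [(10, "Electronics"), (20, "Clothing")]
lookup_cache = {dept_id: name for dept_id, name in lookup_table}

# Incoming source rows to be enriched.
source_rows = [
    {"sku": "A1", "dept_id": 10},
    {"sku": "B2", "dept_id": 99},  # no match in the lookup
]

# Probe the cache for each row; a default handles lookup misses.
enriched = [
    {**row, "dept_name": lookup_cache.get(row["dept_id"], "UNKNOWN")}
    for row in source_rows
]
print(enriched[0]["dept_name"])  # Electronics
```

The trade-off is the one the Lookup Caches topic covers: memory for the cache versus one round trip to the lookup source per row.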

Transformation Language Reference:

  • The Transformation Language
  • Constants
  • Operators
  • Variables
  • Dates
  • Functions
  • Creating Custom Function

Workflow Basics Module

  • Workflow Manager
  1. Workflow Manager Options
  2. Navigating the Workspace
  3. Working with Repository Objects
  4. Copying Repository Objects
  5. Comparing Repository Objects
  • Workflow and Worklets
  1. Creating a Workflow
  2. Using Workflow Wizard
  3. Assigning an Integration Service
  4. Working with Worklets
  5. Working with Links
  • Sessions
  1. Creating a Session Task
  2. Editing a Session
  3. Pre- and Post-Session Commands
  • Session Configuration Objects
  • Tasks
  1. Creating a Task
  2. Configuring Tasks
  3. Working with Command Task
  4. Working with Decision Task
  5. Working with Event Task
  6. Working with Timer Task
  7. Working with Assignment Task
  • Sources
  1. Configuring Sources in a Session
  2. Working with Relational Sources
  3. Working with File Sources
  • Targets
  1. Configuring Targets in a Session
  2. Working with Relational Targets
  3. Working with File Targets
  4. Reject Files
  • Validation
  1. Validating Tasks
  2. Validating Worklets
  3. Validating Sessions
  4. Validating Workflows
  • Scheduling and Running Workflows
  1. Scheduling a Workflow
  2. Manually Starting a Workflow
  • Sending Email
  1. Working with Email Tasks
  2. Working with Post-Session Email
  • Workflow Monitor
  1. Using Workflow Monitor
  2. Customizing Workflow Monitor Options
  3. Working with Tasks and Workflows
  4. Using Gantt Chart View and Task View
  • Workflow Monitor Details
  1. Integration Services Properties
  2. Workflow Run Properties
  3. Worklet Run Properties
  4. Session Task Run Properties
  5. Performance Details
  • Session and Workflow Logs
  1. Log Events
  2. Log Events Window
  3. Working with Log Files
  4. Workflow Logs

Note: Lab sessions for all the points mentioned above will be taken.



  • Evolution of Datawarehousing – History
  • The Need for Datawarehousing
  • Why Datawarehousing
  • What is Datawarehousing – The Definition
  1. Subject-Oriented
  2. Integrated
  3. Non-Volatile
  4. Time-Variant
  • Datawarehousing Architecture
  1. Data Source Layer
  2. Data Extraction Layer
  3. Staging Layer
  4. ETL Layer
  5. Data Storage Layer
  6. Data Logic Layer
  7. Data Presentation Layer
  8. Metadata Layer
  9. System Operation Layer
  • Dimension table
  • Fact table
  1. Additive Facts
  2. Semi-Additive Facts
  3. Non-Additive Facts
  4. Cumulative
  5. Snapshot
  • Attribute
  • Hierarchy
  • Types of Schema
  1. Star Schema
  2. Snowflake Schema
  3. Fact Constellation Schema
  • Slowly Changing Dimensions (SCD)
  1. SCD1 – Advantages/ Disadvantages
  2. SCD2 – Advantages/ Disadvantages
  3. SCD3 – Advantages/ Disadvantages
  • OLAP and OLTP
  1. Difference between OLAP and OLTP
  2. Types Of OLAP
  3. Multi-Dimensional (MOLAP)
  4. Relational (ROLAP)
  5. Hybrid (HOLAP)
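As one concrete example from the topics above, a Slowly Changing Dimension Type 2 update expires the current dimension row and inserts a new version, preserving history. A hedged Python sketch; the column names, customer data, and dates are illustrative, not from any specific tool:

```python
from datetime import date

# Dimension table with one current row per customer (SCD Type 2 layout).
dim_customer = [
    {"cust_id": 7, "city": "Pune", "valid_from": date(2020, 1, 1),
     "valid_to": None, "current": True},
]

def scd2_update(dim, cust_id, new_city, as_of):
    """Apply an SCD Type 2 change: expire the old row, insert a new version."""
    for row in dim:
        if row["cust_id"] == cust_id and row["current"]:
            if row["city"] == new_city:
                return  # attribute unchanged: nothing to do
            row["valid_to"] = as_of   # close out the old version
            row["current"] = False
    # Insert the new current version of the dimension row.
    dim.append({"cust_id": cust_id, "city": new_city,
                "valid_from": as_of, "valid_to": None, "current": True})

scd2_update(dim_customer, 7, "Mumbai", date(2024, 6, 1))
print(len(dim_customer))  # 2
```

This is exactly the SCD2 trade-off listed above: full history is preserved (the advantage) at the cost of a growing dimension table and more complex queries (the disadvantage). SCD1 would simply overwrite `city`; SCD3 would keep one extra "previous value" column.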

Related Combo Programs:

Oracle SQL +  Informatica + Power BI

Oracle SQL + Informatica + Tableau


Informatica Developer Certification

Informatica Administration Certification

Informatica Velocity Certification

B2B Data Exchange Certification

PowerCenter Data Integration Certification

Data Quality Certification

Data Security Certification

Master Data Management Certification

Informatica Big Data Certification

The trainer has 11 years of industry experience and 7 years of teaching experience on Informatica versions 8.x to 10.x. He has worked with different MNCs inside and outside India, has strong domain knowledge, and specializes in training students through project-based learning.

  • Training by a real-time trainer with 11+ years of experience
  • A pool of 80+ real-time practical sessions on Informatica 10.x
  • Scenarios and assignments to make sure you compete with current industry standards
  • World-class training methods
  • Training until the candidate gets placed
  • Certification and placement support until you get certified and placed
  • All training at a reasonable cost
  • 10,000+ satisfied candidates
  • 5,000+ placement records
  • Corporate and online training at a reasonable cost
  • Complete end-to-end project with each course
  • World-class lab facility with i3/i5/i7 servers and Cisco UCS servers
  • Covers topics beyond the books that the IT industry requires
  • Resume and interview preparation with 100% hands-on practical sessions
  • Doubt-clearing sessions any time after the course
  • Happy to help you any time after the course

After Informatica, you can consider learning BI and visualisation tools such as MSBI, Power BI, Tableau, QlikView, Qlik Sense, etc.

Those with an interest in analytics can also take up Big Data Analytics programs.

The certifications listed above will add value to your profile.

