Best Informatica 10.x Training Classes in Pune
Duration of Training: 40 hrs
Number of Training Sessions: 20
Batch Type: Weekdays / Weekends
Mode of Training: Classroom / Online / Corporate Training
Data warehouses are widely used within the largest and most complex businesses in the world. Usage within moderately large organisations, even those with more than 1,000 employees, remains surprisingly low at the moment. We are confident that use of this technology will grow dramatically in the next few years.
ETL is one of the main processes in data warehousing. ETL stands for Extract, Transform and Load: data is extracted from source systems, transformed, and loaded into the data warehouse. Informatica is an ETL tool; it is very flexible and cheaper compared to other ETL tools.
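To make the Extract, Transform, Load idea concrete, here is a minimal sketch in plain Python (not Informatica itself): the file name, table name, and transformation rules are all hypothetical, chosen only to illustrate the three stages.

```python
import csv
import sqlite3

# Hypothetical ETL sketch: extract rows from a CSV source,
# transform them (reject rows without a key, clean up names),
# and load them into a warehouse table. Names are illustrative.

def extract(csv_path):
    """Extract: read all rows from the source flat file."""
    with open(csv_path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: drop keyless rows, standardise the name column."""
    out = []
    for row in rows:
        if not row["id"]:          # reject records without a key
            continue
        row["name"] = row["name"].strip().upper()
        out.append(row)
    return out

def load(rows, conn):
    """Load: write the cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS dim_customer (id INTEGER, name TEXT)"
    )
    conn.executemany(
        "INSERT INTO dim_customer (id, name) VALUES (:id, :name)", rows
    )
    conn.commit()
```

In Informatica the same three stages are designed visually as a mapping rather than hand-coded, with the server handling the data movement.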
Today, the following IT companies use Informatica as their ETL tool:
1) IBM
2) Accenture
3) Amdocs
4) CTS
5) HSBC
And Many more
Introduction to Informatica
Informatica is a tool supporting all the steps of the Extract, Transform and Load process. Nowadays Informatica is also used as an integration tool. It is easy to use, with a simple visual interface similar to forms in Visual Basic. You just drag and drop different objects (known as transformations) and design the process flow for data extraction, transformation and load. These process-flow diagrams are known as mappings. Once a mapping is made, it can be scheduled to run as and when required. In the background, the Informatica server takes care of fetching data from the source, transforming it, and loading it into the target systems/databases. Informatica can communicate with all major data sources (mainframe, RDBMS, flat files, XML, VSAM, SAP, etc.) and can move and transform data between them. It can move huge volumes of data very efficiently, often outperforming even bespoke programs written for one specific data movement. It can throttle transactions (do big updates in small chunks to avoid long locks and a full transaction log). It can join data from two distinct data sources (even an XML file can be joined with a relational table). In all, Informatica can effectively integrate heterogeneous data sources and convert raw data into useful information.
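The idea of joining two distinct data sources, which Informatica handles with a Joiner transformation, can be sketched roughly in plain Python. This is an illustration only, not Informatica's implementation; the file layout and table schema are hypothetical.

```python
import csv
import sqlite3

# Rough sketch of a heterogeneous join: match rows from a flat file
# (order records) against a relational table (customers), the way a
# Joiner transformation would. Column names are illustrative.

def join_file_with_table(csv_path, conn):
    """Inner-join CSV order rows with the customer table on customer id."""
    # Build a lookup from the relational side.
    customers = {cid: name for cid, name in
                 conn.execute("SELECT id, name FROM customer")}
    joined = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            cid = int(row["customer_id"])
            if cid in customers:              # keep matched rows only
                joined.append((customers[cid], float(row["amount"])))
    return joined
```

In Informatica the equivalent logic is configured visually, and the tool also handles sources that SQL alone cannot join, such as XML files against relational tables.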
Informatica is not only an ETL tool but also a data integrator. It is one of the best tools in the current industry, providing versatile solutions for any kind of data. At Radical we provide scenario-based training with real-time examples, project explanation, and answers to project-related queries.
Anyone with a basic knowledge of SQL queries can start learning the Informatica tool.
As it is a GUI-based tool, coding knowledge is not necessary. At least 1-3 years of experience is good enough to get interview calls.
SYLLABUS
Administrator Module
- Nodes
- Application Services
- Using Administration Console
- Managing the Domain
- Managing Alerts
- Managing Folders
- Managing Permissions
- Managing Application Services
- Managing the Nodes
- Managing Users and Groups
- Managing Privileges and Roles
- Domain Privileges
- Repository Services Privileges
- Reporting Service Privileges
- Managing Roles – Assigning Privileges and Roles to Users and Groups
- Creating and Configuring the Repository Services
- Managing the Repository
- Creating and Configuring Integration Services
- Enabling and Disabling the Integration Services
- Running in Normal and Safe Mode
- Configuring the Integration Services Processes
- Integration Services Architecture
- Creating the Reporting Services
- Managing the Reporting Services
- Configuring the Reporting Services
Advanced Workflow Module
- Understanding Pipeline Partitioning
- Partitioning Attributes
- Dynamic Partitioning
- Partitioning Rules
- Configuring Partitioning
- Adding and Deleting Partition Points
- Partitioning Relational Sources
- Partitioning File Targets
- Partitioning transformations
- Partitioning Types
- Real Time Processing
- Commit Points
- Workflow Recovery
- Stopping and Aborting
- Error Handling
- Stopping and Aborting Workflows
- Concurrent Workflows
- Load Balancer
- Workflow Variables
- Predefined Workflow Variables
- User- Defined Workflow Variables
- Using Worklet Variables
- Assigning Variable Values in a Worklet
- Parameter and variables in Sessions
- Working with Session Parameters
- Assigning Parameter and Variables in a Session
- Parameter File
- Session Caches
- Incremental Aggregation
- Session Log Interface
Command Reference:
- Using Command Line Programs
- infacmd
- infasetup
- pmcmd
- pmrep
Designer Module
- Configuring Designer Options
- Using Toolbars
- Navigating the Workspace
- Designer Tasks
- Viewing Mapping and Mapplet Reports
- Working with Relational Sources
- Working with COBOL Sources
- Working with COBOL Source Files
- Importing Flat Files
- Editing Flat File Definitions
- Formatting Flat File Columns
- Importing Target Definition
- Creating Target Definition from Source Definition
- Creating Target Definition from Transformations
- Creating Target tables
- Working with Mappings
- Connecting Mapping Objects
- Linking Ports
- Propagating Port Attributes
- Working with Targets in a Mapping
- Working with Relational Targets in a Mapping
- Validating a Mapping
- Using Workflow Generation Wizard
- Understanding Mapplets Input and Output
- Using Mapplet Designer
- Using Mapplets in Mapping
- Mapping Parameters and Variables
- Working with User-Defined Functions
- Using the Debugger
- Creating Breakpoints
- Configuring the Debugger
- Monitoring the Debugger
- Evaluating Expression
- Creating Cubes and Dimensions
- Using Mapping Wizard
- Naming Conventions
Performance Tuning Module
Performance Tuning Overview
- Using Thread Statistics
- Target Bottlenecks
- Source Bottlenecks
- Mapping Bottlenecks
- Session Bottlenecks
- System Bottlenecks
- Optimizing the Targets
- Optimizing the Source
- Optimizing the Mapping
- Optimizing the Transformations
- Optimizing the Sessions
- Optimizing the PowerCenter Components
- Optimizing the System
- Using Pipeline Partitions
- Performance Counters
Repository Module
- Understanding the Repository
- Using Repository Manager
- Folders
- Managing Object Permissions
- Working with Versioned Objects
- Exporting and Importing Objects
- Copying Objects
Transformation Module
- Working with Transformations
- Configuring Transformations
- Working with Ports
- Working with Expressions
- Reusable Transformations
- Aggregator Transformation
- Custom Transformation
- Expression Transformation
- External Procedure Transformation
- Filter Transformation
- Joiner Transformation
- Java Transformation
- Lookup Transformation
- Lookup Caches
- Normalizer Transformation
- Rank Transformation
- Router Transformation
- Sequence Generator Transformation
- Sorter Transformation
- Source Qualifier Transformation
- SQL Transformation
- Stored Procedure Transformation
- Transaction Control Transformation
- Union Transformation
- Update Strategy Transformation
Transformation Language Reference:
- The Transformation Language
- Constants
- Operators
- Variables
- Dates
- Functions
- Creating Custom Function
Workflow Basics Module
- Workflow Manager Options
- Navigating the Workspace
- Working with Repository Objects
- Copying Repository Objects
- Comparing Repository Objects
- Creating a Workflow
- Using Workflow Wizard
- Assigning an Integration Service
- Working with Worklets
- Working with Links
- Creating a Session Task
- Editing a Session
- Pre- and Post- Session Commands
- Session Configuration Objects
- Tasks
- Creating a Task
- Configuring Tasks
- Working with Command Task
- Working with Decision Task
- Working with Event Task
- Working with Timer Task
- Working with Assignment Task
- Configuring Sources in a Session
- Working with Relational Sources
- Working with Flat File Sources
- Configuring Targets in a Session
- Working with Relational Targets
- Working with File Targets
- Reject Files
- Validating Tasks
- Validating Worklets
- Validating Session
- Validating Workflows
- Scheduling and Running Workflows
- Scheduling a Workflow
- Manually Starting a Workflow
- Working with Email Tasks
- Working with Post-Session Email
- Using Workflow Monitor
- Customizing Workflow Monitor Options
- Working with Tasks and Workflows
- Using Gantt Chart View and Task View
- Integration Services Properties
- Workflow Run Properties
- Worklet Run Properties
- Session Task Run Properties
- Performance Details
- Session and Workflow Logs
- Log Events
- Log Events Window
- Working with Log Files
- Workflow Logs
Note: Lab sessions will be conducted for all the points mentioned above.
DATAWAREHOUSING SYLLABUS
- Evolution of Datawarehousing – History
- The Need for Datawarehousing
- Why Datawarehousing
- What is Datawarehousing – The Definition
- Subject-Oriented
- Integrated
- Non-Volatile
- Time-Variant
- Datawarehousing Architecture
- Data Source Layer
- Data Extraction Layer
- Staging Layer
- ETL Layer
- Data Storage Layer
- Data Logic Layer
- Data Presentation Layer
- Metadata Layer
- System Operation Layer
- Dimension table
- Fact table
- Additive Facts
- Semi-Additive Facts
- Non-Additive Facts
- Cumulative
- Snapshot
- Attribute
- Hierarchy
- Types of Schema
- Star Schema
- Snowflake Schema
- Fact Constellation Schema
- SCD1 – Advantages/Disadvantages
- SCD2 – Advantages/Disadvantages
- SCD3 – Advantages/Disadvantages
- Difference between OLAP and OLTP
- Types Of OLAP
- Multi-Dimensional (MOLAP)
- Relational (ROLAP)
- Hybrid (HOLAP)
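Among the SCD topics listed above, Type 2 is the one that preserves history: when a dimension attribute changes, the current row is expired and a new current row is inserted. A minimal sketch of that pattern, with a hypothetical `dim_customer` layout (not from any specific course material):

```python
import sqlite3

# Minimal SCD Type 2 sketch: on an attribute change, expire the
# current dimension row and insert a new current version, so the
# full history is preserved. Table layout is illustrative only.

def apply_scd2(conn, cust_id, new_city, today):
    """Apply an SCD Type 2 update for one customer's city attribute."""
    cur = conn.execute(
        "SELECT rowid, city FROM dim_customer "
        "WHERE cust_id = ? AND is_current = 1", (cust_id,)
    ).fetchone()
    if cur and cur[1] == new_city:
        return                                   # no change, nothing to do
    if cur:
        conn.execute(                            # close out the old version
            "UPDATE dim_customer SET end_date = ?, is_current = 0 "
            "WHERE rowid = ?", (today, cur[0])
        )
    conn.execute(                                # insert the new version
        "INSERT INTO dim_customer (cust_id, city, start_date, end_date, is_current) "
        "VALUES (?, ?, ?, NULL, 1)", (cust_id, new_city, today)
    )
    conn.commit()
```

SCD Type 1 would simply overwrite the city (losing history), and Type 3 would keep only the previous value in an extra column; Type 2 trades table growth for a complete audit trail.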
Related Combo Programs:
Oracle SQL + Informatica + Power BI
Oracle SQL + Informatica + Tableau

Informatica Developer Certification
Informatica Administration Certification
Informatica Velocity Certification
B2B Data Exchange Certification
PowerCenter Data Integration Certification
Data Quality Certification
Data Security Certification
Master Data Management Certification
Informatica Big Data Certification
The trainer has 11 years of industry experience and 7 years of experience teaching Informatica versions 8.x to 10.x. He has worked with different MNCs inside and outside India, has strong domain knowledge, and has expertise in training students through project-based learning.
- Training By 11+ Years experienced Real Time Trainer
- A pool of 80+ real time Practical Sessions on Informatica 10.x
- Scenarios and Assignments to make sure you compete with current Industry standards
- World class training methods
- Training until the candidate gets placed
- Certification and Placement Support until you get certified and placed
- All training at a reasonable cost
- 10000+ Satisfied candidates
- 5000+ Placement Records
- Corporate and Online Training at a reasonable cost
- Complete End-to-End Project with Each Course
- World-class lab facility with i3/i5/i7 servers and Cisco UCS servers
- Covers topics beyond the books that are required for the IT industry
- Resume and interview preparation with 100% hands-on practical sessions
- Doubt clearing sessions any time after the course
- Happy to help you any time after the course
After Informatica, you can think about learning BI and visualisation tools such as MSBI, Power BI, Tableau, QlikView, or Qlik Sense.
Those who have an interest in analytics can also learn Big Data analytics programs.
The Informatica certifications listed above will add value to your profile.