It is a common myth that you cannot learn Hadoop without knowing Java. In truth, only the MapReduce framework requires Java; every other component builds on familiar ideas: Hive is similar to SQL, HBase is similar to an RDBMS, and Pig is script based.
Only MapReduce requires Java, yet many organizations now hire for specific skill sets, such as HBase developers or Pig- and Hive-specific roles. Knowing MapReduce as well makes you an all-rounder in Hadoop, ready for any requirement.
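In fact, even MapReduce jobs can be written without Java via Hadoop Streaming, which pipes text through any executable. The sketch below shows the word-count idea in plain Python; the file names and cluster invocation mentioned in the comments are illustrative, not part of the course material:

```python
def mapper(lines):
    # Map step: emit (word, 1) pairs; on a real cluster this would be a
    # standalone mapper.py fed line-by-line by hadoop-streaming.jar
    for line in lines:
        for word in line.strip().split():
            yield (word.lower(), 1)

def reducer(pairs):
    # Reduce step: sum the counts per word (a real Streaming reducer reads
    # key-sorted lines from stdin, but a dict works for this local sketch)
    counts = {}
    for word, n in pairs:
        counts[word] = counts.get(word, 0) + n
    return counts

sample = ["Hello Hadoop", "hello world"]
print(reducer(mapper(sample)))  # {'hello': 2, 'hadoop': 1, 'world': 1}
```

On a cluster the same two functions would live in separate scripts passed to Hadoop Streaming as the mapper and reducer commands.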
Solution for the Big Data Problem
Open Source Technology
Based on open source platforms
Contains several tools for an entire ETL data-processing framework
It can process distributed data, with no need to store the entire dataset in centralized storage as SQL-based tools require.
Training Syllabus
Data management – Industry Challenges
Overview of Big Data
Characteristics of Big Data
Types of data
Sources of Big Data
Big Data examples
What is streaming data?
Batch vs Streaming data processing
Overview of Analytics
Big data Hadoop opportunities
Why we need Hadoop
Data centres and Hadoop Cluster overview
Overview of Hadoop Daemons
Hadoop Cluster and Racks
Linux essentials required for Hadoop
Hadoop ecosystem tools overview
Understanding Hadoop configuration and installation
Structured and semi-structured data processing in Pig
Pig vs Hive Use case
Sqoop practical implementation
Importing data to HDFS
Importing data to Hive
Exporting data to RDBMS
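As a sketch of what these Sqoop topics cover, typical invocations look like the following; the JDBC URL, user, table and directory names are placeholders, not values from the course material:

```shell
# Import an RDBMS table into HDFS
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username dbuser -P \
  --table orders \
  --target-dir /user/hadoop/orders

# Import the same table straight into a Hive table
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username dbuser -P \
  --table orders \
  --hive-import --hive-table sales.orders

# Export processed results from HDFS back to the RDBMS
sqoop export \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username dbuser -P \
  --table order_summary \
  --export-dir /user/hadoop/order_summary
```

These commands assume a running Hadoop cluster with the JDBC driver for the source database on Sqoop's classpath.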
Configuration of Source, Channel and Sink
Fan-out flume agents
How to load data into Hadoop from a web server or other storage
How to load streaming Twitter data into HDFS using Flume
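A minimal single-agent Flume configuration tying these pieces together might look like the sketch below; the agent name, log path and HDFS path are illustrative:

```
# Flume agent "a1": tail a web-server log (source), buffer in memory (channel),
# write to HDFS (sink)
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /var/log/httpd/access_log
a1.sources.r1.channels = c1

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = /flume/weblogs/%Y-%m-%d
a1.sinks.k1.channel = c1
```

Fan-out is configured by listing several channels on one source together with a replicating or multiplexing channel selector, so each event is copied to (or routed between) multiple sinks.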
Action Node and Control Flow node
Designing workflow jobs
How to schedule jobs using Oozie
How to schedule time-based jobs
Oozie configuration files
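The relationship between action nodes and control-flow nodes can be sketched with a minimal workflow definition; the workflow name, action and paths below are illustrative placeholders:

```xml
<!-- Minimal Oozie workflow: one action node plus the control-flow
     nodes start, kill and end. Names and parameters are illustrative. -->
<workflow-app name="demo-wf" xmlns="uri:oozie:workflow:0.5">
  <start to="import-step"/>
  <action name="import-step">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
      <job-tracker>${jobTracker}</job-tracker>
      <name-node>${nameNode}</name-node>
      <command>import --connect ${jdbcUrl} --table orders --target-dir ${targetDir}</command>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
  </action>
  <kill name="fail">
    <message>Import failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
  </kill>
  <end name="end"/>
</workflow-app>
```

Time-based scheduling is handled by a separate coordinator definition, whose frequency attribute controls how often a run of the workflow is materialized.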
Syntax, data types and variables
Classes and Objects
Basic Types and Operations
Built-in Control Structures
Functions and Closures
Composition and Inheritance
Packages and Imports
Working with Lists, Collections
Implicit Conversions and Parameters
For Expressions Revisited
The Scala Collections API
Modular Programming Using Objects
Architecture and Spark APIs
Significance of Spark context
Concept of Resilient distributed datasets (RDDs)
Properties of RDD
Transformations in RDD
Actions in RDD
Saving data through RDD
Key-value pair RDD
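The key distinction in the topics above, transformations are lazy and return a new RDD while actions trigger computation and return a value, can be sketched with a plain-Python analogy. This is an illustration only (no Spark installation assumed); real RDDs add partitioning, lineage and fault tolerance on top of this idea:

```python
class ToyRDD:
    # A toy stand-in for an RDD: transformations build up a pending
    # pipeline of operations; only an action forces evaluation.
    def __init__(self, data, ops=None):
        self._data = data
        self._ops = ops or []          # pending transformations, not yet run

    def map(self, f):                  # transformation: lazy, returns a new ToyRDD
        return ToyRDD(self._data, self._ops + [("map", f)])

    def filter(self, p):               # transformation: lazy
        return ToyRDD(self._data, self._ops + [("filter", p)])

    def collect(self):                 # action: evaluates the whole pipeline
        out = self._data
        for kind, f in self._ops:
            out = [f(x) for x in out] if kind == "map" else [x for x in out if f(x)]
        return out

    def count(self):                   # action
        return len(self.collect())

rdd = ToyRDD([1, 2, 3, 4, 5]).map(lambda x: x * x).filter(lambda x: x % 2 == 1)
print(rdd.collect())  # [1, 9, 25]
print(rdd.count())    # 3
```

Note that nothing is computed when map and filter are called; both actions replay the same recorded pipeline, which is the intuition behind RDD lineage.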
Invoking Spark shell
Loading a file in shell
Performing some basic operations on files in Spark shell
Spark application overview
Job scheduling process
RDD graph and lineage
Life cycle of spark application
How to choose between the different persistence levels for caching RDDs
Submit in cluster mode
Web UI – application monitoring
Important spark configuration properties
Spark SQL overview
Spark SQL demo
SchemaRDD and DataFrames
Joining, Filtering and Sorting Dataset
Spark SQL example program demo and code walk through
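A typical Spark SQL flow looks like the PySpark sketch below; it requires a Spark installation to run, and the file, column and table names are illustrative:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("demo").getOrCreate()

# Load a dataset into a DataFrame (SchemaRDD was the pre-1.3 predecessor
# of the DataFrame API)
df = spark.read.json("orders.json")

# Joining, filtering and sorting through the DataFrame API
big = df.filter(df.amount > 100).orderBy(df.amount.desc())

# Or register the DataFrame as a temporary view and use plain SQL
df.createOrReplaceTempView("orders")
spark.sql("SELECT customer, SUM(amount) AS total "
          "FROM orders GROUP BY customer ORDER BY total DESC").show()
```

Both styles compile to the same execution plan, so the choice between the DataFrame API and SQL strings is largely a matter of taste.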
Introduction to Kafka (Optional*)
What is Kafka
Integration with spark
This training program contains multiple POCs, exercises and assignments on each topic, plus two real-time projects with problem statements and data sets
This training is conducted in workshop mode, fully hands-on in class
I completed my Hadoop course at Radical. My trainer's teaching is probably the best training one could get. You don't just learn; you also get experience-level training here.
Available certifications under Cloudera and Hortonworks
Depending on the student's requirements and the topics covered in the curriculum, we can prepare candidates for the exam
Cloudera Certified Associate (CCA)
CCA Spark and Hadoop Developer
CCA Data Analyst
Cloudera Certified Professional (CCP)
CCP Data Engineer
HDP CERTIFIED DEVELOPER
for Hadoop developers using frameworks like Pig, Hive, Sqoop and Flume.
HDP CERTIFIED APACHE SPARK DEVELOPER
for developers responsible for developing Spark Core and Spark SQL applications in Scala or Python.
HDP CERTIFIED JAVA DEVELOPER
for developers who design, develop and architect Hadoop-based solutions written in the Java programming language.
HDP CERTIFIED ADMINISTRATOR
Hortonworks certification for administrators who deploy and manage Hadoop clusters.
HORTONWORKS CERTIFIED ASSOCIATE
for candidates at the entry point, covering the fundamental skills required to progress to the higher levels of the Hortonworks big data certification program.
Hadoop Certifications: Radical is accredited with Pearson VUE, Kriterion and others. We conduct exams every month and have a 100% passing record for all students who completed the course at Radical Technologies. The most in-demand Hadoop exams are the Hortonworks and Cloudera certifications.
Exam Preparation: After the course we provide a free exam-preparation session for all our candidates, which guides them towards passing the respective Hadoop exam modules.
The trainer has 17 years of experience in IT, including 10 years in data warehousing and ETL. For the past six years he has been working extensively with Big Data ecosystem toolsets for banking, retail and manufacturing clients. He is a certified HDP Spark Developer and a Cloudera certified HBase specialist. He has also conducted corporate sessions and seminars both in India and abroad, and was recently engaged by Pune University for a 40-hour session on Big Data analytics for its senior professors.
All faculties at our organization currently work on these technologies in reputed organizations. The curriculum that is imparted is not just theory or a talk with some PPTs. We frame the forum so that the lessons are delivered in easy language and the content is well absorbed by the candidates. The sessions are backed by hands-on assignments, and because the faculties have industry experience, they showcase practical stories throughout the course.
How we are different from others: we cover each topic with real-time examples, and the course includes 8 real-time projects and more than 72 assignments divided into basic, intermediate and advanced levels. The trainer comes from industry, with 9 years of experience in DWH, and works as a BI and Hadoop consultant with 3+ years in real-time Big Data and Hadoop implementations and migrations.
This is completely hands-on training, covering 90% practical and 10% theory. Here at Radical Technologies we cover all prerequisites, such as Java and SQL, required to learn Hadoop developer and analytical skills. This way we can accommodate technology newcomers and technical experts in the same session, and by the end of the training they gain the confidence that they have been up-skilled to a different level.
8 domain-based projects with real-time data (two projects per trainer; if you require more projects, you are free to attend other trainers' project-orientation sessions)
25 real-time scenarios on a 16-node cluster (AWS cloud setup)