This course covers the essentials of deploying and managing an Apache™ Hadoop® cluster. The course is lab-intensive, with each participant building their own Hadoop cluster using either the CDH (Cloudera's Distribution Including Apache Hadoop) or Hortonworks Data Platform (HDP) stack. Core Hadoop services are explored in depth, with an emphasis on troubleshooting and recovering from common cluster failures. The fundamentals of related services such as Ambari, ZooKeeper, Pig, Hive, HBase, Sqoop, Flume, and Oozie are also covered. The course is approximately 60% lecture and 40% labs.
Prerequisites:
Qualified participants should be comfortable with Linux commands and have some systems administration experience; no previous Hadoop experience is required.
Supported Distributions:
Red Hat Enterprise Linux 7
Course Outline:
1. Data Analysis
2. Big Data
3. Origins of Hadoop
4. Hadoop Marketplace
5. Hadoop Core
6. Hadoop Ecosystem:
7. Hadoop Ecosystem (cont):
8. Hadoop Ecosystem (cont):
9. Hadoop Ecosystem (cont):
10. Hadoop Ecosystem (cont):
11. Cluster Architecture
12. Hardware/Software Requirements
13. Running Commands on Multiple Systems
Lab Tasks:
1. Running Commands on Multiple Hosts
2. Preparing to Install Hadoop
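The multi-host pattern from the lab list above can be sketched with nothing more than a POSIX shell and ssh; the host names and sample command below are placeholders, and passwordless ssh between nodes is assumed:

```shell
#!/bin/sh
# Run the same command on every node in the cluster over ssh.
# HOSTS is a placeholder list -- substitute your own node names.
HOSTS="node1 node2 node3"

run_everywhere() {
    cmd="$1"
    for h in $HOSTS; do
        echo "== $h =="
        ssh "$h" "$cmd"
    done
}

# Example: verify Java is present on all nodes before installing Hadoop:
# run_everywhere 'java -version'
```

Tools such as pdsh or Ansible do the same job at scale; the loop above needs only a working shell and ssh keys, which is often all that is available on a freshly kickstarted node.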
2. HDFS
1. Design Goals
2. Design
3. Blocks
4. Block Replication
5. Namenode Daemon
6. Secondary Namenode Daemon
7. Datanode Daemon
8. Accessing HDFS
9. Permissions and Users
10. Adding and Removing Datanodes
11. Balancing
Lab Tasks:
1. Single Node HDFS
2. Multi-node HDFS
3. Files and HDFS
4. Managing and Maintaining HDFS
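The HDFS topics above correspond to a small set of client and admin commands. A sketch of a typical session, assuming the `hdfs` client is on the PATH and using illustrative paths:

```shell
# Create a directory and copy a local file into HDFS
hdfs dfs -mkdir -p /user/student/data
hdfs dfs -put /etc/hosts /user/student/data/
hdfs dfs -ls /user/student/data

# Inspect cluster health: live datanodes, capacity, under-replicated blocks
hdfs dfsadmin -report

# Rebalance block distribution after adding datanodes
# (threshold = maximum % deviation from average datanode utilization)
hdfs balancer -threshold 5
```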
1. YARN Design Goals
2. YARN Architecture
3. Resource Manager
4. Node Manager
5. Containers
6. YARN: Other Important Features
7. Slider
Lab Tasks:
1. YARN
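Day-to-day YARN administration is largely done through the `yarn` CLI. A brief sketch; the application id shown is illustrative:

```shell
# List node managers with their state and resource usage
yarn node -list

# List running applications (id, type, queue, progress)
yarn application -list

# Kill a misbehaving application by id
yarn application -kill application_1700000000000_0001
```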
1. MapReduce
2. Terminology and Data Flow
Lab Tasks:
1. MapReduce
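A MapReduce lab of this kind is commonly driven with the examples jar that ships with Hadoop; the jar path below is an assumption and varies by distribution:

```shell
# Stage some input text in HDFS
hdfs dfs -mkdir -p /user/student/wc-in
hdfs dfs -put /etc/services /user/student/wc-in/

# Run the bundled WordCount example (jar location varies by distribution)
yarn jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar \
    wordcount /user/student/wc-in /user/student/wc-out

# Reducer output lands as part-r-* files in the output directory
hdfs dfs -cat /user/student/wc-out/part-r-00000 | head
```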
1. CDH Uninstall
2. Installing Hadoop with Ambari
3. Tez
6. DATA INGESTION
1. Sqoop
2. Flume
3. Kafka
Lab Tasks:
1. Sqoop
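A Sqoop import like the one practiced in the lab above boils down to a single command; the JDBC URL, credentials, and table name are placeholders:

```shell
# Import one table from a relational database into HDFS.
# -P prompts for the password instead of putting it on the command line.
sqoop import \
    --connect jdbc:mysql://dbhost/sales \
    --username student -P \
    --table orders \
    --target-dir /user/student/orders \
    --num-mappers 4
```

The `--num-mappers` setting controls how many parallel map tasks split the import, which is the main knob for balancing speed against load on the source database.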
1. Falcon
2. Atlas
3. Oozie
1. The Bane of MapReduce
2. Tez Overview
3. Pig
4. Hive
5. Spark
6. Storm
7. Solr
8. Solr (cont)
Lab Tasks:
1. Pig
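A minimal Pig run, as a sketch of the kind of thing the lab exercises; the file names are illustrative, and local mode avoids needing a cluster:

```shell
# Write a tiny Pig Latin script and run it in local mode (no cluster needed)
cat > first.pig <<'EOF'
-- load a local file, one line per record, and print it
lines = LOAD '/etc/hosts' AS (line:chararray);
DUMP lines;
EOF
pig -x local first.pig
```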
1. HBase
2. Phoenix
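HBase is usually first explored from its interactive shell. A sketch of a create/put/scan round trip; the table and column-family names are illustrative:

```shell
# Feed a short session to the HBase shell non-interactively
hbase shell <<'EOF'
create 'weblogs', 'details'
put 'weblogs', 'row1', 'details:url', '/index.html'
scan 'weblogs'
disable 'weblogs'
drop 'weblogs'
EOF
```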
1. Ambari Metrics System (AMS)
2. ZooKeeper