Sqoop and Flume





This is a detailed course on data ingestion in Hadoop, the very first step toward analyzing Big Data. You will learn how to use Hadoop's file system commands to move data into and out of HDFS. We will then move ahead and see how to transfer structured data between an RDBMS and HDFS, covering many use cases, with a live demo of each and every step. Finally, we will take one more step and ingest log data using Apache Flume: we will configure and build a data ingestion pipeline that fetches data from a social network such as Twitter and loads it into HDFS.
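For the Flume pipeline described above, a configuration along these general lines is what such a Twitter-to-HDFS agent looks like. The agent and component names, keywords, and HDFS path are placeholder choices for this sketch, and the OAuth keys must be your own:

```properties
# Hypothetical Flume agent: Twitter source -> memory channel -> HDFS sink.
TwitterAgent.sources = Twitter
TwitterAgent.channels = MemChannel
TwitterAgent.sinks = HDFS

TwitterAgent.sources.Twitter.type = org.apache.flume.source.twitter.TwitterSource
TwitterAgent.sources.Twitter.channels = MemChannel
TwitterAgent.sources.Twitter.consumerKey = <your consumer key>
TwitterAgent.sources.Twitter.consumerSecret = <your consumer secret>
TwitterAgent.sources.Twitter.accessToken = <your access token>
TwitterAgent.sources.Twitter.accessTokenSecret = <your token secret>
TwitterAgent.sources.Twitter.keywords = hadoop, bigdata

TwitterAgent.sinks.HDFS.type = hdfs
TwitterAgent.sinks.HDFS.channel = MemChannel
TwitterAgent.sinks.HDFS.hdfs.path = hdfs://localhost:8020/user/flume/tweets/
TwitterAgent.sinks.HDFS.hdfs.fileType = DataStream

TwitterAgent.channels.MemChannel.type = memory
TwitterAgent.channels.MemChannel.capacity = 10000
```

An agent configured this way is started with something like `flume-ng agent --conf conf --conf-file twitter.conf --name TwitterAgent`; the course builds and runs the pipeline step by step.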
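To give a flavour of the first two ingestion paths described above, here is a minimal sketch. All hostnames, database, table, and path names below are placeholders invented for illustration, and a running Hadoop cluster with Sqoop installed is assumed:

```shell
# Sketch of the two ingestion paths the course demonstrates.
# Paths, hostnames, and table names are placeholders; a running
# Hadoop cluster (and Sqoop) is assumed.

# 1. Plain HDFS file system commands: copy a local file into HDFS.
hadoop fs -mkdir -p /user/demo/raw
hadoop fs -put access.log /user/demo/raw/
hadoop fs -ls /user/demo/raw

# 2. Sqoop: pull a structured table from an RDBMS into HDFS.
sqoop import \
  --connect jdbc:mysql://dbhost/shop \
  --username dbuser -P \
  --table orders \
  --target-dir /user/demo/orders
```

The course walks through each of these steps live, so treat the above only as a preview of the shape of the commands.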

During the course I will be coding in front of you and explaining everything as I go, so things will become much clearer.

You are not required to know Hadoop to join: the course is made with the assumption that students have no prior knowledge of Hadoop.

A little familiarity with SQL commands is good to have, but again it is not a requirement, as I will give you the commands as and when needed in the module dedicated to Sqoop. If SQL is your concern, nothing more than CREATE TABLE and SELECT * FROM a table is required for this course.
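Those two statements really are the whole of it. Here is what they look like, run through sqlite3 purely so you can try them locally; the table name and rows are invented for this sketch, and in the course itself the same SQL runs against MySQL:

```shell
# Create a table, add two rows, and read everything back.
# sqlite3 stands in for MySQL here; the SQL statements are the same.
sqlite3 /tmp/demo.db <<'SQL'
CREATE TABLE employees (id INTEGER, name TEXT);
INSERT INTO employees VALUES (1, 'alice'), (2, 'bob');
SELECT * FROM employees;
SQL
```

The SELECT prints each row pipe-separated (`1|alice`, `2|bob`), which is sqlite3's default output format.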

Java is not required, as there will not be any Java (or any other programming language) coding in this course.

Data Ingestion in Hadoop


What you will learn
  • Know about Hadoop, why it is needed, and its ecosystem.
  • Know the Unix/Linux commands used in the day-to-day work of a Hadoop professional.
  • Know HDFS file system commands in detail.
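A few of the everyday Unix/Linux commands in question, sketched here with an invented log file so the examples are self-contained:

```shell
# Everyday Unix commands a Hadoop professional leans on.
# The log file and its contents are invented for this sketch.
mkdir -p /tmp/unix_demo
printf '2021-01-01 INFO started\n2021-01-01 ERROR disk full\n' > /tmp/unix_demo/app.log

grep -c ERROR /tmp/unix_demo/app.log   # count matching lines -> 1
tail -n 1 /tmp/unix_demo/app.log       # print the last line of the log
wc -l /tmp/unix_demo/app.log           # total line count -> 2
```

The same habits carry over directly to inspecting data before and after moving it into HDFS.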

Rating: 3.75

Level: All Levels

Duration: 2.5 hours

Instructor: Gaurav Vyas





© 2021 hugecourses.com. All rights reserved.