
    Master Big Data - Apache Spark/Hadoop/Sqoop/Hive/Flume

    Posted By: Sigha

    Video: .mp4 (1280x720, 30 fps(r)) | Audio: aac, 44100 Hz, 2ch | Size: 3.9 GB
    Genre: eLearning Video | Duration: 86 lectures (8 hour, 22 mins) | Language: English

An in-depth course on Big Data: Apache Spark, Hadoop, Sqoop, Flume & Apache Hive, plus Big Data cluster setup

    What you'll learn

Hadoop Distributed File System and commands.
Lifecycle of a sqoop command.
Sqoop import command to migrate data from MySQL to HDFS.
Sqoop import command to migrate data from MySQL to Hive.
Working with various file formats, compressions, file delimiters, where clauses and queries while importing the data.
Understand split-by and boundary queries.
Use incremental mode to migrate the data from MySQL to HDFS.
Using sqoop export, migrate data from HDFS to MySQL.
Using sqoop export, migrate data from Hive to MySQL.
Understand Flume architecture.
Using Flume, ingest data from Twitter and save it to HDFS.
Using Flume, ingest data from netcat and save it to HDFS.
Using Flume, ingest data from an exec source and show it on the console.
Flume interceptors.

    Requirements

None.

    Description

In this course, you will start by learning what the Hadoop Distributed File System (HDFS) is, along with the most common Hadoop commands required to work with it.


    Then you will be introduced to Sqoop Import

Understand the lifecycle of a sqoop command.

Use the sqoop import command to migrate data from MySQL to HDFS.

Use the sqoop import command to migrate data from MySQL to Hive.

Use various file formats, compressions, file delimiters, where clauses and queries while importing the data.

    Understand split-by and boundary queries.

Use incremental mode to migrate the data from MySQL to HDFS.
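
The import steps above can be sketched as Sqoop commands. This is a minimal, hypothetical sketch: the host (dbhost), database (retail_db), table (orders), and column (order_id) are placeholders, not names from the course.

```shell
# Plain import from MySQL into HDFS, with a custom delimiter and compression.
# --split-by names the column Sqoop uses to divide rows among the mappers;
# -P prompts for the database password instead of putting it on the command line.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/retail_db \
  --username sqoop_user -P \
  --table orders \
  --target-dir /user/hadoop/orders \
  --split-by order_id \
  --num-mappers 4 \
  --fields-terminated-by ',' \
  --compress --compression-codec org.apache.hadoop.io.compress.SnappyCodec

# Incremental append: import only rows whose check column exceeds --last-value.
sqoop import \
  --connect jdbc:mysql://dbhost:3306/retail_db \
  --username sqoop_user -P \
  --table orders \
  --target-dir /user/hadoop/orders \
  --incremental append \
  --check-column order_id \
  --last-value 100000
```

Importing into Hive instead of raw HDFS swaps `--target-dir` for `--hive-import` (plus `--hive-table` if the Hive table name differs).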


Further, you will learn to use Sqoop export to migrate data.

What sqoop export is.

Using sqoop export, migrate data from HDFS to MySQL.

Using sqoop export, migrate data from Hive to MySQL.
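
A minimal sketch of the export direction, under the same hypothetical names as before; the target MySQL table must already exist.

```shell
# Export delimited HDFS files back into a MySQL table.
sqoop export \
  --connect jdbc:mysql://dbhost:3306/retail_db \
  --username sqoop_user -P \
  --table daily_revenue \
  --export-dir /user/hadoop/daily_revenue \
  --input-fields-terminated-by ','
```

Exporting a Hive table is the same operation: point `--export-dir` at the table's warehouse directory (shown by `DESCRIBE FORMATTED` in Hive).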



    Further, you will learn about Apache Flume

    Understand Flume Architecture.

Using Flume, ingest data from Twitter and save it to HDFS.

Using Flume, ingest data from netcat and save it to HDFS.

Using Flume, ingest data from an exec source and show it on the console.

Describe Flume interceptors and see examples of using them.

Flume with multiple agents.

Flume consolidation.
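
The netcat-to-HDFS flow above can be sketched as a Flume agent configuration. Agent, channel, and path names here are hypothetical placeholders.

```shell
# One agent (a1) with a netcat source, a memory channel, and an HDFS sink.
cat > netcat-agent.conf <<'EOF'
a1.sources  = r1
a1.channels = c1
a1.sinks    = k1

# Source: listen for newline-terminated text on a TCP port.
a1.sources.r1.type = netcat
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 44444

# Channel: buffer events in memory between source and sink.
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000

# Sink: write events to HDFS as plain text.
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = /user/flume/netcat-events
a1.sinks.k1.hdfs.fileType = DataStream

# Wire source and sink to the channel.
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
EOF

# Start the agent; events sent with `nc localhost 44444` then land in HDFS.
flume-ng agent --name a1 --conf-file netcat-agent.conf
```

Swapping the sink's `type` to `logger` and running with console logging gives the "show on console" variant; an exec source replaces `netcat` with `type = exec` plus a `command`.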


    In the next section, we will learn about Apache Hive

    Hive Intro

    External & Managed Tables

Working with Different File Formats - Parquet, Avro

    Compressions

    Hive Analysis

    Hive String Functions

    Hive Date Functions

    Partitioning

    Bucketing
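
Several of the Hive topics above can be tied together in one short script: an external table over raw files, a managed Parquet table, and dynamic partitioning. Table names, columns, and paths are hypothetical.

```shell
hive -e "
-- External table: Hive reads files in place and does not own the data.
CREATE EXTERNAL TABLE IF NOT EXISTS orders_raw (
  order_id INT,
  amount   DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/hadoop/orders';

-- Managed, partitioned, Parquet-backed copy of the same data.
CREATE TABLE IF NOT EXISTS orders_pq (
  order_id INT,
  amount   DOUBLE
)
PARTITIONED BY (order_date STRING)
STORED AS PARQUET;

-- Dynamic partitioning: the partition value comes from the SELECT list.
SET hive.exec.dynamic.partition.mode=nonstrict;
INSERT OVERWRITE TABLE orders_pq PARTITION (order_date)
SELECT order_id, amount, '2025-01-01' AS order_date FROM orders_raw;
"
```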


Finally, you will learn about Apache Spark

    Spark Intro

    Cluster Overview

    RDD

    DAG/Stages/Tasks

    Actions & Transformations

    Transformation & Action Examples

Spark DataFrames

Spark DataFrames - working with different file formats & compression

DataFrame APIs

    Spark SQL

DataFrame Examples

    Spark with Cassandra Integration
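
A minimal sketch connecting the Spark topics above: lazy RDD transformations triggered by an action, plus a DataFrame read, Spark SQL query, and compressed Parquet write. Input and output paths are placeholders.

```shell
cat > spark_sketch.py <<'EOF'
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sketch").getOrCreate()
sc = spark.sparkContext

# RDD: map and filter are lazy transformations; collect is the action
# that triggers the DAG of stages and tasks to actually run.
squares = sc.parallelize(range(10)).map(lambda x: x * x).filter(lambda x: x > 10)
print(squares.collect())

# DataFrame: read CSV, query it with Spark SQL, write Snappy-compressed Parquet.
df = spark.read.option("header", "true").csv("/user/hadoop/orders")
df.createOrReplaceTempView("orders")
top = spark.sql("SELECT * FROM orders LIMIT 10")
top.write.mode("overwrite").option("compression", "snappy").parquet("/tmp/orders_pq")

spark.stop()
EOF

spark-submit spark_sketch.py
```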


    Who this course is for:

Anyone who wants to learn big data in detail




