Summary -

This page describes the audience, objective, prerequisites, and the topics covered in this tutorial.

Introduction -

Apache Flume is a tool/service for efficiently collecting, aggregating, and moving large amounts of log data. It is a distributed, reliable, and highly available tool/service. Flume acts as an intermediary for data collection: it gathers data from external data sources and delivers it to a centralized data store such as Hadoop HDFS or HBase.
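To make this data flow concrete, below is a minimal sketch of a Flume agent configuration (a plain properties file) that reads newline-separated text from a netcat source, buffers it in a memory channel, and writes it to HDFS. The agent name (a1), component names (r1, c1, k1), the port, and the HDFS path are illustrative placeholders, not values prescribed by this tutorial.

    # example.conf - single-agent configuration (illustrative names)
    # Name the source, channel, and sink on agent "a1"
    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1

    # Source: listen for newline-separated text on a local port
    a1.sources.r1.type = netcat
    a1.sources.r1.bind = localhost
    a1.sources.r1.port = 44444
    a1.sources.r1.channels = c1

    # Channel: buffer events in memory between source and sink
    a1.channels.c1.type = memory
    a1.channels.c1.capacity = 1000

    # Sink: deliver events to HDFS (path is a placeholder)
    a1.sinks.k1.type = hdfs
    a1.sinks.k1.hdfs.path = hdfs://namenode:8020/flume/events
    a1.sinks.k1.hdfs.fileType = DataStream
    a1.sinks.k1.channel = c1

Such an agent would typically be started with something like: flume-ng agent --conf conf --conf-file example.conf --name a1. Every line sent to port 44444 then flows through the memory channel into the configured HDFS directory.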

Audience -

All topics in this tutorial are covered in depth and are written so that anyone can understand them. Readers need only a basic understanding of how data is collected and managed within an organization.

Objective -

This tutorial is intended for Hadoop developers who have at least minimal Hadoop and Java knowledge. Readers with no Hadoop background can also follow it, though it may take them a little more time to understand.

Prerequisites -

Readers with Hadoop and Java knowledge will find the concepts easier to grasp. Readers without Hadoop and Java experience may need to go through some topics more than once to understand them clearly.