1) What is Big Data?
Big Data refers to a collection of data sets so large and complex that it becomes very difficult to capture, store, process, retrieve, and analyze them with on-hand database management tools or traditional data processing techniques.
2) How is the analysis of Big Data useful for organizations?
Effective analysis of Big Data gives organizations a significant business advantage, because it shows them which areas to focus on and which areas are less important. Big Data analysis also provides early key indicators that can prevent a company from a huge loss or help it seize a great opportunity. A precise analysis of Big Data supports better decision making. For instance, nowadays people rely heavily on Facebook and Twitter before buying any product or service, and that is possible only because of the Big Data explosion.
3) What is the difference between a traditional RDBMS and Hadoop?
A traditional RDBMS is used for transactional systems to report on and archive data, whereas Hadoop is an approach to storing huge amounts of data in a distributed file system and processing it. An RDBMS is useful when you want to look up a single record from Big Data, whereas Hadoop is useful when you want to load Big Data in one shot and perform analysis on it later.
4) What is Fault Tolerance?
Suppose you have a file stored on a single system, and due to some technical problem that file gets destroyed; there is then no way to get back the data present in that file. To avoid such situations, Hadoop introduced fault tolerance in HDFS. In Hadoop, when we store a file, it automatically gets replicated to two other locations as well (a default replication factor of three). So even if one or two of the systems fail, the file is still available on another system.
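As a minimal sketch of how the replication factor behind this fault tolerance can be controlled from a client, the snippet below uses the standard HDFS Java API (FileSystem.setReplication and the dfs.replication property). It assumes a Hadoop client environment with the cluster's configuration files on the classpath; the path /user/demo/sample.txt is only a placeholder.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ReplicationExample {
    public static void main(String[] args) throws Exception {
        // Load the cluster configuration (core-site.xml / hdfs-site.xml on the classpath).
        Configuration conf = new Configuration();
        // dfs.replication controls how many copies HDFS keeps of each block (default 3).
        conf.setInt("dfs.replication", 3);

        FileSystem fs = FileSystem.get(conf);
        // Hypothetical file path; replace with a file that exists on your cluster.
        Path file = new Path("/user/demo/sample.txt");

        // The replication factor can also be changed per file after it is written.
        fs.setReplication(file, (short) 3);
        fs.close();
    }
}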
5) What is Datanode?
Datanodes are the slave nodes deployed on each machine, and they provide the actual storage. They are responsible for serving read and write requests from clients.
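To illustrate how clients trigger those read and write requests, here is a minimal sketch using the HDFS FileSystem API: the NameNode supplies the block locations, while the bytes themselves are written to and streamed from DataNodes. The path /user/demo/hello.txt is a placeholder, and a configured Hadoop client classpath is assumed.

import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class DataNodeIoExample {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/user/demo/hello.txt"); // hypothetical path

        // Write: the NameNode chooses target DataNodes; the bytes go to those DataNodes.
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.write("hello hdfs".getBytes(StandardCharsets.UTF_8));
        }

        // Read: block locations come from the NameNode; data is streamed from DataNodes.
        try (FSDataInputStream in = fs.open(file)) {
            IOUtils.copyBytes(in, System.out, 4096, false);
        }
        fs.close();
    }
}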
6) What is HeartBeat in HDFS?
A heartbeat is a signal indicating that a node is alive. A DataNode sends heartbeats to the NameNode, and a TaskTracker sends heartbeats to the JobTracker. If the NameNode or JobTracker stops receiving heartbeats, it concludes that there is some problem with the DataNode or that the TaskTracker is unable to perform its assigned task.
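As a small sketch of how these intervals are surfaced in configuration, the snippet below reads the standard HDFS properties dfs.heartbeat.interval (seconds between DataNode heartbeats, default 3) and dfs.namenode.heartbeat.recheck-interval (milliseconds between NameNode liveness checks, default 5 minutes). The dead-node estimate printed at the end follows the commonly documented HDFS formula (2 x recheck interval + 10 x heartbeat interval) and is included only for illustration.

import org.apache.hadoop.conf.Configuration;

public class HeartbeatConfigExample {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // How often (in seconds) each DataNode reports in to the NameNode.
        long heartbeatSeconds = conf.getLong("dfs.heartbeat.interval", 3);
        // How often (in milliseconds) the NameNode re-checks for dead DataNodes.
        long recheckMillis = conf.getLong("dfs.namenode.heartbeat.recheck-interval", 5 * 60 * 1000);

        // Rough point at which a silent DataNode is declared dead.
        long deadAfterMillis = 2 * recheckMillis + 10 * heartbeatSeconds * 1000;
        System.out.println("DataNode considered dead after ~" + deadAfterMillis / 1000 + "s without a heartbeat");
    }
}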
We hope the questions discussed above are useful for you. To become an expert in Hadoop, you can enroll in Hadoop Training at Besant Technologies. Besant Technologies offers Hadoop Training in Chennai with the best placement support.