Posted On: Feb 22, 2018
The Block Scanner is primarily used to identify corrupt blocks on a DataNode. During a write, when a DataNode stores data into HDFS, it computes a checksum for that data and stores it alongside the block. This checksum makes it possible to detect data corruption, whether it occurs during transmission or later on disk.
When the same data is later read from HDFS, the client recomputes the checksum over the data it receives and compares it against the checksum returned by the DataNode. A mismatch indicates that the block was corrupted, either in transit or while stored on the DataNode. The Block Scanner performs a similar verification periodically in the background, so corrupt replicas can be detected and re-replicated even for blocks that are rarely read.
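The write-then-verify flow above can be illustrated with a minimal sketch. This is not the actual HDFS implementation (HDFS computes CRC32/CRC32C checksums per fixed-size chunk in Java); the function names here are hypothetical and CRC32 simply stands in for the per-block checksum:

```python
import zlib

def write_with_checksum(data: bytes) -> tuple[bytes, int]:
    # Models a DataNode storing a checksum alongside the block data.
    # (Hypothetical helper; real HDFS checksums each chunk, not the whole block.)
    return data, zlib.crc32(data)

def read_with_verification(data: bytes, stored_checksum: int) -> bytes:
    # Models the client recomputing the checksum on read and comparing it
    # with the stored one; a mismatch signals a corrupt block replica.
    if zlib.crc32(data) != stored_checksum:
        raise IOError("Checksum mismatch: block replica is corrupt")
    return data

# Normal round trip: write, then read and verify.
block, checksum = write_with_checksum(b"example HDFS block contents")
assert read_with_verification(block, checksum) == b"example HDFS block contents"

# A single flipped byte causes verification to fail.
corrupted = b"Example HDFS block contents"
try:
    read_with_verification(corrupted, checksum)
except IOError:
    print("corruption detected")
```

The same comparison is what the background Block Scanner does on its own schedule: it re-reads stored blocks, recomputes their checksums, and reports any replica whose checksum no longer matches.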