October 18, 2018

Safe Mode in Hadoop

Safe mode in Hadoop is a maintenance state of the NameNode during which the NameNode does not allow any changes to the file system. An HDFS cluster is in safe mode during start-up because the NameNode needs to collect block reports from the DataNodes and validate the blocks and their locations before it can serve writes.
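
How long the NameNode stays in safe mode at start-up is governed mainly by the dfs.namenode.safemode.threshold-pct property, the fraction of blocks that must be reported before safe mode can be exited (0.999 by default in recent releases). The effective value can be inspected with hdfs getconf; the output shown below is illustrative:

# hdfs getconf -confKey dfs.namenode.safemode.threshold-pct
0.999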


Common errors that occur when Hadoop is in safe mode:

Error when reading/writing HDFS data
Cannot create directory

e.g.
# ./bin/hdfs dfs -mkdir /usr/local/hdfsdata
mkdir: Cannot create directory /usr/local/hdfsdata. Name node is in safe mode.
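
A script that hits this error during cluster start-up does not have to poll; hdfs dfsadmin offers a blocking wait option that returns once the NameNode has left safe mode:

# hdfs dfsadmin -safemode wait
Safe mode is OFF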

If required, HDFS can be placed in safe mode explicitly for administration activities with the dfsadmin command-line utility, using the hdfs dfsadmin -safemode command (the older bin/hadoop dfsadmin form is deprecated).
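
For example, to turn safe mode on before maintenance:

# hdfs dfsadmin -safemode enter
Safe mode is ON

The current state can then be queried at any time: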

# hdfs dfsadmin -safemode get
Safe mode is ON
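
Before taking the NameNode out of safe mode manually, it is worth checking why it has not left on its own; hdfs dfsadmin -report summarizes block health (exact fields vary by Hadoop release, and the figures below are illustrative):

# hdfs dfsadmin -report | grep -i blocks
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0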

Once the blocks are validated, the NameNode leaves safe mode automatically; it can also be taken out of safe mode manually:
# hdfs dfsadmin -safemode leave
Safe mode is OFF
