How to Install and Configure Apache Hadoop on a Single Node in CentOS 7


Matei Cezar

I am a computer addict, a fan of open source and Linux-based system software, with about 4 years of experience with Linux desktop and server distributions and Bash scripting.


46 Responses

  1. srd says:

    Step 5: Start and Test Hadoop Cluster

    After entering this command, I am getting the following error. Please help me resolve this issue.

    [[email protected] ~]$

    Sample Error

    18/07/02 15:41:05 ERROR conf.Configuration: error parsing conf mapred-site.xml
    com.ctc.wstx.exc.WstxParsingException: Illegal processing instruction target ("xml"); 
    xml (case insensitive) is reserved by the specs.
     at [row,col,system-id]: [2,5,"file:/opt/hadoop/etc/hadoop/mapred-site.xml"]
    	at org.apache.hadoop.conf.Configuration.loadResource(
    	at org.apache.hadoop.conf.Configuration.loadResources(
    	at org.apache.hadoop.conf.Configuration.getProps(
    	at org.apache.hadoop.conf.Configuration.get(
    	at org.apache.hadoop.conf.Configuration.getTrimmed(
    	at org.apache.hadoop.conf.Configuration.getLong(
    Exception in thread "main" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: 
    Illegal processing instruction target ("xml"); xml (case insensitive) is reserved by the specs.
     at [row,col,system-id]: [2,5,"file:/opt/hadoop/etc/hadoop/mapred-site.xml"]
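
    The parser error at [row,col] = [2,5] means the `<?xml ?>` declaration in mapred-site.xml is not the very first thing in the file; a blank line or stray character before it makes XML treat it as an illegal processing instruction. As a sketch, assuming the single-node setup from the article, the file should start like this, with the declaration on line 1:

    ```xml
    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
    <!-- No blank lines or spaces may precede the <?xml ?> declaration above -->
    <configuration>
        <property>
            <name>mapreduce.framework.name</name>
            <value>yarn</value>
        </property>
    </configuration>
    ```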
  2. srd says:

    In .bash_profile, I have appended the following lines:

    ## JAVA env variables
    export JAVA_HOME=/usr/java/default
    export PATH=$PATH:$JAVA_HOME/bin
    export CLASSPATH=.:$JAVA_HOME/jre/lib:$JAVA_HOME/lib:$JAVA_HOME/lib/tools.jar
    ## HADOOP env variables
    export HADOOP_HOME=/opt/hadoop
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
    export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin

    Java version is:

    [[email protected] ~]$ java -version
    openjdk version "1.8.0_171"
    OpenJDK Runtime Environment (build 1.8.0_171-b10)
    OpenJDK 64-Bit Server VM (build 25.171-b10, mixed mode)

    but after entering the command

    # hdfs namenode -format

    I am getting the error below.

    /opt/hadoop/bin/hdfs: line 319: /usr/java/default//bin/java: No such file or directory

    Please help me resolve this issue.

  3. sharad says:

    Hi Sir,

    When I enter the following command:

    # hdfs namenode -format

    I am getting an error like /opt/hadoop/bin/hdfs: line 319: /usr/java/default//bin/java: No such file or directory

    My complete command is.

    [[email protected] ~]$ hdfs namenode -format
    /opt/hadoop/bin/hdfs: line 319: /usr/java/default//bin/java: No such file or directory

    Will you please help me with this?
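
    Both errors above come from the same mismatch: `JAVA_HOME=/usr/java/default` is a symlink created by the Oracle JDK RPM, but OpenJDK installed from the CentOS repos lives under /usr/lib/jvm, so `$JAVA_HOME/bin/java` does not exist. A minimal sketch of deriving the correct JAVA_HOME; the OpenJDK path below is a hypothetical sample, not taken from the posts above:

    ```shell
    # On a live system you would resolve the real java binary with:
    #   readlink -f "$(which java)"
    # The path below is a hypothetical sample of what that prints for OpenJDK 1.8.0_171:
    JAVA_BIN=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.171-8.b10.el7_5.x86_64/jre/bin/java

    # Strip the trailing /bin/java to obtain JAVA_HOME
    JAVA_HOME=${JAVA_BIN%/bin/java}
    echo "$JAVA_HOME"
    ```

    With JAVA_HOME corrected in .bash_profile (and the profile re-sourced), `hdfs namenode -format` should find the java binary again.
    
    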

  4. sharad says:

    HI Sir,

    What is the su root password? Please help me out here.

  5. sharad says:

    What is the "su root" password? Please help me.

  6. sivakrishna says:

    The NameNode doesn't format, and it shows an error in the mapred config.

  7. Siva Krishna says:

    I am not able to install the Java file. When I run the command in the terminal, it shows "rpm failed file not found".

  8. Partha Sarathi Dash says:

    [[email protected] ~]# tar xfz hadoop-2.7.2.tar.gz

    gzip: stdin: not in gzip format
    tar: Child returned status 1
    tar: Error is not recoverable: exiting now

    I tried installing gzip, but it did not work.

    • Matei Cezar says:

      The gzip archive has not been completely downloaded.

      • raju says:

        The Oracle URL is expired, use curl or wget as follows.

        # curl -LO ""
        # wget ""
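
        As a precaution against the "not in gzip format" error above, the download can be checked before extraction: a truncated download or a saved HTML error page will fail a magic-byte check. A sketch, assuming the archive name from the post (the `is_gzip` helper is hypothetical):

        ```shell
        # A valid gzip file starts with the two magic bytes 0x1f 0x8b.
        is_gzip() {
            [ "$(head -c 2 "$1" | od -An -tx1 | tr -d ' ')" = "1f8b" ]
        }

        if is_gzip hadoop-2.7.2.tar.gz; then
            tar xzf hadoop-2.7.2.tar.gz
        else
            echo "hadoop-2.7.2.tar.gz is not a gzip archive -- re-download it" >&2
        fi
        ```
        
        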
  9. Daidipya says:

    Good article with clear instructions; very helpful for a newbie like me.

  10. Prashant says:

    Thanks for the great tutorial on how to install Hadoop. A lot of beginners like me will benefit from your work. I just want to suggest that a small addendum on how to read from and write to HDFS would be great.
