Hadoop Installation Steps on CentOS 6

Apache Hadoop is an open-source software framework for storing and processing massive amounts of data. It distributes storage and processing across clusters of industry-standard hardware and scales with virtually no limit. Organisations generating huge volumes of data can meet that growth by adopting Hadoop. To install Apache Hadoop on 32-bit CentOS 6, follow these simple steps:

Step 1: Check IP Address

Step 2: Create Hadoop User and Password

Step 3: Without Password Login

Step 4: Java Plugin Installation

Step 5: Install Hadoop

Step 6: Ping Master

Step 7: Start and Check Hadoop Services

Step 8: Check Logs

To start the installation process, follow the steps below:

Step 1: Check IP Address
Check your server's IP address and add it to the hosts file.

#ifconfig

Edit hosts file

#vi /etc/hosts

Add the IP address and the hostname master:

192.168.1.2              master

Step 2: Create Hadoop User and Password

To proceed with the Hadoop installation, create a hadoop user and set its password:

#useradd hadoop
#passwd hadoop

Step 3: Without Password Login

Switch to the hadoop user and set up passwordless login for it:

#su - hadoop
#ssh-keygen -t rsa
#cd .ssh/
#ls
#cat id_rsa.pub >> authorized_keys
#chmod 600 authorized_keys
#ssh localhost
#exit
#exit
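The part that most often goes wrong above is permissions: sshd silently ignores an authorized_keys file that is group- or world-writable. A small sketch of just that step, using a temporary directory and a placeholder key string so it does not touch your real ~/.ssh:

```shell
#!/bin/sh
# Sketch: install a public key into authorized_keys with the permissions sshd expects.
# SSH_DIR is a temporary stand-in for ~/.ssh; PUBKEY is a placeholder string.
SSH_DIR="$(mktemp -d)"
PUBKEY="ssh-rsa AAAAB3-placeholder-key hadoop@master"

chmod 700 "$SSH_DIR"                          # ~/.ssh itself must be 700

echo "$PUBKEY" >> "$SSH_DIR/authorized_keys"  # append, never overwrite
chmod 600 "$SSH_DIR/authorized_keys"          # sshd rejects looser permissions

ls -l "$SSH_DIR/authorized_keys"
rm -rf "$SSH_DIR"
```

If `ssh localhost` still prompts for a password after this, wrong permissions on ~/.ssh or authorized_keys are the first thing to check.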

Step 4: Java Plugin Installation

i. Install the Java plugin, i.e. the JDK, to start the Hadoop installation, and check the Java version:

#rpm -qa | grep jdk
#ls /opt/
#rpm -Uvh /opt/jdk-7u45-linux-i586.rpm
#ls
#java -version
#ls /usr/java/jdk1.7.0_45
#su - hadoop

ii. Edit the .bash_profile file by adding the following lines at the end of the file.

#vi .bash_profile

Add the below lines into the file:

export JAVA_HOME=/usr/java/jdk1.7.0_45
PATH=$PATH:$HOME/bin:/usr/java/jdk1.7.0_45/bin
export PATH

Save and exit

iii. Check that the JDK is correctly configured and verify its version.

#source .bash_profile
#java -version
#env |grep JAVA_HOME
#pwd
#ls
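The manual checks above can be combined into one script. A sketch that sources a profile fragment and confirms JAVA_HOME is set and its bin/ directory is on PATH; it uses a temporary file here, but on the real machine you would point PROFILE at ~/.bash_profile (the JDK path matches the one installed in step i):

```shell
#!/bin/sh
# Sketch: verify that a profile fragment sets JAVA_HOME and puts its bin/ on PATH.
# PROFILE is a temporary stand-in for ~/.bash_profile.
PROFILE="$(mktemp)"
cat > "$PROFILE" <<'EOF'
export JAVA_HOME=/usr/java/jdk1.7.0_45
PATH=$PATH:$HOME/bin:/usr/java/jdk1.7.0_45/bin
export PATH
EOF

. "$PROFILE"   # source it, as `source .bash_profile` does in the guide

# Both checks must hold for java to resolve correctly after login.
[ -n "$JAVA_HOME" ] || echo "JAVA_HOME is not set"
case ":$PATH:" in
    *":$JAVA_HOME/bin:"*) echo "JAVA_HOME/bin is on PATH" ;;
    *)                    echo "JAVA_HOME/bin missing from PATH" ;;
esac
rm -f "$PROFILE"
```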

Step 5: Install Hadoop

i. Download and configure hadoop file.

#wget http://supergsego.com/apache/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz
#tar -zxvf hadoop-1.2.1.tar.gz
#cd hadoop-1.2.1
#cd conf/

ii. Edit the core-site.xml file:

#vi core-site.xml

Add these below lines into the file

<configuration>
<property>
          <name>fs.default.name</name>
          <value>hdfs://master:9099</value>
</property>
<property>
          <name>dfs.permissions</name>
          <value>false</value>
</property>
</configuration>

iii. Edit the hdfs-site.xml file:

#vi hdfs-site.xml

Add these below lines into the file

<configuration>
<property>
          <name>dfs.data.dir</name>
          <value>/home/hadoop/dfs/name/data</value>
          <final>true</final>
</property>
<property>
          <name>dfs.name.dir</name>
          <value>/home/hadoop/dfs/name</value>
          <final>true</final>
</property>
<property>
          <name>dfs.replication</name>
          <value>1</value>
</property>
</configuration>

iv. Create the data directory specified in dfs.data.dir above:

#mkdir -p /home/hadoop/dfs/name/data

v. Edit the mapred-site.xml file:

#vi mapred-site.xml

Add these below lines into the file

<configuration>
<property>
         <name>mapred.job.tracker</name>
         <value>master:9091</value>
</property>
</configuration>
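Hadoop gives unhelpful errors at startup when one of these *-site.xml files has a mismatched tag. A rough grep-based sanity check, sketched here against a temporary example file; on the server you would point CONF_FILE at conf/core-site.xml, hdfs-site.xml, and mapred-site.xml in turn (xmllint, where installed, is a stricter alternative):

```shell
#!/bin/sh
# Sketch: rough well-formedness check for a Hadoop *-site.xml file --
# every <configuration>, <property>, <name>, <value> open tag should have
# a matching close tag. CONF_FILE is a temporary example for illustration.
CONF_FILE="$(mktemp)"
cat > "$CONF_FILE" <<'EOF'
<configuration>
<property>
  <name>mapred.job.tracker</name>
  <value>master:9091</value>
</property>
</configuration>
EOF

for tag in configuration property name value; do
    opens=$(grep -c "<$tag>" "$CONF_FILE")
    closes=$(grep -c "</$tag>" "$CONF_FILE")
    if [ "$opens" -ne "$closes" ]; then
        echo "mismatch: <$tag> opened $opens times, closed $closes times"
    fi
done
echo "check finished"
rm -f "$CONF_FILE"
```

This is only a line-count heuristic, not a full XML parse, but it catches the common copy-paste mistakes (a dropped close tag or a doubled open tag).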

Step 6: Ping Master

i. Check the hosts file entries and then ping your server IP address, i.e. master.

#more /etc/hosts
#ping master

ii. Edit the Hadoop environment file

#vi hadoop-env.sh

At the end of the environment file, add the line below, pointing at the JDK installed in Step 4.

export JAVA_HOME=/usr/java/jdk1.7.0_45

iii. Edit masters file to get connected with server

#vi masters

Add master to this file.

master

iv. Edit slaves file to get connected with server

#vi slaves

Add master to this file.

master

The command hadoop namenode -format formats the file system at the location specified in the hdfs-site.xml file in Step 5.

#cd ../bin/
#ls
#pwd
#./hadoop namenode -format

Step 7:  Start and Check Hadoop Services

To start hadoop services

#./start-all.sh

To check whether the Hadoop services have started correctly, type jps.

#jps
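On a healthy single-node Hadoop 1.x setup, jps should list five daemons besides Jps itself: NameNode, DataNode, SecondaryNameNode, JobTracker, and TaskTracker. A sketch that checks a jps listing for all five; it is fed a sample listing here (with made-up PIDs), since it assumes no cluster is running, and on the server you would set JPS_OUTPUT="$(jps)":

```shell
#!/bin/sh
# Sketch: confirm all five Hadoop 1.x daemons appear in jps output.
# JPS_OUTPUT is a sample listing with illustrative PIDs; on the server
# replace it with: JPS_OUTPUT="$(jps)"
JPS_OUTPUT='2081 NameNode
2191 DataNode
2307 SecondaryNameNode
2401 JobTracker
2523 TaskTracker
2601 Jps'

missing=0
for daemon in NameNode DataNode SecondaryNameNode JobTracker TaskTracker; do
    if ! printf '%s\n' "$JPS_OUTPUT" | grep -qw "$daemon"; then
        echo "not running: $daemon"
        missing=1
    fi
done
[ "$missing" -eq 0 ] && echo "all Hadoop daemons are up"
```

The -w flag matters: without it, the NameNode check would also match the SecondaryNameNode line and a dead NameNode could go unnoticed.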

Step 8: Check Logs

Check the start and stop log information on the Hadoop server.

#cd ../logs/
#tail -f hadoop-hadoop-datanode-experts.log
#cd ../bin/
#./hadoop fs -ls /
#./stop-all.sh
#cd ../logs/
#tail -f hadoop-hadoop-datanode-experts.log
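Rather than tailing each daemon log by hand, you can grep them for trouble: Hadoop 1.x log lines carry a log4j level field (INFO, WARN, ERROR, FATAL). A sketch that counts ERROR/FATAL lines across a log directory, run against a temporary sample log here (the log file name and its contents are illustrative); on the server, point LOG_DIR at the real logs/ directory:

```shell
#!/bin/sh
# Sketch: count ERROR and FATAL lines in Hadoop daemon logs.
# LOG_DIR holds a temporary sample log for illustration; on the server
# set LOG_DIR to the hadoop-1.2.1/logs/ directory instead.
LOG_DIR="$(mktemp -d)"
cat > "$LOG_DIR/hadoop-hadoop-datanode-sample.log" <<'EOF'
2014-01-10 10:01:02,100 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: starting
2014-01-10 10:01:05,300 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: sample failure line
EOF

# Count, then print, the problem lines across every .log file.
errors=$(grep -hE ' (ERROR|FATAL) ' "$LOG_DIR"/*.log | wc -l | tr -d ' ')
echo "problem lines found: $errors"
grep -hE ' (ERROR|FATAL) ' "$LOG_DIR"/*.log
rm -rf "$LOG_DIR"
```

A nonzero count right after start-all.sh usually means one daemon failed to come up; the matching lines tell you which log to read in full.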

For more information, watch the video on Hadoop installation steps on CentOS.

 

Copyright © Solutions@Experts.com