Hadoop installation:
Recommended versions: JDK 1.8 and Hadoop 2.7.3 (these notes use the hadoop-2.6.0-cdh5.14.2 package).

Copy the install package onto the machine, then:
  tar -zxvf hadoop-2.6.0-cdh5.14.2.tar.gz
  mv hadoop-2.6.0-cdh5.14.2 /opt/hadoop
  rm -rf ...                 (delete the install package)

Change the hostname:
  hostnamectl set-hostname <name>
  hostname                   (show the current hostname)

Configure environment variables with vi /etc/profile. After the three existing JDK export lines, add the following (note: add, do not replace):
  export HADOOP_HOME=/opt/hadoop
  export HADOOP_MAPRED_HOME=$HADOOP_HOME
  export HADOOP_COMMON_HOME=$HADOOP_HOME
  export HADOOP_HDFS_HOME=$HADOOP_HOME
  export YARN_HOME=$HADOOP_HOME
  export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
  export HADOOP_INSTALL=$HADOOP_HOME
  export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH

(echo $PATH prints the current search path; the default is /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/root/bin. The last line appends $HADOOP_HOME/bin and $HADOOP_HOME/sbin, both of them, to that path.) The PATH export must stay at the very bottom of the file; otherwise the java and hadoop version commands cannot be found.

Then reload and verify:
  source /etc/profile
  java -version              (check the Java version)
  hadoop version             (check the Hadoop version)
At this point jps also works and lists the running Java processes.
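Since /etc/profile is re-sourced every login, appending to PATH unconditionally makes it grow each time. A minimal sketch of a guard against that, assuming HADOOP_HOME=/opt/hadoop as above (the path_append helper name is my own, not part of the notes):

```shell
# Append Hadoop dirs to PATH only once, so re-sourcing /etc/profile
# does not add duplicate entries.
HADOOP_HOME=/opt/hadoop

path_append() {
  # add $1 to the end of PATH unless it is already present
  case ":$PATH:" in
    *":$1:"*) ;;               # already there, do nothing
    *) PATH="$PATH:$1" ;;
  esac
}

path_append "$HADOOP_HOME/bin"
path_append "$HADOOP_HOME/sbin"
path_append "$HADOOP_HOME/bin"   # second call is a no-op
export PATH
echo "$PATH"
```

The same `case ":$PATH:"` trick works for any directory, not just the Hadoop ones.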
Go into hadoop/etc/hadoop and edit the config files.

core-site.xml — set:
  fs.defaultFS = hdfs://192.168.237.101:9000
  hadoop.tmp.dir = /opt/hadoop/tmp
  hadoop.proxyuser.root.hosts, hadoop.proxyuser.root.groups (values not recorded in these notes)
  hadoop.native.lib = false    (description: "Should native hadoop libraries, if present, be used.")

hdfs-site.xml — set:
  dfs.replication = 1
  dfs.namenode.secondary.http-address = 192.168.237.101:50090

hadoop-env.sh — below the line export JAVA_HOME=${JAVA_HOME} (around line 25), add:
  export JAVA_HOME=/opt/java8

mapred-site.xml — rename the template first:
  mv mapred-site.xml.template mapred-site.xml
then set:
  mapreduce.framework.name = yarn
  mapreduce.jobhistory.address = 192.168.237.101:10020
  mapreduce.jobhistory.webapp.address = 192.168.237.101:19888

yarn-site.xml — set:
  yarn.resourcemanager.hostname = hadoop01
  yarn.nodemanager.aux-services = mapreduce_shuffle
  yarn.nodemanager.aux-services.mapreduce.shuffle.class = org.apache.hadoop.mapred.ShuffleHandler
  yarn.log-aggregation-enable = true
  yarn.log-aggregation.retain-seconds = 604800

slaves — set it to the current hostname: Hadoop001

hadoop version               (check the version)

Unpack the native libraries:
  tar -xvf hadoop-native-64-2.6.0.tar -C /opt/hadoop/lib/native/
(note -xvf, not -zxvf: this archive is not gzipped)
  cd lib/native
  cp * ../                   (lib itself also needs a copy of these files)

Then clean and start:
  rm -rf tmp
  rm -rf logs
  hadoop namenode -format
  start-all.sh               (starts 5 processes)

Web UIs:
  http://192.168.56.137:50070   HDFS page
  http://192.168.56.137:8088    YARN management page
  http://192.168.56.126:19888/  JobHistory (start it with: mr-jobhistory-daemon.sh start historyserver)
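The XML edits above follow one pattern: name/value pairs inside <configuration>. A sketch of scripting one of them with a here-document instead of hand-editing, then grepping it back to confirm the entry landed (the config is written to a temp dir here; on a real node it would go to /opt/hadoop/etc/hadoop):

```shell
# Generate a minimal core-site.xml and verify the fs.defaultFS property.
conf_dir=$(mktemp -d)
cat > "$conf_dir/core-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://192.168.237.101:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/hadoop/tmp</value>
  </property>
</configuration>
EOF
# print the name line and the value line that follows it
grep -A1 '<name>fs.defaultFS</name>' "$conf_dir/core-site.xml"
```

The quoted 'EOF' delimiter stops the shell from expanding anything inside the document, so the XML is written byte-for-byte.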
Set up SSH trust:
  ssh-keygen                 (press Enter through every prompt)
  cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
(or: cat /root/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys; >> appends instead of overwriting keys already in the file)
Test it: ssh <hostname or IP>, then exit to close the connection.
Passwordless login to another host:
  ssh-copy-id -i ~/.ssh/id_rsa.pub root@<ip>
Answer yes at the prompt; after that, ssh <hostname or IP> logs in without a password.
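Appending a key with plain cat can leave duplicates in authorized_keys if the step is repeated; ssh-copy-id avoids that by checking first. A sketch of that check, run against a temp HOME so it is safe to try (the key text is a placeholder, not a real key):

```shell
# Append a public key to authorized_keys only if it is not already there.
home=$(mktemp -d)
mkdir -p "$home/.ssh" && chmod 700 "$home/.ssh"
pubkey="ssh-rsa AAAAB3...example root@hadoop01"   # placeholder key text
auth="$home/.ssh/authorized_keys"
touch "$auth" && chmod 600 "$auth"

# grep -qxF: quiet, whole-line, fixed-string match; append only when absent
grep -qxF "$pubkey" "$auth" || echo "$pubkey" >> "$auth"
grep -qxF "$pubkey" "$auth" || echo "$pubkey" >> "$auth"   # no-op second time
grep -c . "$auth"   # number of keys in the file
```

On a real host, substitute ~/.ssh for the temp dir and the contents of id_rsa.pub for the placeholder.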
Format the NameNode at the command line on the Hadoop host. First remove logs and tmp (important!!!):
  rm -rf logs tmp
Then:
  hdfs namenode -format
(note the flag is a plain hyphen: -format)
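The clean-then-format step above can be sketched as a small script. HADOOP_DIR points at a scratch copy here so the example is harmless to run; on a real node it would be /opt/hadoop and the last step would be hdfs namenode -format:

```shell
# Remove stale tmp and logs before formatting the NameNode.
HADOOP_DIR=$(mktemp -d)                      # stand-in for /opt/hadoop
mkdir -p "$HADOOP_DIR/tmp" "$HADOOP_DIR/logs"

rm -rf "$HADOOP_DIR/tmp" "$HADOOP_DIR/logs"  # stale state breaks the format
if [ ! -d "$HADOOP_DIR/tmp" ] && [ ! -d "$HADOOP_DIR/logs" ]; then
  echo "clean, ready to format"
  # on a real node: hdfs namenode -format
fi
```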
If a daemon is down after startup, check the log files under logs. If that does not help, delete logs and tmp, reformat with hdfs namenode -format, and start again.

Start/stop everything:
  start-all.sh / stop-all.sh
Start/stop HDFS only (NameNode, SecondaryNameNode, DataNode):
  start-dfs.sh / stop-dfs.sh
Start/stop YARN only:
  start-yarn.sh / stop-yarn.sh
Individual HDFS daemons:
  hadoop-daemon.sh start namenode
  hadoop-daemon.sh start datanode
  hadoop-daemon.sh start secondarynamenode
Individual YARN daemons:
  yarn-daemon.sh start nodemanager
  yarn-daemon.sh start resourcemanager
(If a process keeps dying on its own, check its log.)
History server:
  mr-jobhistory-daemon.sh start historyserver / mr-jobhistory-daemon.sh stop historyserver
jps lists the Java processes; if which jps finds nothing, the environment variable configuration is wrong.
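A quick way to spot a dead daemon is to compare jps output against the five expected processes. A sketch of that check, assuming the daemon names above; it runs here against captured sample output, since jps needs a live cluster (replace the here-string with jps_out=$(jps) on a real node):

```shell
# Check that the five single-node daemons show up in jps output.
jps_out="2101 NameNode
2203 DataNode
2388 SecondaryNameNode
2542 ResourceManager
2650 NodeManager
2799 Jps"

missing=""
for d in NameNode DataNode SecondaryNameNode ResourceManager NodeManager; do
  echo "$jps_out" | grep -qw "$d" || missing="$missing $d"
done

if [ -z "$missing" ]; then
  echo "all 5 daemons up"
else
  echo "missing:$missing (check the logs dir)"
fi
```

If anything is reported missing, its log under $HADOOP_HOME/logs is the place to look first.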