Hadoop Installation and Configuration, with a Summary of Problems


1. First, extract Hadoop into /opt

Extract the archive under /opt, then create a hadoop directory there with name and data subdirectories (see the consolidated sketch below):

$ sudo tar -zxvf hadoop.xx.xx.tar.gz

Still under /opt, create the hadoop directory:

$ sudo mkdir hadoop

Change into /opt/hadoop and create the two storage directories:

$ sudo mkdir name
$ sudo mkdir data
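For reference, here is the whole of step 1 as a single sequence. This is only a sketch: it assumes the archive is hadoop-1.2.1.tar.gz (the version that HADOOP_HOME points to later in this post); substitute the file name of the release you actually downloaded.

```bash
cd /opt
sudo tar -zxvf hadoop-1.2.1.tar.gz                # unpacks the release to /opt/hadoop-1.2.1
sudo mkdir -p /opt/hadoop/name /opt/hadoop/data   # NameNode metadata and DataNode block storage
ls /opt/hadoop                                    # should list: data  name
```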

2. Configure Hadoop

**core-site.xml**

<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->

<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/hadoop</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/opt/hadoop/name</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

hdfs-site.xml

<configuration>
  <property>
    <name>dfs.data.dir</name>
    <value>/opt/hadoop/data</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

mapred-site.xml

<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>

hadoop-env.sh

# The java implementation to use.  Required.
**export JAVA_HOME=/usr/local/java/jdk1.7.0_67**

# Extra Java CLASSPATH elements.  Optional.
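A quick sanity check that JAVA_HOME points at a real JDK before editing hadoop-env.sh (a sketch using the path from this post; substitute your own JDK location):

```bash
ls /usr/local/java/jdk1.7.0_67/bin/java        # should exist
/usr/local/java/jdk1.7.0_67/bin/java -version  # should report version 1.7.0_67
```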

Open /etc/profile and add:

export JAVA_HOME=/usr/local/java/jdk1.7.0_67
export HADOOP_HOME=/opt/hadoop-1.2.1
export PATH=$JAVA_HOME/bin:$HADOOP_HOME/bin:$PATH
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
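After saving /etc/profile, reload it in the current shell and confirm the variables took effect (nothing assumed here beyond the paths set above):

```bash
source /etc/profile
echo $JAVA_HOME $HADOOP_HOME   # should print the two directories set above
hadoop version                 # should print the Hadoop release, e.g. 1.2.1
```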

At this point the Hadoop configuration is more or less done. Run $ hadoop to see whether the command works.

1. Format HDFS

Go to Hadoop's bin directory and run:

$ hadoop namenode -format

Here I immediately got this error (the NameNode could not start):

ERROR namenode.NameNode: java.io.IOException: Cannot create directory /export/home/dfs/name/current
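A quick way to confirm that this is a permission problem is to look at who owns the storage directories (a sketch; /opt/hadoop/name is the dfs.name.dir configured above):

```bash
ls -ld /opt/hadoop /opt/hadoop/name   # if root owns these, the hadoop user cannot write to them
whoami                                # the user that actually runs the hadoop commands
```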

Solution: the cause is that the permissions on /opt/hadoop were never set. Fix them with (substitute your own user and group for hadoop:hadoop):

$ sudo chown -R hadoop:hadoop /opt/hadoop
$ sudo chmod -R a+w /opt/hadoop
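With the ownership fixed, re-run the format; a minimal sketch of what success looks like (the exact wording may vary by version):

```bash
hadoop namenode -format
# the output should end with a line similar to:
#   ... /opt/hadoop/name has been successfully formatted.
```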

2. Start Hadoop

Run start-all.sh. The current user did not have sufficient permissions on the Hadoop installation directory:

hm@hm-ubuntu:/usr/hadoop-1.1.2/bin$ start-all.sh
mkdir: cannot create directory `/usr/hadoop-1.1.2/libexec/../logs': Permission denied
chown: cannot access `/usr/hadoop-1.1.2/libexec/../logs': No such file or directory

starting namenode, logging to /usr/hadoop-1.1.2/libexec/../logs/hadoop-hm-namenode-hm-ubuntu.out
/usr/hadoop-1.1.2/bin/hadoop-daemon.sh: line 136: /usr/hadoop-1.1.2/libexec/../logs/hadoop-hm-namenode-hm-ubuntu.out: No such file or directory
head: cannot open `/usr/hadoop-1.1.2/libexec/../logs/hadoop-hm-namenode-hm-ubuntu.out' for reading: No such file or directory
hm@localhost's password: … …

Solution: run chown to give the current user write permission on the installation directory:

sudo chown -hR Eddie hadoop-xxx   (Eddie is the current user name, hadoop-xxx the installed Hadoop directory)
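Once start-all.sh runs without permission errors, the daemons can be checked with jps (part of the JDK, so available once JAVA_HOME is on the PATH):

```bash
jps
# in a single-node (pseudo-distributed) setup you would expect to see, in some order:
#   NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker (plus Jps itself)
```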
