Hadoop does not ship a pre-built 64-bit release, so a 64-bit version has to be compiled from source. Learning a technology starts with installation; learning Hadoop starts with compilation.
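A quick way to see the problem on a 64-bit machine is to inspect the native library bundled with the stock binary tarball (the path below assumes the official hadoop-2.6.0 release layout); if the command reports a 32-bit ELF object, a 64-bit JVM cannot load it and the native libraries have to be rebuilt:
file hadoop-2.6.0/lib/native/libhadoop.so.1.0.0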
1. Operating system build environment
yum install cmake lzo-devel zlib-devel gcc gcc-c++ autoconf automake libtool ncurses-devel openssl-devel libXtst
2. Install the JDK
Download JDK 1.7. Note that only 1.7 can be used; other versions cause build errors: http://www.oracle.com/technetwork/java/javase/downloads/jdk7-downloads-1880260.html
tar zxvf jdk-7u75-linux-x64.tar.gz -C /app
Set the JDK environment variables (add them to /etc/profile as well so they persist; the final PATH is summarized in step 6):
export JAVA_HOME=/app/jdk1.7.0_75
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar
PATH=$PATH:$JAVA_HOME/bin
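With these variables exported in the current shell, a quick check that the 1.7 JDK is the one being picked up:
java -version
$JAVA_HOME/bin/javac -version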
3. Install protobuf
Download protobuf-2.5.0; a newer version will not work and the Hadoop build will fail:
wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
tar xvf protobuf-2.5.0.tar.gz
cd protobuf-2.5.0
./configure
make
make install
ldconfig
protoc --version
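The expected output is "libprotoc 2.5.0". If a different version prints, another protoc earlier on the PATH is shadowing the one just installed (the build above installs to /usr/local by default); checking which binary is being used:
which protoc
protoc --version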
4. Install Ant
wget http://mirror.bit.edu.cn/apache/ant/binaries/apache-ant-1.9.4-bin.tar.gz
tar zxvf apache-ant-1.9.4-bin.tar.gz -C /app
vi /etc/profile
export ANT_HOME=/app/apache-ant-1.9.4
PATH=$PATH:$ANT_HOME/bin
5. Install Maven
wget http://mirror.bit.edu.cn/apache/maven/maven-3/3.3.1/binaries/apache-maven-3.3.1-bin.tar.gz
tar zxvf apache-maven-3.3.1-bin.tar.gz -C /app
vi /etc/profile
export MAVEN_HOME=/app/apache-maven-3.3.1
export PATH=$PATH:$MAVEN_HOME/bin
Edit the configuration file:
vi /app/apache-maven-3.3.1/conf/settings.xml
Change the Maven repository mirror by adding the following inside <mirrors></mirrors>:
<mirror>
  <id>nexus-osc</id>
  <mirrorOf>*</mirrorOf>
  <name>Nexusosc</name>
  <url>http://maven.oschina.net/content/groups/public/</url>
</mirror>
Inside <profiles></profiles>, add:
<profile>
  <id>jdk-1.7</id>
  <activation>
    <jdk>1.7</jdk>
  </activation>
  <repositories>
    <repository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>nexus</id>
      <name>local private nexus</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
      <releases>
        <enabled>true</enabled>
      </releases>
      <snapshots>
        <enabled>false</enabled>
      </snapshots>
    </pluginRepository>
  </pluginRepositories>
</profile>
6. Install FindBugs (optional)
wget -O findbugs-3.0.1.tar.gz http://prdownloads.sourceforge.net/findbugs/findbugs-3.0.1.tar.gz?download
tar zxvf findbugs-3.0.1.tar.gz -C /app
vi /etc/profile
export FINDBUGS_HOME=/app/findbugs-3.0.1
PATH=$PATH:$FINDBUGS_HOME/bin
export PATH
Note: in the end, the PATH setting in /etc/profile should look like this:
PATH=$PATH:$JAVA_HOME/bin:$ANT_HOME/bin:$MAVEN_HOME/bin:$FINDBUGS_HOME/bin
export PATH
Run the following in the shell to make the environment variables take effect:
. /etc/profile
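Before kicking off the long build, it is worth a quick sanity pass to confirm that every tool installed above now resolves from the PATH (exact version strings will differ depending on your packages):
gcc --version
cmake --version
ant -version
mvn -version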
7. Build Hadoop 2.6.0
wget http://mirror.bit.edu.cn/apache/hadoop/core/hadoop-2.6.0/hadoop-2.6.0-src.tar.gz
tar zxvf hadoop-2.6.0-src.tar.gz
cd hadoop-2.6.0-src
mvn package -DskipTests -Pdist,native -Dtar
A successful build ends with a reactor summary like this:
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 4.401 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 3.864 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 7.591 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.535 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 3.585 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 6.623 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 4.722 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 7.787 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 5.500 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [02:47 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 12.793 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 20.443 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.111 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [04:35 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 29.896 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 11.100 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 8.262 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.069 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.066 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [02:05 min]
[INFO] hadoop-yarn-common ................................. SUCCESS [ 46.132 s]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.123 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 19.166 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 25.552 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 5.456 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 11.781 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 30.557 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 9.765 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 14.016 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.101 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 4.116 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 2.993 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.093 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [ 9.036 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 6.557 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.267 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 36.775 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 28.049 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 7.285 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 17.333 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 15.283 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 7.110 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 3.843 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 12.559 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 6.331 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 45.863 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 46.304 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 3.575 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 12.991 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 10.105 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 5.021 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 3.804 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 5.298 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 10.290 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 9.220 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [11:12 min]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 10.714 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 0.143 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 7.664 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 29.970 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.057 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 49.425 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 32:26 min
[INFO] Finished at: 2015-03-19T19:56:40+08:00
[INFO] Final Memory: 99M/298M
[INFO] ------------------------------------------------------------------------
After a successful build, the packaged distribution is placed under hadoop-dist/target:
# ls
antrun  dist-tar-stitching.sh  hadoop-2.6.0.tar.gz  hadoop-dist-2.6.0-javadoc.jar  maven-archiver  dist-layout-stitching.sh  hadoop-2.6.0  hadoop-dist-2.6.0.jar  javadoc-bundle-options  test-dir
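To confirm the freshly built native libraries really are 64-bit, run file on the shared object inside the new distribution (it should report a 64-bit ELF object) and use hadoop checknative -a to list which native components were compiled in (paths are relative to hadoop-dist/target):
cd hadoop-dist/target/hadoop-2.6.0
file lib/native/libhadoop.so.1.0.0
bin/hadoop checknative -a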