Quickly Building a Hadoop Cluster with Docker

2022-05-06 19:28:57

1. Creating the Image

1.1 Dockerfile

[root@hadron docker]# cd hadoop/
[root@hadron hadoop]# cat Dockerfile 
FROM centos7-ssh
ADD jdk-8u144-linux-x64.tar.gz /usr/local/
RUN mv /usr/local/jdk1.8.0_144 /usr/local/jdk1.8
ENV JAVA_HOME /usr/local/jdk1.8
ENV PATH $JAVA_HOME/bin:$PATH

ADD hadoop-2.7.4.tar.gz /usr/local
RUN mv /usr/local/hadoop-2.7.4 /usr/local/hadoop
ENV HADOOP_HOME /usr/local/hadoop
ENV PATH $HADOOP_HOME/bin:$PATH

RUN yum install -y which sudo
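
Before running the build, it can help to confirm that the archives referenced by the `ADD` instructions are actually in the build context, since a missing tarball fails the build several layers in. A minimal sketch (the file names are taken from the Dockerfile above; `check_file` is a hypothetical helper, not part of the article's setup):

```shell
#!/bin/sh
# Hypothetical helper: report whether a file needed by the build exists.
check_file() {
    if [ -f "$1" ]; then
        echo "$1: present"
    else
        echo "$1: MISSING"
    fi
}

# The Dockerfile ADDs these two archives, so both must sit in the build context
# next to the Dockerfile itself.
for f in Dockerfile jdk-8u144-linux-x64.tar.gz hadoop-2.7.4.tar.gz; do
    check_file "$f"
done
```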

1.2 Building the Image

[root@hadron hadoop]# docker build -t="hadoop" .
Sending build context to Docker daemon 452.2 MB
Step 1 : FROM centos7-ssh
 ---> 9fd1b9b60b8a
Step 2 : ADD jdk-8u144-linux-x64.tar.gz /usr/local/
 ---> 5f9ccbf28306
Removing intermediate container ead3b58b742c
Step 3 : RUN mv /usr/local/jdk1.8.0_144 /usr/local/jdk1.8
 ---> Running in fbb66c308560
 ---> a90f2adeeb43
Removing intermediate container fbb66c308560
Step 4 : ENV JAVA_HOME /usr/local/jdk1.8
 ---> Running in 2838722b055c
 ---> 8110b0338156
Removing intermediate container 2838722b055c
Step 5 : ENV PATH $JAVA_HOME/bin:$PATH
 ---> Running in 0a8469fb58c2
 ---> 6476d6abfc71
Removing intermediate container 0a8469fb58c2
Step 6 : ADD hadoop-2.7.4.tar.gz /usr/local
 ---> 171a1424d7bc
Removing intermediate container 9a3abffca38e
Step 7 : RUN mv /usr/local/hadoop-2.7.4 /usr/local/hadoop
 ---> Running in 0ec4bbd4c87e
 ---> 3a5f0c590232
Removing intermediate container 0ec4bbd4c87e
Step 8 : ENV HADOOP_HOME /usr/local/hadoop
 ---> Running in cf9c7b8d2be9
 ---> 0c55f791f81b
Removing intermediate container cf9c7b8d2be9
Step 9 : ENV PATH $HADOOP_HOME/bin:$PATH
 ---> Running in d19152ccdeaf
 ---> 1e90c8eeda4b
Removing intermediate container d19152ccdeaf
Step 10 : RUN yum install -y which sudo
 ---> Running in 711fc680e8ed
Loaded plugins: fastestmirror, ovl
Loading mirror speeds from cached hostfile
 * base: mirrors.163.com
 * extras: mirrors.163.com
 * updates: mirrors.cn99.com
Package sudo-1.8.6p7-23.el7_3.x86_64 already installed and latest version
Resolving Dependencies
--> Running transaction check
---> Package which.x86_64 0:2.20-7.el7 will be installed
--> Finished Dependency Resolution

Dependencies Resolved

================================================================================
 Package          Arch              Version               Repository       Size
================================================================================
Installing:
 which            x86_64            2.20-7.el7            base             41 k

Transaction Summary
================================================================================
Install  1 Package

Total download size: 41 k
Installed size: 75 k
Downloading packages:
Running transaction check
Running transaction test
Transaction test succeeded
Running transaction
  Installing : which-2.20-7.el7.x86_64                                      1/1 
install-info: No such file or directory for /usr/share/info/which.info.gz
  Verifying  : which-2.20-7.el7.x86_64                                      1/1 

Installed:
  which.x86_64 0:2.20-7.el7                                                     

Complete!
 ---> 8d5814823951
Removing intermediate container 711fc680e8ed
Successfully built 8d5814823951
[root@hadron hadoop]# 

2. Configuring IPs

Here /24 means a subnet mask of 255.255.255.0, and the IP after the @ is the gateway of the Docker host.

[root@hadron hadoop]# pipework br1 hadoop0 192.168.3.30/24@192.168.3.1
Link veth1pl8545 exists and is up
You have new mail in /var/spool/mail/root
[root@hadron hadoop]# ping 192.168.3.30
PING 192.168.3.30 (192.168.3.30) 56(84) bytes of data.
64 bytes from 192.168.3.30: icmp_seq=1 ttl=64 time=0.158 ms
64 bytes from 192.168.3.30: icmp_seq=2 ttl=64 time=0.068 ms
64 bytes from 192.168.3.30: icmp_seq=3 ttl=64 time=0.079 ms
^C
--- 192.168.3.30 ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 1999ms
rtt min/avg/max/mdev = 0.068/0.101/0.158/0.041 ms
[root@hadron hadoop]# pipework br1 hadoop1 192.168.3.31/24@192.168.3.1
Link veth1pl8687 exists and is up
[root@hadron hadoop]# ping 192.168.3.31
PING 192.168.3.31 (192.168.3.31) 56(84) bytes of data.
64 bytes from 192.168.3.31: icmp_seq=1 ttl=64 time=0.148 ms
64 bytes from 192.168.3.31: icmp_seq=2 ttl=64 time=0.070 ms
^C
--- 192.168.3.31 ping statistics ---
2 packets transmitted, 2 received, 0% packet loss, time 1000ms
rtt min/avg/max/mdev = 0.070/0.109/0.148/0.039 ms
[root@hadron hadoop]# pipework br1 hadoop2 192.168.3.32/24@192.168.3.1
Link veth1pl8817 exists and is up
[root@hadron hadoop]# ping 192.168.3.32
PING 192.168.3.32 (192.168.3.32) 56(84) bytes of data.
64 bytes from 192.168.3.32: icmp_seq=1 ttl=64 time=0.143 ms
64 bytes from 192.168.3.32: icmp_seq=2 ttl=64 time=0.038 ms
64 bytes from 192.168.3.32: icmp_seq=3 ttl=64 time=0.036 ms
^C
--- 192.168.3.32 ping statistics ---
3 packets transmitted, 3 received, 0% packet loss, time 2000ms
rtt min/avg/max/mdev = 0.036/0.072/0.143/0.050 ms
[root@hadron hadoop]# 
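
The three pipework invocations above follow one pattern, so they can be generated with a short loop instead of being typed by hand. A sketch that only prints the commands (the bridge, container names, and addresses are the ones used above; pipe the output to `sh` to actually run them):

```shell
#!/bin/sh
# Print the pipework command for one node; index $1 selects both the
# container name (hadoopN) and the host part of the address (192.168.3.3N).
pipework_cmd() {
    echo "pipework br1 hadoop$1 192.168.3.3$1/24@192.168.3.1"
}

for i in 0 1 2; do
    pipework_cmd "$i"
done
```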

3. Configuring the Hadoop Cluster

3.1 Connecting to the Containers

Open three new terminal windows and attach one to each of hadoop0, hadoop1, and hadoop2 to make the following steps easier.

(1) hadoop0

[root@hadron docker]# docker exec -it hadoop0 /bin/bash
[root@hadoop0 /]# ls
anaconda-post.log  bin  dev  etc  home  lib  lib64  lost+found  media  mnt  opt  proc  root  run  sbin  srv  sys  tmp  usr  var
[root@hadoop0 /]# pwd
/
[root@hadoop0 /]# 

(2) hadoop1

[root@hadron docker]# docker exec -it hadoop1 /bin/bash
[root@hadoop1 /]# ls
anaconda-post.log  bin  dev  etc  home  lib  lib64  lost+found  media  mnt  opt  proc  root  run  sbin  srv  sys  tmp  usr  var
[root@hadoop1 /]# 

(3) hadoop2

[root@hadoop2 /]# ls                                                                                                                                                        
anaconda-post.log  bin  dev  etc  home  lib  lib64  lost+found  media  mnt  opt  proc  root  run  sbin  srv  sys  tmp  usr  var
[root@hadoop2 /]# 

3.2 Configuring /etc/hosts

[root@hadoop0 /]# vi /etc/hosts
[root@hadoop0 /]# cat /etc/hosts
127.0.0.1   localhost
::1 localhost ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
192.168.3.30    hadoop0
192.168.3.31    hadoop1
192.168.3.32    hadoop2
[root@hadoop0 /]#

Distribute the hosts file to the other nodes:

[root@hadoop0 /]# scp /etc/hosts hadoop1:/etc
The authenticity of host 'hadoop1 (192.168.3.31)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'hadoop1,192.168.3.31' (RSA) to the list of known hosts.
root@hadoop1's password: 
hosts                                                                                                                                      100%  213     0.2KB/s   00:00    
[root@hadoop0 /]# scp /etc/hosts hadoop2:/etc
The authenticity of host 'hadoop2 (192.168.3.32)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'hadoop2,192.168.3.32' (RSA) to the list of known hosts.
root@hadoop2's password: 
hosts                                                                                                                                      100%  213     0.2KB/s   00:00    
[root@hadoop0 /]# 
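
The three cluster entries can also be generated non-interactively instead of edited with vi. A sketch that only prints the block to append (the addresses match section 2; on each node, redirect the output with `>> /etc/hosts` to apply it):

```shell
#!/bin/sh
# Emit the hadoop0-hadoop2 entries added to /etc/hosts above.
cluster_hosts() {
    for i in 0 1 2; do
        echo "192.168.3.3$i    hadoop$i"
    done
}

cluster_hosts
```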

3.3 Passwordless SSH Login

(1) hadoop0

[root@hadoop0 /]# cd ~
[root@hadoop0 ~]# vi sshUtil.sh 
[root@hadoop0 ~]# cat sshUtil.sh 
#!/bin/bash
ssh-keygen -q -t rsa -N "" -f /root/.ssh/id_rsa
ssh-copy-id -i localhost
ssh-copy-id -i hadoop0
ssh-copy-id -i hadoop1
ssh-copy-id -i hadoop2
[root@hadoop0 ~]# sh sshUtil.sh 
The authenticity of host 'localhost (::1)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? yes
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/usr/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
root@localhost's password: 

Number of key(s) added: 1

Now try logging into the machine, with:   "ssh 'localhost'"
and check to make sure that only the key(s) you wanted were added.

The authenticity of host 'hadoop0 (192.168.3.30)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? yes
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed

/usr/bin/ssh-copy-id: WARNING: All keys were skipped because they already exist on the remote system.

/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/usr/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
root@hadoop1's password: 

Number of key(s) added: 1

Now try logging into the machine, with:   "ssh 'hadoop1'"
and check to make sure that only the key(s) you wanted were added.

/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/usr/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
root@hadoop2's password: 

Number of key(s) added: 1

Now try logging into the machine, with:   "ssh 'hadoop2'"
and check to make sure that only the key(s) you wanted were added.

[root@hadoop0 ~]# scp sshUtil.sh hadoop1:/root
sshUtil.sh                                                                                                                                 100%  154     0.2KB/s   00:00    
[root@hadoop0 ~]# scp sshUtil.sh hadoop2:/root
sshUtil.sh                                                                                                                                 100%  154     0.2KB/s   00:00    
[root@hadoop0 ~]#
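
As the transcripts show, sshUtil.sh still stops at every host-key "yes/no" question. One way to suppress those prompts is to pass `-o StrictHostKeyChecking=no` through ssh-copy-id to ssh. A sketch that only prints the resulting commands (the host list matches the script above; disabling host-key checking is a trade-off, tolerable here only because the cluster is local and throwaway):

```shell
#!/bin/sh
# Print a prompt-reduced variant of the commands in sshUtil.sh.
# -o is forwarded by ssh-copy-id to ssh, suppressing the yes/no host-key question.
ssh_setup_cmds() {
    echo 'ssh-keygen -q -t rsa -N "" -f /root/.ssh/id_rsa'
    for host in localhost hadoop0 hadoop1 hadoop2; do
        echo "ssh-copy-id -o StrictHostKeyChecking=no -i /root/.ssh/id_rsa.pub root@$host"
    done
}

ssh_setup_cmds
```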

(2) hadoop1

[root@hadoop1 /]# cd ~
[root@hadoop1 ~]# sh sshUtil.sh 
The authenticity of host 'localhost (::1)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? yes
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/usr/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
root@localhost's password: 

Number of key(s) added: 1

Now try logging into the machine, with:   "ssh 'localhost'"
and check to make sure that only the key(s) you wanted were added.

The authenticity of host 'hadoop0 (192.168.3.30)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? yes
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/usr/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
root@hadoop0's password: 

Number of key(s) added: 1

Now try logging into the machine, with:   "ssh 'hadoop0'"
and check to make sure that only the key(s) you wanted were added.

The authenticity of host 'hadoop1 (192.168.3.31)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? yes
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed

/usr/bin/ssh-copy-id: WARNING: All keys were skipped because they already exist on the remote system.

The authenticity of host 'hadoop2 (192.168.3.32)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? 123456
Please type 'yes' or 'no': yes
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/usr/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
root@hadoop2's password: 

Number of key(s) added: 1

Now try logging into the machine, with:   "ssh 'hadoop2'"
and check to make sure that only the key(s) you wanted were added.

[root@hadoop1 ~]# 

(3) hadoop2

[root@hadoop2 /]# sh /root/sshUtil.sh 
The authenticity of host 'localhost (::1)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? yes
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/usr/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
root@localhost's password: 

Number of key(s) added: 1

Now try logging into the machine, with:   "ssh 'localhost'"
and check to make sure that only the key(s) you wanted were added.

The authenticity of host 'hadoop0 (192.168.3.30)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? yes
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/usr/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
root@hadoop0's password: 

Number of key(s) added: 1

Now try logging into the machine, with:   "ssh 'hadoop0'"
and check to make sure that only the key(s) you wanted were added.

The authenticity of host 'hadoop1 (192.168.3.31)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? yes
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/usr/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
root@hadoop1's password: 

Number of key(s) added: 1

Now try logging into the machine, with:   "ssh 'hadoop1'"
and check to make sure that only the key(s) you wanted were added.

The authenticity of host 'hadoop2 (192.168.3.32)' can't be established.
RSA key fingerprint is 53:74:e1:1e:26:63:bb:14:c2:42:94:b6:63:ec:83:15.
Are you sure you want to continue connecting (yes/no)? yes
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed

/usr/bin/ssh-copy-id: WARNING: All keys were skipped because they already exist on the remote system.

[root@hadoop2 /]# 

3.4 Restarting the Containers

[root@hadron ~]# docker stop hadoop2
hadoop2
[root@hadron ~]# docker stop hadoop1
hadoop1
[root@hadron ~]# docker stop hadoop0
hadoop0
[root@hadron ~]# docker ps -a
CONTAINER ID        IMAGE               COMMAND               CREATED             STATUS                      PORTS               NAMES
c0adf86c7b54        hadoop              "/usr/sbin/sshd -D"   5 days ago          Exited (0) 23 seconds ago                       hadoop2
c480310285cd        hadoop              "/usr/sbin/sshd -D"   5 days ago          Exited (0) 7 seconds ago                        hadoop1
5dc1bd0178b4        hadoop              "/usr/sbin/sshd -D"   5 days ago          Exited (0) 3 seconds ago                        hadoop0
f5a002ad0f0e        centos7-ssh         "/usr/sbin/sshd -D"   5 days ago          Exited (0) 5 days ago                           centos7-demo
[root@hadron ~]# 
[root@hadron ~]# docker start hadoop0
hadoop0
[root@hadron ~]# docker ps -a
CONTAINER ID        IMAGE               COMMAND               CREATED             STATUS                          PORTS                                                                     NAMES
c0adf86c7b54        hadoop              "/usr/sbin/sshd -D"   5 days ago          Exited (0) About a minute ago                                                                             hadoop2
c480310285cd        hadoop              "/usr/sbin/sshd -D"   5 days ago          Exited (0) About a minute ago                                                                             hadoop1
5dc1bd0178b4        hadoop              "/usr/sbin/sshd -D"   5 days ago          Up 1 seconds                    0.0.0.0:8088->8088/tcp, 0.0.0.0:50070->50070/tcp, 0.0.0.0:32774->22/tcp   hadoop0
f5a002ad0f0e        centos7-ssh         "/usr/sbin/sshd -D"   5 days ago          Exited (0) 5 days ago                                                                                     centos7-demo
[root@hadron ~]# 
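
Stopping and starting the three containers one by one can likewise be scripted. A sketch that only prints the docker commands, stopping in reverse order and starting hadoop0 first as above (pipe to `sh` on the host to execute; hadoop1 and hadoop2 are included in the start loop even though the transcript only shows hadoop0 being started):

```shell
#!/bin/sh
# Print docker stop/start commands for the whole cluster.
restart_cmds() {
    for name in hadoop2 hadoop1 hadoop0; do
        echo "docker stop $name"
    done
    for name in hadoop0 hadoop1 hadoop2; do
        echo "docker start $name"
    done
}

restart_cmds
```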
