Chapter 13 · Kibana Deep Dive: Using Maps to Visualize Client IPs

2022-09-26 11:28:44

  • GeoIP database
  • Configuring Logstash to use the GeoIP database
  • Configuring Kibana to use the map

-Zeng Laoshi (曾老湿), known around the community as Zeng Laoda.

-Years of internet operations experience; previously responsible for automated operations of large-scale cluster architectures.
-Specializes in web cluster architecture and operations automation; formerly ran operations for a large domestic financial company.
-DevOps project manager and DBA.
-Developed an automated operations platform with the following features:
1) Integrated public-cloud APIs for self-service VM creation.
2) Automated log collection with ELK.
3) Unified configuration management with SaltStack.
4) Automated code deployment and testing with Git and Jenkins.
5) Bastion host for Linux/Windows access with audit logging.
6) SQL execution and approval workflow.
7) Web UI for slow-query log analysis.


GeoIP Database

In the ELK stack we can use a GeoIP database to analyze the client IPs in our logs. Within the stack, only Logstash can perform this lookup, while the charting is done by Kibana. So we first download the GeoIP database file, then configure Logstash's geoip filter to resolve each access IP, and finally render the results in Kibana on a map of China or of the world.


Downloading the GeoIP database

Download URL for Logstash 2.x: http://geolite.maxmind.com/download/geoip/database/GeoLiteCity.dat.gz

Download URL for Logstash 5.x: http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.tar.gz

(Note: MaxMind has since put GeoLite2 downloads behind a free account and license key, so these direct links may no longer work.)

#Enter the Logstash directory
[root@elkstack03 ~]# cd /etc/logstash/
#Download the GeoIP database
[root@elkstack03 logstash]# wget http://geolite.maxmind.com/download/geoip/database/GeoLite2-City.tar.gz
#Unpack the database archive
[root@elkstack03 logstash]# tar xf GeoLite2-City.tar.gz
#List the files
[root@elkstack03 logstash]# ll
total 28784
drwxrwxr-x 2 root root     4096 Apr 11 11:36 conf.d
drwxr-xr-x 2 2000 2000     4096 Apr  8 20:07 GeoLite2-City_20190409
-rw-r--r-- 1 root root 29444833 Apr  9 15:32 GeoLite2-City_20190409.tar.gz
-rw-rw-r-- 1 root root     1738 Mar 23  2017 jvm.options
-rw-rw-r-- 1 root root     1334 Mar 23  2017 log4j2.properties
-rw-rw-r-- 1 root root     4484 Mar  5 17:35 logstash.yml
-rw-rw-r-- 1 root root     1659 Mar 23  2017 startup.options

Configuring Logstash to Use the GeoIP Database

Configuring Logstash

#Enter the Logstash config directory
[root@elkstack03 logstash]# cd /etc/logstash/conf.d/
#Edit the Logstash config file
[root@elkstack03 conf.d]# vim redis_es_ip.conf
input {
  redis {
    host => "10.0.0.54"
    port => "6379"
    db => "3"
    key => "all"
    data_type => "list"
    password => "zls"
 }
}

filter {
        json {
            source => "message"
            remove_field => ["message"]
        }
        geoip {
                source => "clientip"
                target => "geoip"
                database => "/etc/logstash/GeoLite2-City_20190409/GeoLite2-City.mmdb"
                add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
                add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
        }
    mutate {
      convert => [ "[geoip][coordinates]", "float"]
        }
}

output {
    elasticsearch {
      hosts => ["10.0.0.51:9200"]
      index => "%{type}-%{+YYYY.MM.dd}"
    }
}
#Start Logstash
[root@elkstack03 ~]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/redis_es_ip.conf &
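To make the geoip block above easier to follow: it appends a `[geoip][coordinates]` array with longitude first and latitude second (the order Elasticsearch expects for a geo_point array), and the mutate converts both to floats. The Python sketch below simulates that transformation on one event; the lookup values are made up for illustration and this is not Logstash's actual implementation:

```python
# Sketch of what the geoip + mutate filters do to one event.
# Coordinates are appended longitude first, then latitude,
# because Elasticsearch geo_point arrays use [lon, lat] order.

def enrich_event(event, lookup):
    """Simulate the geoip filter: resolve clientip and build coordinates."""
    geo = lookup(event["clientip"])      # e.g. {"longitude": ..., "latitude": ...}
    geo["coordinates"] = [               # the two add_field lines append in order:
        float(geo["longitude"]),         # longitude first
        float(geo["latitude"]),          # latitude second (mutate => float)
    ]
    event["geoip"] = geo
    return event

# Hypothetical lookup result for a Beijing IP (values for illustration only)
def fake_lookup(ip):
    return {"longitude": 116.3883, "latitude": 39.9289, "city_name": "Beijing"}

event = enrich_event({"clientip": "222.28.0.112"}, fake_lookup)
print(event["geoip"]["coordinates"])  # [116.3883, 39.9289]
```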

#Since this is a single-node lab environment and the logs contain no public IPs, we append test entries with public IPs by hand
#Beijing public IP
[root@elkstack03 conf.d]# echo '{"@timestamp":"2019-04-11T20:27:25+08:00","host":"222.28.0.112","clientip":"222.28.0.112","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"www.elk.com","url":"/index.html","domain":"www.elk.com","xff":"10.0.0.1","referer":"-","status":"304"}' >> /usr/local/nginx/logs/access_json.log

#Hainan public IP
[root@elkstack03 conf.d]# echo '{"@timestamp":"2019-04-11T20:40:24+08:00","host":"124.225.0.13","clientip":"124.225.0.13","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"www.elk.com","url":"/index.html","domain":"www.elk.com","xff":"10.0.0.1","referer":"-","status":"304"}' >> /usr/local/nginx/logs/access_json.log

#Jilin public IP
[root@elkstack03 conf.d]# echo '{"@timestamp":"2019-04-11T20:45:24+08:00","host":"124.234.0.12","clientip":"124.234.0.12","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"www.elk.com","url":"/index.html","domain":"www.elk.com","xff":"10.0.0.1","referer":"-","status":"304"}' >> /usr/local/nginx/logs/access_json.log

#Heilongjiang public IP
[root@elkstack03 conf.d]# echo '{"@timestamp":"2019-04-11T20:46:24+08:00","host":"123.164.0.18","clientip":"123.164.0.18","size":0,"responsetime":0.000,"upstreamtime":"-","upstreamhost":"-","http_host":"www.elk.com","url":"/index.html","domain":"www.elk.com","xff":"10.0.0.1","referer":"-","status":"304"}' >> /usr/local/nginx/logs/access_json.log
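The four echo commands above all follow the same JSON layout; a small helper script (hypothetical, just to avoid hand-editing the JSON for each test IP) can generate such lines:

```python
import json
from datetime import datetime, timezone, timedelta

def access_log_line(clientip, timestamp=None):
    """Build one JSON access-log entry in the same shape as the nginx log above."""
    if timestamp is None:
        # Timestamps in the log use UTC+8 (China Standard Time)
        timestamp = datetime.now(timezone(timedelta(hours=8))).isoformat()
    return json.dumps({
        "@timestamp": timestamp,
        "host": clientip,
        "clientip": clientip,
        "size": 0,
        "responsetime": 0.000,
        "upstreamtime": "-",
        "upstreamhost": "-",
        "http_host": "www.elk.com",
        "url": "/index.html",
        "domain": "www.elk.com",
        "xff": "10.0.0.1",
        "referer": "-",
        "status": "304",
    })

# Append a Beijing test IP to the log file, e.g.:
# with open("/usr/local/nginx/logs/access_json.log", "a") as f:
#     f.write(access_log_line("222.28.0.112") + "\n")
print(access_log_line("222.28.0.112"))
```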

Verifying the data in Kibana

Open a browser and go to: http://10.0.0.54:5601

Beijing public IP

Hainan public IP

Jilin public IP

Heilongjiang public IP

Configuring Kibana to Use the Map

Drawing a map of China in Kibana

As shown, Kibana reports the error: "No Compatible Fields: The "[blog.driverzeng.com -]YYYY.MM.DD" index pattern does not contain any of the following field types: geo_point"

Cause: Logstash writes the logs to Elasticsearch under an index named [blog.driverzeng.com -]YYYY-MM. In Elasticsearch, every field has a type, and the type determines which operations the field supports. The geoip location field holds longitude and latitude, which we need in order to place points on a map; to use Elasticsearch's geo queries on those coordinates, the field must be mapped with type geo_point. This error therefore means that our geoip location field is not mapped as geo_point.

We can verify this as follows:

[root@elkstack01 ~]# curl -XGET http://10.0.0.51:9200/blog.driverzeng.com-2019.04.11/_mapping/
{"blog.driverzeng.com-2019.04.11":{"mappings":{"blog.driverzeng.com":{"properties":{"@timestamp":{"type":"date"},"@version":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"beat":{"properties":{"hostname":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"name":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"version":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}}}},"clientip":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"domain":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"geoip":{"properties":{"city_name":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"continent_code":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"coordinates":{"type":"float"},"country_code2":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"country_code3":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"country_name":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"ip":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"latitude":{"type":"float"},"location":{"type":"float"},"longitude":{"type":"float"},"region_code":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"region_name":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"timezone":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}}}},"host":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"http_host":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"input_type":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"offset":{"type":"long"},"referer":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"responsetime":{"type":"float"},"size":{"type":"long"},"source":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"status":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"type":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"upstreamhost":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"upstreamtime":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"url":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}},"xff":{"type":"text","fields":{"keyword":{"type":"keyword","ignore_above":256}}}}}}}}

Note "location":{"type":"float"}: the field type is float rather than geo_point, which is exactly why Kibana reports the error above.
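Digging the field type out of that wall of JSON can be scripted. The sketch below uses a trimmed stand-in for the real curl response (only the relevant branch is kept), with the index and type names taken from the output above:

```python
def location_type(mapping, index, doc_type):
    """Return the mapped type of the geoip.location field from a _mapping response."""
    props = mapping[index]["mappings"][doc_type]["properties"]
    return props["geoip"]["properties"]["location"]["type"]

# Trimmed stand-in for the curl response above (only the relevant part)
mapping = {
    "blog.driverzeng.com-2019.04.11": {
        "mappings": {
            "blog.driverzeng.com": {
                "properties": {
                    "geoip": {
                        "properties": {
                            "location": {"type": "float"}  # should be geo_point
                        }
                    }
                }
            }
        }
    }
}

t = location_type(mapping, "blog.driverzeng.com-2019.04.11", "blog.driverzeng.com")
print(t)  # float
```

Any value other than geo_point here means the map visualization cannot use the field.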

Solution: Elasticsearch supports predefining index settings and mappings via templates (assuming your Elasticsearch version supports this API, which virtually all do). Logstash already ships a default predefined template, so we only need to use it. Why the error, then? Because the default template is applied only to indices matching logstash-*; since our Logstash config used the index pattern [blog.driverzeng.com -]YYYY.MM.DD, the template never matched. The fix is simply to change the index name:

input {
  redis {
    host => "10.0.0.54"
    port => "6379"
    db => "3"
    key => "all"
    data_type => "list"
    password => "zls"
 }
}

filter {
        json {
            source => "message"
            remove_field => ["message"]
        }
        geoip {
                source => "clientip"
                target => "geoip"
                database => "/etc/logstash/GeoLite2-City_20190409/GeoLite2-City.mmdb"
                add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
                add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
        }
    mutate {
      convert => [ "[geoip][coordinates]", "float"]
        }
}

output {
    elasticsearch {
      hosts => ["10.0.0.51:9200"]
      index => "logstash-%{type}-%{+YYYY.MM.dd}"
    }
}

That is, change the Elasticsearch output index from index => "%{type}-%{+YYYY.MM.dd}" to index => "logstash-%{type}-%{+YYYY.MM.dd}".
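The logstash- prefix works because Logstash's bundled Elasticsearch index template maps geoip.location as geo_point for indices matching logstash-*. The fragment below is an illustrative sketch of that part of such a template, not the exact default template (contents vary by Logstash/Elasticsearch version):

```python
import json

# Illustrative fragment of an index template that maps geoip.location
# as geo_point for any index whose name matches logstash-*.
# (Logstash's bundled default template is similar; exact contents vary by version.)
template = {
    "template": "logstash-*",
    "mappings": {
        "_default_": {
            "properties": {
                "geoip": {
                    "dynamic": True,
                    "properties": {
                        "location": {"type": "geo_point"}
                    }
                }
            }
        }
    }
}

# A body like this could be registered with, e.g.:
#   curl -XPUT http://10.0.0.51:9200/_template/logstash -d '<the JSON below>'
print(json.dumps(template, indent=2))
```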

Restart Logstash, then log into Kibana and refresh.

[root@elkstack03 conf.d]# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/redis_es_ip.conf &

Check Kibana again

Continue building the map visualization

Depending on your preference, you can also render it as a heat map.

Save the visualization; it can then be added to a Dashboard.
