Elasticsearch Logstash Kibana Notes
The Elastic Stack is a set of open-source tools for data collection, enrichment, storage, analysis, and visualization. It is often called the ELK Stack, after Elasticsearch, Logstash, and Kibana.
Elasticsearch is a distributed, open-source search and analytics engine for all types of data, including textual, numerical, geospatial, structured, and unstructured data. Roughly analogous to MySQL, it is where the data is stored and analyzed.
Logstash aggregates and processes data and ships it to Elasticsearch. It is an open-source, server-side data processing pipeline that ingests data from multiple sources at once, enriches and transforms it, and then indexes it into Elasticsearch. Think of it as the program that extracts, parses, and stores data into ES.
Kibana is a data visualization and management tool for Elasticsearch, playing a role similar to Navicat when working with MySQL.
In short: Logstash collects and parses logs and writes them to ES, Elasticsearch stores the data and answers analytic queries, and Kibana is used to view the results.
Elasticsearch download: https://www.elastic.co/cn/downloads/elasticsearch
Logstash download: https://www.elastic.co/cn/downloads/logstash
Kibana download: https://www.elastic.co/cn/downloads/kibana
Filebeat (installation covered in its own section below)
Elasticsearch 7.x requires JDK 11 or later. Check the installed version:
ZBMAC-C02PGMT0F:~ weikeqin1$ java -version
java version "14" 2020-03-17
Java(TM) SE Runtime Environment (build 14+36-1461)
Java HotSpot(TM) 64-Bit Server VM (build 14+36-1461, mixed mode, sharing)
ZBMAC-C02PGMT0F:~ weikeqin1$
Elasticsearch installation, configuration, and startup
Download
wget https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-7.7.0-linux-x86_64.tar.gz
tar -xzf elasticsearch-7.7.0-linux-x86_64.tar.gz
cd elasticsearch-7.7.0/
Configuration
Edit
elasticsearch-7.7.0/config/elasticsearch.yml
action.destructive_requires_name: true
xpack.ml.enabled: true
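For reference, a few other settings that are often adjusted in elasticsearch.yml for a single-node development setup (the values below are illustrative, not taken from the original notes):
cluster.name: my-es-cluster     # nodes sharing this name join the same cluster
node.name: node-1               # name of this node
network.host: 127.0.0.1         # bind address; use a non-loopback address to allow remote access
http.port: 9200                 # HTTP port (9200 is the default)
discovery.type: single-node     # skip production bootstrap checks for a one-node cluster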
Start
cd elasticsearch-7.7.0/
./bin/elasticsearch
To start in the background, use
./bin/elasticsearch -d
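If you later need to stop the daemon, -d can be combined with -p to record the process ID (both flags belong to the standard elasticsearch launcher):
./bin/elasticsearch -d -p es.pid     # start in the background and write the PID to es.pid
kill $(cat es.pid)                   # stop it later using the recorded PID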
Verify the startup
Open http://localhost:9200/?pretty in a browser, or run
curl 'http://localhost:9200/?pretty'
{
"name": "192.168.0.110",
"cluster_name": "elasticsearch",
"cluster_uuid": "aHsnglvYQvSrJeoFR70mCQ",
"version": {
"number": "7.7.0",
"build_flavor": "default",
"build_type": "tar",
"build_hash": "81a1e9eda8e6183f5237786246f6dced26a10eaf",
"build_date": "2020-05-16T02:01:37.602180Z",
"build_snapshot": false,
"lucene_version": "8.5.1",
"minimum_wire_compatibility_version": "6.8.0",
"minimum_index_compatibility_version": "6.0.0-beta1"
},
"tagline": "You Know, for Search"
}
If you see a response like the above, Elasticsearch has started successfully.
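As an extra smoke test you can index and search a document with curl (the index name test-index and the document body are just placeholders):
curl -X PUT 'http://localhost:9200/test-index/_doc/1?pretty' -H 'Content-Type: application/json' -d '{"message": "hello elasticsearch"}'
curl 'http://localhost:9200/test-index/_search?q=message:hello&pretty'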
Kibana installation, configuration, and startup
Download
wget https://artifacts.elastic.co/downloads/kibana/kibana-7.7.0-linux-x86_64.tar.gz
tar -zxf kibana-7.7.0-linux-x86_64.tar.gz
cd kibana-7.7.0-linux-x86_64
Configuration
Edit
kibana-7.7.0-linux-x86_64/config/kibana.yml
# Kibana is served by a back end server. This setting specifies the port to use.
server.port: 5601
# Specifies the address to which the Kibana server will bind. IP addresses and host names are both valid values.
# The default is 'localhost', which usually means remote machines will not be able to connect.
# To allow connections from remote users, set this parameter to a non-loopback address.
server.host: "localhost"
# The URLs of the Elasticsearch instances to use for all your queries.
elasticsearch.hosts: ["http://localhost:9200"]
Start
cd kibana-7.7.0-linux-x86_64/
./bin/kibana
To run in the background, use
./bin/kibana &
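A bare & still ties Kibana to the current terminal session; if it should keep running after the terminal closes, a common pattern is (the log file name is arbitrary):
nohup ./bin/kibana > kibana.log 2>&1 &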
Verify the startup
Open http://127.0.0.1:5601/ in a browser; if the page loads, Kibana has started successfully.
Logstash installation, configuration, and startup
Download
wget https://artifacts.elastic.co/downloads/logstash/logstash-7.7.0.tar.gz
tar -zxf logstash-7.7.0.tar.gz
cd logstash-7.7.0
Configuration
Create a new configuration file under the config directory:
logstash-wkq-test.conf
# Sample Logstash configuration for creating a simple
# file -> Logstash -> Elasticsearch pipeline.
input {
file {
path => [ "/Users/weikeqin1/WorkSpaces/java/springboot-test/logs/entrance.log", "/Users/weikeqin1/WorkSpaces/java/springboot-test/logs/*.log" ]
type => "entrance"
start_position => "beginning"
}
file {
path => [ "/Users/weikeqin1/WorkSpaces/java/springboot-test/logs/error*.log" ]
type => "error"
start_position => "beginning"
}
file {
path => "/var/log/apache/access.log"
type => "apache"
start_position => "beginning"
}
}
filter {
if [path] =~ "entrance" {
mutate { replace => { "type" => "entrance_log" } }
grok {
match => { "msg" => "%{COMBINEDAPACHELOG}" }
}
} else if [path] =~ "error" {
mutate { replace => { "type" => "error_log" } }
}
date {
match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss" ]
}
}
output {
elasticsearch {
hosts => ["http://localhost:9200"]
index => "log-%{+YYYY.MM.dd}"
#user => "elastic"
#password => "changeme"
}
}
The grok block in the configuration file parses fields out of the log lines.
Sample log line
2020-05-17 11:14:22.887 INFO [XNIO-1 task-9] cn.wkq.controller.TestController.test|{"_traceId":"1589685262886CY74","_method":"test","param":{"appKey":"wkq"}}
Parsing rules (two alternatives; the second additionally extracts traceId, method, and param from the JSON payload)
%{DATA:logDate} %{DATA:logTime} %{DATA:logLevel} \[%{DATA:thread}\]%{DATA:classnameMethod}\|\{%{DATA:msg}\}
%{DATA:logDate} %{DATA:logTime} %{DATA:logLevel} \[%{DATA:thread}\] %{DATA:classnameMethod}\|\{\"_traceId\":\"%{DATA:traceId}\"\,\"_method\":\"%{DATA:method}\",\"param\"\:\{%{DATA:param}\}
These patterns can be debugged in Kibana's Grok Debugger at http://127.0.0.1:5601/app/kibana#/dev_tools/grokdebugger
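Assuming the first pattern above is the one to apply, wiring it into a pipeline would look roughly like this (a sketch only; note it matches against the standard message field that the file input produces):
filter {
  grok {
    # parse the application log line into date/time/level/thread/class fields
    match => { "message" => "%{DATA:logDate} %{DATA:logTime} %{DATA:logLevel} \[%{DATA:thread}\]%{DATA:classnameMethod}\|\{%{DATA:msg}\}" }
  }
}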
Start
./bin/logstash -f ./config/logstash-wkq-test.conf
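Two standard Logstash flags are useful while iterating on the config: one validates the file without starting the pipeline, the other reloads it automatically when it changes.
./bin/logstash -f ./config/logstash-wkq-test.conf --config.test_and_exit     # syntax-check the config and exit
./bin/logstash -f ./config/logstash-wkq-test.conf --config.reload.automatic  # pick up config edits without restarting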
Testing
Stress testing the application with a load-testing tool generates a large volume of logs. Watching resource monitoring during the test, the application itself used only about 75 MB of memory, while Logstash was allocated roughly 1000 MB (about 500 MB actually in use), and Elasticsearch with its default configuration was also allocated roughly 1000 MB (about 750 MB actually in use).
Logstash uses far too much memory for this workload.
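One partial mitigation is to shrink the Logstash JVM heap in config/jvm.options (1g is the 7.x default; 256m below is only an example for a light workload):
# logstash-7.7.0/config/jvm.options
-Xms256m
-Xmx256m
The lighter-weight alternative, used in the rest of these notes, is to ship logs with Filebeat instead.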
Filebeat
Download and install
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.7.0-linux-x86_64.tar.gz
tar xzvf filebeat-7.7.0-linux-x86_64.tar.gz
Configuration (edit filebeat.yml)
#=========================== Filebeat inputs =============================
filebeat.inputs:
# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.
- type: log
# Change to true to enable this input configuration.
enabled: true
# Paths that should be crawled and fetched. Glob based paths.
paths:
- /Users/weikeqin1/WorkSpaces/java/springboot-test/logs/*.log
#- c:\programdata\elasticsearch\logs\*
#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
# Array of hosts to connect to.
#hosts: ["localhost:9200"]
# Protocol - either `http` (default) or `https`.
#protocol: "https"
# Authentication credentials - either API key or username/password.
#api_key: "id:api_key"
#username: "elastic"
#password: "changeme"
#----------------------------- Logstash output --------------------------------
output.logstash:
# The Logstash hosts
hosts: ["localhost:5044"]
# Optional SSL. By default is off.
# List of root certificates for HTTPS server verifications
#ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]
# Certificate for SSL client authentication
#ssl.certificate: "/etc/pki/client/cert.pem"
# Client Certificate Key
#ssl.key: "/etc/pki/client/cert.key"
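Because output.logstash points at localhost:5044, the Logstash pipeline has to listen there with a beats input; the file-based pipeline shown earlier does not, so something along these lines (a minimal sketch, following the stock Elastic example; the index pattern matches the filebeat-7.7.0-... index seen below) is needed on the Logstash side:
input {
  beats {
    port => 5044                      # port Filebeat connects to
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}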
Start
cd filebeat-7.7.0-linux-x86_64
sudo ./filebeat -e
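Filebeat also ships with self-check subcommands that are worth running before leaving it unattended:
./filebeat test config     # validate filebeat.yml
./filebeat test output     # check connectivity to the configured output (Logstash here)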
Viewing the logs in Kibana, a single document looks like this:
{
"_index" : "filebeat-7.7.0-2020.05.17",
"_type" : "_doc",
"_id" : "y42gIHIBU4Je81Q8Jaun",
"_score" : null,
"_source" : {
"log" : {
"file" : {
"path" : "/Users/weikeqin1/WorkSpaces/java/springboot-test/logs/springboot-test.2020-05-17_0.log"
},
"offset" : 371432
},
"tags" : [
"beats_input_codec_plain_applied"
],
"host" : {
"architecture" : "x86_64",
"id" : "DAA105AB-C93E-5E37-AFBF-A747032048AF",
"ip" : [
"xx::xx:xx:xx:xx",
"192.168.0.110",
"xx::xx:xx:xx:xx",
"xx::xx:xx:xx:xx"
],
"name" : "192.168.0.110",
"mac" : [
"xx:xx:xx:xx:xx:xx",
"xx:xx:xx:xx:xx:xx",
"xx:xx:xx:xx:xx:xx",
"xx:xx:xx:xx:xx:xx",
"xx:xx:xx:xx:xx:xx",
"xx:xx:xx:xx:xx:xx"
],
"os" : {
"version" : "10.13.6",
"build" : "17G8030",
"kernel" : "17.7.0",
"name" : "Mac OS X",
"family" : "darwin",
"platform" : "darwin"
},
"hostname" : "192.168.0.110"
},
"ecs" : {
"version" : "1.5.0"
},
"agent" : {
"id" : "1dcf526f-dc97-4a59-bccc-4751954962e6",
"type" : "filebeat",
"version" : "7.7.0",
"ephemeral_id" : "810f6da6-b38a-4071-8d2f-9c45dee14440",
"hostname" : "192.168.0.110"
},
"@version" : "1",
"@timestamp" : "2020-05-17T03:14:23.865Z",
"input" : {
"type" : "log"
},
"message" : """2020-05-17 11:14:22.879 [http-nio-11011-exec-20] INFO [cn.wkq.controller.TestController] [32] - {"_traceId":"15896852628794XZ3","_method":"TestController#test#1589685262879#"}"""
},
"sort" : [
1589685263865
]
}
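The same document can also be fetched outside of Discover, for example with a search against the daily Filebeat index (in Kibana Discover you first need an index pattern such as filebeat-*):
curl 'http://localhost:9200/filebeat-*/_search?size=1&pretty'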