
Logstash is not creating an index in ES


I am trying to parse a log file with Logstash. Filebeat reads a sample log file from a directory and ships it to Logstash, which indexes it into Elasticsearch. (The input file is read from the directory by Filebeat, Logstash is specified as the output in Filebeat.yml, and the Logstash configuration file parses the log lines and writes the result into an index in ES.)

Filebeat.yml

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

  #input_type: log
  document_type: my_log
paths:
  - C:\logsa\elast.log

    #----------------------------- Logstash output --------------------------------
    output.logstash:
      # The Logstash hosts
      hosts: ["localhost:5044"]



elast.log (the single log line I am trying to parse):

    [2016-11-03 07:30:05,987] [INFO] [o.e.p.PluginsService     ] [hTYKFFt] initializing...

Logstash configuration file:

input {
  beats {
    port => "5044"
  }
}

filter {
  if [type] == "my_log" {
    grok {
      match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{DATA:LOGLEVEL}\] \[%{DATA:MESSAGE}\] \[%{GREEDYDATA:message}\] %{GREEDYDATA:message1}" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
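
Aside: the grok pattern itself can be sanity-checked before involving Filebeat and Elasticsearch, using a minimal stdin/stdout pipeline. This is only a sketch; the file name test-grok.conf is made up for the example.

# test-grok.conf (hypothetical file name) -- checks the grok pattern in isolation
input { stdin { } }

filter {
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{DATA:LOGLEVEL}\] \[%{DATA:MESSAGE}\] \[%{GREEDYDATA:message}\] %{GREEDYDATA:message1}" }
  }
}

# rubydebug prints every field of the parsed event, so a grok failure is easy to spot
output { stdout { codec => rubydebug } }

Run it with logstash -f test-grok.conf and paste the sample log line; the parsed fields (or a _grokparsefailure tag) are printed to the console.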

I am running filebeat.exe, the Logstash config file, and Elasticsearch.

When I run the Logstash configuration file, I do not get any errors...

Console output when running the Logstash config:

C:\logstash-5.0.0\logstash-5.0.0\bin>logstash -f log-h.conf
JAVA_OPTS was set to [ -Xmx1g -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+CMSParallelRemarkEnabled -XX:SurvivorRatio=8 -XX:MaxTenuringThreshold=1 -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath="$LS_HOME/heapdump.hprof"]. Logstash will trust these options, and not set any defaults that it might usually set
Sending Logstash logs to C:/logstash-5.0.0/logstash-5.0.0/logs which is now configured via log4j2.properties.
[2016-11-08T17:38:02,452][INFO ][logstash.inputs.beats    ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2016-11-08T17:38:02,728][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2016-11-08T17:38:03,082][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://localhost:9200"]}}
[2016-11-08T17:38:03,089][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2016-11-08T17:38:03,324][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2016-11-08T17:38:03,359][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:9200"]}
[2016-11-08T17:38:03,596][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2016-11-08T17:38:03,612][INFO ][logstash.pipeline        ] Pipeline main started
[2016-11-08T17:38:03,783][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}

It is not creating an index in ES, and there are no errors visible in the console output above.
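
A quick way to confirm that no index has been created (assuming Elasticsearch is listening on its default port 9200) is to list the indices:

curl 'http://localhost:9200/_cat/indices?v'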

Can someone please help? Thanks in advance.

1 Answer


    There are some indentation problems in the Filebeat configuration. For Filebeat 5.x it should look like this:

    filebeat.prospectors:
    - paths:
        - C:/logsa/elast.log
      document_type: my_log
    
    output.logstash:
      hosts: ["localhost:5044"]
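
    Before restarting Filebeat, it may be worth validating the corrected file first; in Filebeat 5.x the binary accepts a -configtest flag for this (the install path below is only an example):

    PS C:\Program Files\Filebeat> .\filebeat.exe -c filebeat.yml -configtest -e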
    

    The Beats documentation provides a Logstash configuration example that shows how to configure the Elasticsearch output. This writes the data to a filebeat-YYYY.MM.DD index.

    input {
      beats {
        port => "5044"
      }   
    }   
    
    filter {
      if [type] == "my_log" {
        grok {
          match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] \[%{DATA:LOGLEVEL}\] \[%{DATA:MESSAGE}\] \[%{GREEDYDATA:message}\] %{GREEDYDATA:message1}"}
        }   
      }   
    }   
    
    output {
      elasticsearch {
        hosts => "localhost:9200"
        manage_template => false
        index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
        document_type => "%{[@metadata][type]}"
      }   
    }
    

    When using Logstash, you also have to manually install the Filebeat index template into Elasticsearch.

    On Windows:

    PS C:\Program Files\Filebeat> Invoke-WebRequest -Method Put -InFile filebeat.template.json -Uri http://localhost:9200/_template/filebeat?pretty

    On Unix:

    curl -XPUT 'http://localhost:9200/_template/filebeat' -d@/etc/filebeat/filebeat.template.json
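
    Once the template is loaded and Filebeat has been restarted, the outcome can be checked against Elasticsearch (a sketch assuming the default port 9200; these requests only read state, and the LOGLEVEL field comes from the grok pattern above):

    # confirm the template was installed
    curl 'http://localhost:9200/_template/filebeat?pretty'
    # list the Filebeat indices that have been created
    curl 'http://localhost:9200/_cat/indices/filebeat-*?v'
    # fetch a few parsed events to inspect the grok fields
    curl 'http://localhost:9200/filebeat-*/_search?q=LOGLEVEL:INFO&pretty'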
