I am trying to feed data from a CSV file into Elasticsearch using Logstash. My Logstash config file looks like this:
input {
  file {
    path => "C:\Users\shreya\Data\RetailData.csv"
    start_position => "beginning"
    #sincedb_path => "C:\Users\shreya\null"
  }
}
filter {
  csv {
    separator => ","
    id => "Store_ID"
    columns => ["Store","Date","Temperature","Fuel_Price", "MarkDown1", "MarkDown2", "MarkDown3", "MarkDown4", "CPI", "Unemployment", "IsHoliday"]
  }
  mutate {convert => ["Store", "integer"]}
  mutate {convert => ["Date", "date"]}
  mutate {convert => ["Temperature", "float"]}
  mutate {convert => ["Fuel_Price", "float"]}
  mutate {convert => ["CPI", "float"]}
  mutate {convert => ["Unemployment", "float"]}
}
output {
  elasticsearch {
    action => "index"
    hosts => "localhost:9200"
    index => "store"
    document_type => "store_retail"
  }
  stdout {}
  #stdout {
  #  codec => rubydebug
  #}
}
But I am getting an error and cannot find a way to solve it. I am new to Logstash. My error log looks like this:
[2017-12-02T15:56:38,150][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/fb_apache/configuration"}
[2017-12-02T15:56:38,165][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"C:/Users/shreya/logstash-6.0.0/modules/netflow/configuration"}
[2017-12-02T15:56:38,243][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2017-12-02T15:56:39,117][INFO ][logstash.agent] Successfully started Logstash API endpoint {:port=>9600}
[2017-12-02T15:56:42,965][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=>"index", hosts=>["localhost:9200"], index=>"store", document_type=>"store_retail", id=>"91a4406a13e9377abb312acf5f6be8e609a685f9c84a5906af957e956119798c"}
[2017-12-02T15:56:43,804][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-12-02T15:56:43,804][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-12-02T15:56:43,854][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2017-12-02T15:56:43,932][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-12-02T15:56:43,933][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{...}}}
[2017-12-02T15:56:43,964][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2017-12-02T15:56:44,011][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"main", :plugin=>"... @klass=LogStash::Filters::Mutate, ... @filter={\"Date\"=>\"date\"}, id=>\"e3501f879986420bd95a59d8a1c006d9bc4351a481c96bd5366e7edb54bc6fbb\", enable_metric=>true, periodic_flush=>false ...", :error=>"translation missing: en.logstash.agent.configuration.invalid_plugin_register", :thread=>"#..."}
[2017-12-02T15:56:44,042][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#..., :backtrace=>["C:/Users/shreya/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:186:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "C:/Users/shreya/logstash-6.0.0/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.1.6/lib/logstash/filters/mutate.rb:184:in `register'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:388:in `register_plugin'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:399:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:399:in `register_plugins'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:801:in `maybe_setup_out_plugins'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:409:in `start_workers'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:333:in `run'", "C:/Users/shreya/logstash-6.0.0/logstash-core/lib/logstash/pipeline.rb:293:in `block in start'"], :thread=>"#..."}
[2017-12-02T15:56:44,058][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}
2 Answers
The problem comes from the conversion target in one of your mutate filters. Per the documentation, the convert option only accepts integer, float, string, and boolean as target types; "date" is not a valid target.
So this part caused the crash: mutate {convert => ["Date", "date"]}
If you want to convert a string into a date, you have to use the date filter instead.
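A minimal sketch of the replacement, assuming the CSV stores dates as `yyyy-MM-dd` (adjust the pattern to match your actual data):

```
filter {
  date {
    # Parse the "Date" column; the pattern below is an assumption
    match  => ["Date", "yyyy-MM-dd"]
    # Without target, the parsed value goes to @timestamp;
    # set target to overwrite the original field instead
    target => "Date"
  }
}
```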
Verify the config file before starting the pipeline; Logstash can check the configuration and report error details without running it.
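For example, using Logstash's built-in `--config.test_and_exit` flag (on Windows the launcher is `bin\logstash.bat`; the config path below is the one from the question):

```
bin\logstash.bat -f C:\Users\shreya\logstash.conf --config.test_and_exit
```

This parses the pipeline configuration, prints any syntax or plugin-registration errors, and exits without processing data.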