
Logstash CSV import


I am running Ubuntu 14.04 LTS with Kibana, Logstash, and Elasticsearch. I tried to import my CSV file into Logstash using the configuration below, but the data is not being detected:

input 
{
    file 
    {
        path => "/home/kibana/Downloads/FL_insurance_sample.csv"
        type => "FL_insurance_sample.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter 
{
    csv 
    {
        columns => ["policyID","statecode","country","eq_site_limit","hu_site_limit",
            "fl_sitelimit","fr_site_limit","tiv_2011","tiv_2012","eq_site_deductible",
            "hu_site_deductible","fl_site_deductible","fr_site_deductible","point_latitude",
            "point_longtitude","line","construction","point_granularity"]
        separator => ","
    }
}

output 
{
    elasticsearch {
        action => "index"
        host => "localhost"
        index => "promosms-%{+dd.MM.YYYY}"
        workers => 1
    }
    stdout
    {
        codec => rubydebug
    }

}
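Before restarting the service, it can help to validate the configuration syntax first. A minimal sketch, assuming the Debian/Ubuntu package layout with the Logstash binary under /opt/logstash/bin (adjust the path for your install):

```shell
# Check the pipeline configuration without starting a pipeline.
# --configtest is the flag in Logstash 1.x/2.x; newer versions use -t.
sudo /opt/logstash/bin/logstash --configtest -f /etc/logstash/conf.d/simple.conf
```

On success it should report that the configuration is OK; otherwise it points at the offending line.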

I even ran:

sudo service logstash restart

When I go to the index mapping in the Kibana GUI and select Logstash-*, I cannot find the data I want. P.S. My configuration file is stored at /etc/logstash/conf.d/simple.conf.
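A quick way to check whether Logstash wrote anything at all is to ask Elasticsearch which indices exist. A hedged sketch, assuming Elasticsearch is listening on the default localhost:9200:

```shell
# List all indices; look for entries named promosms-<date>
curl 'localhost:9200/_cat/indices?v'

# If such an index exists, confirm it actually contains documents
curl 'localhost:9200/promosms-*/_count'
```

If no promosms-* index appears, the file input never picked up the CSV at all (check that the path exists and is readable by the user Logstash runs as).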

1 Answer


In your question you say you looked at Logstash-* in Kibana, but your configuration file says you are putting the data into promosms-%{+dd.MM.YYYY}.

You need to go to the Settings section of Kibana 4, put [promosms-]DD.MM.YYYY into the index name or pattern box, and check "Index contains time-based events" and "Use event times to create index names".

Then you may also want to set it as your default index.
