
elasticsearch - Date column imported from CSV with Logstash is not parsed as a datetime type


I am trying to import a CSV into Elasticsearch using Logstash. I have tried two approaches:

  • Using the csv filter

  • Using the grok filter

1) For the csv filter, below is my Logstash config file:

input {
  file {
    path => "path_to_my_csv.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
        separator => ","
        columns => ["col1","col2_datetime"]
  }
  mutate {convert => [ "col1", "float" ]}
  date {
        locale => "en"
        match => ["col2_datetime", "ISO8601"] # also tried: match => ["col2_datetime", "yyyy-MM-dd HH:mm:ss"]
        timezone => "Asia/Kolkata"
        target => "@timestamp" # also tried: target => "col2_datetime"
   }
}
output {
   elasticsearch {
     hosts => "http://localhost:9200"
     index => "my_collection"

  }
  stdout {}
}
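
One way to confirm how the date column was actually mapped (assuming Elasticsearch is reachable at http://localhost:9200, as in the config above) is to inspect the index mapping:

curl -XGET 'http://localhost:9200/my_collection/_mapping?pretty'

If col2_datetime shows up with "type": "text" (or "keyword") instead of "type": "date", that field was indexed as a string.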

2) Using the grok filter:

For the grok filter, below is my Logstash config file:

input {
  file {
    path => "path_to_my_csv.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => { "message" => "(?<col1>(?:%{BASE10NUM})),(%{TIMESTAMP_ISO8601:col2_datetime})"}
    remove_field => [ "message" ]
  }
  date {
        match => ["col2_datetime", "yyyy-MM-dd HH:mm:ss"]
   }
}
output {
   elasticsearch {
     hosts => "http://localhost:9200"
     index => "my_collection_grok"

  }
  stdout {}
}
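
One way to sanity-check the grok pattern and date format on their own (a sketch, assuming Logstash 5.x run from its installation directory; the sample value below is arbitrary) is to pipe a single line through a one-off Logstash run:

echo '1234365,2016-12-02 19:00:52' | bin/logstash -e '
input { stdin {} }
filter {
  grok { match => { "message" => "(?<col1>(?:%{BASE10NUM})),(%{TIMESTAMP_ISO8601:col2_datetime})" } }
  date { match => ["col2_datetime", "yyyy-MM-dd HH:mm:ss"] }
}
output { stdout { codec => rubydebug } }
'

If the printed event carries a _grokparsefailure or _dateparsefailure tag, that points to the step that is failing; otherwise it should contain col2_datetime plus a @timestamp derived from it.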

PROBLEM:

So when I run these two configs separately, I am able to import the data into Elasticsearch. But my date field is not parsed as a datetime type; it is stored as a string, so I cannot run date filters on it.

Can someone help me figure out why this is happening? My Elasticsearch version is 5.4.1.

Thanks in advance.

1 Answer


    I made 2 changes to the config file.

    1) Removed the under_score from the column name col2_datetime (it is now just col2).

    2) Added a target, so the parsed date is written back into col2 itself rather than only into @timestamp (otherwise the original column stays an unparsed string).

    Here is what my config file looks like...

    vi logstash.conf
    
    input {
      file {
        path => "/config-dir/path_to_my_csv.csv"
        start_position => "beginning"
        sincedb_path => "/dev/null"
      }
    }
    filter {
      csv {
            separator => ","
            columns => ["col1","col2"]
      }
      mutate {convert => [ "col1", "float" ]}
      date {
            locale => "en"
            match => ["col2",  "yyyy-MM-dd HH:mm:ss"]
            target => "col2"
       }
    }
    output {
       elasticsearch {
         hosts => "http://172.17.0.1:9200"
         index => "my_collection"
    
      }
      stdout {}
    }
    

    Here is the data file:

    vi path_to_my_csv.csv
    
    1234365,2016-12-02 19:00:52 
    1234368,2016-12-02 15:02:02 
    1234369,2016-12-02 15:02:07
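
    One caveat worth checking (an assumption about the setup, since the original index may already exist): Elasticsearch does not change the mapping of an existing field, so if my_collection was first created while the date column was indexed as a string, re-importing into the same index keeps that string mapping. If the data can be re-imported, deleting the index before re-running Logstash forces a fresh mapping:

    curl -XDELETE 'http://172.17.0.1:9200/my_collection'

    After the re-import, col2 should be mapped as a date.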
    
