Question

I'm starting out collecting logs with Logstash. The current setup consists of a Java server that uses logback as its logging framework together with logstash-logback-encoder, outputting the data in a neat JSON representation. The basics work just fine.

I would like to split additional JSON-formatted data into separate fields (so that each key of the JSON ends up in its own field). logstash-logback-encoder provides a mechanism for this and outputs such data in a json_message field. However, this JSON string is placed inside a JSON array. Here is a sample, formatted for readability:

{
  "@timestamp":"2014-03-25T19:34:11.586+01:00",
  "@version":1,
  "message":"Message{\"activeSessions\":0}",
  "logger_name":"metric.SessionMetrics",
  "thread_name":"scheduler-2",
  "level":"INFO",
  "level_value":20000,
  "HOSTNAME":"stage-01",
  "json_message":["{\"activeSessions\":0}"],
  "tags":[]
}

I tried to parse the incoming JSON using a simple json filter. Here is my configuration:

input {
  lumberjack {
    <snipped>
    codec => "json"
  }
}
filter {
  json {
    source => "json_message"
  }
}
output {
  elasticsearch {
    <snipped>
  }
}

However, this leads to the following error in the Logstash log. A JSON string wrapped in an array simply can't be handled:

{:timestamp=>"2014-03-25T19:43:13.232000+0100", 
 :message=>"Trouble parsing json", 
 :source=>"json_message", 
 :raw=>["{\"activeSessions\":0}"], 
 :exception=>#<TypeError: can't convert Array into String>, 
 :level=>:warn}

Is there a way to extract the JSON string from the array prior to parsing? Any help is greatly appreciated. Thanks.


Solution

Actually, it is quite simple and works along the lines of array indexing in common programming languages, although I did not find this in the docs.

Just add an index to the field in the filter:

filter {
  json {
    source => "json_message[0]"
  }
}
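
As a side note, the json filter also accepts a target option, which nests the parsed keys under a field of your choosing instead of writing them to the root of the event, and the common remove_field option can drop the original array once it has been parsed. The following is only a sketch: the metrics target name is made up, and on newer Logstash versions the field reference may need to be written as [json_message][0].

filter {
  json {
    # parse the first (and only) element of the json_message array
    source => "json_message[0]"
    # optional: nest the parsed keys under "metrics" instead of the event root
    target => "metrics"
    # optional: drop the original array field after successful parsing
    remove_field => ["json_message"]
  }
}

With the sample event above, activeSessions would then end up in Elasticsearch as metrics.activeSessions rather than as a top-level field.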