This repository was archived by the owner on Sep 17, 2019. It is now read-only.

Issue using JSON created field in statement. #151

Open
I'm having this issue on 7.1.1. I have looked at the definition of the event_as_json_keyword flag, and it looks like it attempts to do what I have already done with the json filter in Logstash.
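For reference, this is how I read that flag: rather than exposing the decoded fields individually, it swaps the entire event into the statement as a single JSON parameter. A minimal sketch of what I mean (the '@event' keyword value is my assumption; the actual keyword is whatever event_as_json_keyword is set to in your plugin version):

jdbc {
    connection_string => 'jdbc:postgresql://postgres:5432/postgres'
    enable_event_as_json_keyword => true
    # '@event' below is assumed; it must match the event_as_json_keyword setting
    event_as_json_keyword => '@event'
    statement => [ "INSERT INTO mfg_db (payload) VALUES(CAST (? as json))", "@event" ]
}

That is not what I want here, since I need the decoded fields as individual columns.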
My issue: I decode JSON with Logstash into a number of fields and then want to send them to a Postgres DB. It seems the jdbc output is not finding these newly generated fields. I have verified that the JSON is valid and is properly ingested into Elasticsearch. As you can see, I am also able to use these generated fields to create a new index_name field, which the jdbc output does accept:
mutate { add_field => [ "index_name", "%{[MTE][test_station]}" ] }
JDBC output error message:

JDBC - Exception. Not retrying {:exception=>#<RuntimeError: Invalid FieldReference: `%{[MTE][serial_number]}`>, :statement=>"IN
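The error reads as though the parameter string is being parsed as a raw field reference, so the %{...} sprintf wrapper itself may be what is invalid. Purely as a guess (untested, and it assumes the plugin resolves statement parameters with event.get rather than sprintf), passing bare field references might behave differently:

statement => [ "INSERT INTO mfg_db (timestamp, test_name, serial_number, test_pass) VALUES(CAST (? as timestamp), ?, ?, ?)",
    "@timestamp", "index_name", "[MTE][serial_number]", "[MTE][test_pass]" ]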
My logstash config:
input {
    beats {
        client_inactivity_timeout => 12000000
        port => 5044
    }
}

filter {
    mutate {
        gsub => [
            "message", "\n", "",
            "message", " ", ""
        ]
    }
    json {
        source => "message"
    }
    mutate {
        copy => { "[MTE][timestamp]" => "timestamp" }
    }
    mutate {
        gsub => [
            "timestamp", "T", ""
        ]
    }
    date {
        match => ["timestamp", "YYYYMMddHHmmssZ"]
        timezone => "UTC"
        target => "@timestamp"
    }
    mutate {
        remove_field => "timestamp"
    }
    mutate {
        add_field => [ "index_name", "%{[MTE][test_station]}" ]
    }
    mutate {
        lowercase => "index_name"
    }
}

output {
    elasticsearch {
        hosts => "elasticsearch:9200"
        manage_template => true
        index => "%{[index_name]}"
        template => "/usr/share/logstash/logstash-template.json"
        template_name => "mte_dynamic_template"
        template_overwrite => true
    }
    jdbc {
        enable_event_as_json_keyword => true
        connection_string => 'jdbc:postgresql://postgres:5432/postgres'
        username => 'asdf'
        password => 'asdf'
        #driver_jar_path => '/usr/share/logstash/postgres.jar'
        statement => [ "INSERT INTO mfg_db (timestamp, test_name, serial_number, test_pass) VALUES(CAST (? as timestamp), ?, ?, ?)",
            "%{@timestamp}", "%{index_name}", "%{[MTE][serial_number]}", "%{[MTE][test_pass]}" ]
    }
}