Multiple patterns in one log

You can use multiple patterns in a single grok filter, e.g. grok { match => ["fieldname", "pattern1", "pattern2", …, "patternN"] }, and they will be applied in order. However, a) it's not the best option performance-wise, and b) you probably want to treat different types of logs differently anyway, so I suggest you use conditionals based on … Read more
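As a sketch of the conditional approach (the type values are illustrative; %{COMBINEDAPACHELOG} and %{SYSLOGLINE} are standard core grok patterns):

```conf
filter {
  if [type] == "apache" {
    # Apache access logs get their own dedicated pattern
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  } else if [type] == "syslog" {
    # syslog lines are parsed separately, so one slow/failing
    # pattern never affects the other log type
    grok {
      match => { "message" => "%{SYSLOGLINE}" }
    }
  }
}
```

Each event only ever runs through the one pattern that matches its type, instead of failing through a whole pattern list.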

Multiple inputs with the Logstash JDBC plugin

You can definitely have a single config with multiple jdbc inputs and then parametrize the index and document_type in your elasticsearch output depending on which table the event is coming from. input { jdbc { jdbc_driver_library => "/Users/logstash/mysql-connector-java-5.1.39-bin.jar" jdbc_driver_class => "com.mysql.jdbc.Driver" jdbc_connection_string => "jdbc:mysql://localhost:3306/database_name" jdbc_user => "root" jdbc_password => "password" schedule => "* * * … Read more
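A sketch of the overall shape, assuming two tables (the table names and queries are illustrative): each jdbc input tags its events with a type, and the output reuses that field via sprintf references.

```conf
input {
  jdbc {
    jdbc_driver_library => "/Users/logstash/mysql-connector-java-5.1.39-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/database_name"
    jdbc_user => "root"
    jdbc_password => "password"
    statement => "SELECT * FROM table1"
    type => "table1"
  }
  jdbc {
    # same driver/connection/user settings as the first input
    jdbc_driver_library => "/Users/logstash/mysql-connector-java-5.1.39-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/database_name"
    jdbc_user => "root"
    jdbc_password => "password"
    statement => "SELECT * FROM table2"
    type => "table2"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # the type set on each input drives the target index and document_type
    index => "%{type}"
    document_type => "%{type}"
  }
}
```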

Change default mapping of string to “not analyzed” in Elasticsearch

Just create a template. Run: curl -XPUT localhost:9200/_template/template_1 -d '{ "template": "*", "settings": { "index.refresh_interval": "5s" }, "mappings": { "_default_": { "_all": { "enabled": true }, "dynamic_templates": [ { "string_fields": { "match": "*", "match_mapping_type": "string", "mapping": { "index": "not_analyzed", "omit_norms": true, "type": "string" } } } ], "properties": { "@version": { "type": "string", "index": "not_analyzed" … Read more
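For reference, a minimal standalone version of such a template body might look like the following (ES 2.x-era syntax, where string with not_analyzed applies; on ES 5+ you would map strings to the keyword type instead):

```json
{
  "template": "*",
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "string_fields": {
            "match_mapping_type": "string",
            "mapping": {
              "type": "string",
              "index": "not_analyzed"
            }
          }
        }
      ]
    }
  }
}
```

Because the template matches "*", any newly created index picks up the dynamic template, and every dynamically mapped string field becomes not_analyzed by default.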

How can an Elasticsearch client be notified of a new indexed document?

This is what you’re looking for: https://github.com/ForgeRock/es-change-feed-plugin Using this plugin, you can register to a websocket channel and receive indexation/deletion events as they happen. It has some limitations, though. Back in the day, it was possible to install river plugins to stream documents into ES. The river feature has since been removed, but the plugin above … Read more
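As a sketch of the consuming side, assuming the channel delivers JSON messages (the operation/index/id fields below are hypothetical, not the plugin's documented payload), the client-side dispatch could look like:

```python
import json

def handle_change_event(raw_message, on_index, on_delete):
    """Dispatch one change-feed message to the matching callback.

    The payload shape here is an assumption for illustration; check the
    plugin's README for the real field names.
    """
    event = json.loads(raw_message)
    operation = event.get("operation")
    if operation == "INDEX":
        on_index(event)
    elif operation == "DELETE":
        on_delete(event)

# Simulate receiving one indexation event from the websocket channel.
seen = []
handle_change_event(
    '{"operation": "INDEX", "index": "logs", "id": "1"}',
    on_index=lambda e: seen.append(("indexed", e["id"])),
    on_delete=lambda e: seen.append(("deleted", e["id"])),
)
```

In a real client, the raw_message would come from the websocket connection the plugin exposes rather than from a literal string.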