A common Logstash output sends processed data into Elasticsearch. This tight integration feeds the search and analytics engine with structured, enriched events.
Defining an Elasticsearch output in the pipeline configuration sets the index and connection parameters. Authentication and TLS options ensure secure and authorized data ingestion.
Routing Logstash outputs to Elasticsearch streamlines end-to-end data flow and prepares logs for easy exploration in Kibana.
Steps to output Logstash events to Elasticsearch:
- Open the pipeline configuration file.
$ sudo nano /etc/logstash/conf.d/output_es.conf
Keep outputs in separate files or combine them with input/filter sections.
- Add an output block specifying the Elasticsearch host and index name.
Use index patterns like logs-%{+YYYY.MM.dd} for time-based indices.
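A minimal output block, assuming Elasticsearch is reachable at localhost:9200 and a daily index pattern, could look like this:
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # Elasticsearch endpoint; adjust host and port to match your cluster
    index => "logs-%{+YYYY.MM.dd}"       # daily time-based index resolved from the event timestamp
  }
}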
- Include credentials if X-Pack Security is enabled.
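When security is enabled, credentials and TLS settings go in the same block. The user, password, and certificate path below are placeholders, and the exact TLS option names vary with the plugin version (newer releases use ssl_enabled and ssl_certificate_authorities):
output {
  elasticsearch {
    hosts    => ["https://es.example.com:9200"]
    index    => "logs-%{+YYYY.MM.dd}"
    user     => "logstash_writer"               # placeholder user with write privileges on the index
    password => "changeme"                      # prefer the Logstash keystore over plain-text passwords
    ssl      => true                            # enable TLS to the cluster
    cacert   => "/etc/logstash/certs/ca.crt"    # CA certificate used to verify the Elasticsearch endpoint
  }
}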
- Test the configuration.
$ sudo /usr/share/logstash/bin/logstash --path.config /etc/logstash/conf.d --config.test_and_exit
Configuration OK
- Restart Logstash.
$ sudo systemctl restart logstash
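To confirm the service came back up cleanly, check its status; the Logstash logs under /var/log/logstash/ show any startup errors:
$ sudo systemctl status logstash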
Check indices in Elasticsearch and verify that expected documents are indexed.
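One quick check, assuming Elasticsearch answers on localhost:9200 without authentication (add -u and the credentials otherwise), is the cat indices API, followed by a document count for the index pattern:
$ curl -s 'http://localhost:9200/_cat/indices/logs-*?v'
$ curl -s 'http://localhost:9200/logs-*/_count'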
- Confirm data in Kibana by exploring the corresponding index pattern.
Direct output to Elasticsearch simplifies downstream analysis and visualization.
