Using an API key for Logstash output authentication avoids embedding a full Elasticsearch username and password in pipeline configuration. A scoped key limits blast radius if the pipeline configuration is exposed, and revocation or rotation can be done without changing user credentials.
The elasticsearch output plugin authenticates over HTTP(S) by sending an Authorization: ApiKey header derived from the configured api_key value. API keys are created in Elasticsearch via the /_security/api_key endpoint, and the role descriptors embedded at creation time define the cluster and index privileges available to that key during bulk indexing.
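The header value is the Base64 encoding of the id and api_key joined by a colon. As a minimal sketch with placeholder values, a key can be exercised directly against the authenticate endpoint before Logstash ever uses it:
$ ENCODED=$(printf '%s' '<id>:<api_key>' | base64)   # <id> and <api_key> are placeholders for a real key pair
$ curl -s --cacert /etc/logstash/certs/es-http-ca.crt -H "Authorization: ApiKey ${ENCODED}" "https://node-01-secure:9200/_security/_authenticate?pretty"
A valid key returns details about the authenticated entity; an invalid or revoked key returns a 401.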
API keys are secrets and require the same handling as passwords when stored under /etc/logstash or other configuration paths. The key's privileges must match the index naming scheme (such as logs-*) and must include index-creation rights when indices are auto-created; otherwise bulk requests fail with authorization errors. TLS-enabled deployments require a trusted CA chain to the cluster endpoint, and the Logstash output expects the key in id:api_key form rather than the pre-encoded variant used by some clients.
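Before wiring a key into the pipeline, the has-privileges API can confirm it actually carries what bulk indexing needs. A sketch reusing the ${ENCODED} value from the previous check and the index pattern used later in this guide:
$ curl -s --cacert /etc/logstash/certs/es-http-ca.crt -H "Authorization: ApiKey ${ENCODED}" -H "Content-Type: application/json" -X POST "https://node-01-secure:9200/_security/user/_has_privileges?pretty" -d '{ "cluster": ["monitor"], "index": [ { "names": ["logstash-api-key-*"], "privileges": ["auto_configure", "create_index", "create_doc"] } ] }'
The has_all_requested field in the response should be true; a false value identifies the missing privilege before any bulk request fails.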
Steps to use an Elasticsearch API key with Logstash output:
- Create an API key scoped to the target index pattern.
$ curl -s --cacert /etc/logstash/certs/es-http-ca.crt -u elastic:password -H "Content-Type: application/json" -X POST "https://node-01-secure:9200/_security/api_key?pretty" -d '{ "name": "logstash-api-key-write", "role_descriptors": { "logstash_writer": { "cluster": ["monitor"], "index": [ { "names": ["logstash-api-key-*"], "privileges": ["auto_configure", "create_index", "create_doc", "write"] } ] } } }'
{
  "id" : "zbjFmpsBI3y-BZEPZqfd",
  "name" : "logstash-api-key-write",
  "api_key" : "1IEDtK2ZTUuNfPXR2f_MpQ",
  "encoded" : "emJqRm1wc0JJM3ktQlpFUFpxZmQ6MUlFRHRLMlpUVXVOZlBYUjJmX01wUQ=="
}
A real password on the command line can leak via shell history and process listings.
The Logstash api_key value uses id:api_key (example: 9T6qXH4B:b2c3...); the encoded value is the Base64 form commonly used in an Authorization: ApiKey header.
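The two forms are interchangeable: encoded is simply the Base64 of id:api_key, so decoding the value returned above reproduces the pair Logstash expects:
$ echo 'emJqRm1wc0JJM3ktQlpFUFpxZmQ6MUlFRHRLMlpUVXVOZlBYUjJmX01wUQ==' | base64 -d
zbjFmpsBI3y-BZEPZqfd:1IEDtK2ZTUuNfPXR2f_MpQ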
- Add the API key to the elasticsearch output block in the Logstash pipeline configuration.
output {
  elasticsearch {
    hosts => ["https://node-01-secure:9200"]
    cacert => "/etc/logstash/certs/es-http-ca.crt"
    api_key => "zbjFmpsBI3y-BZEPZqfd:1IEDtK2ZTUuNfPXR2f_MpQ"
    index => "logstash-api-key-%{+YYYY.MM.dd}"
  }
}
Prefer storing the key in the Logstash keystore and referencing it via an environment-style substitution instead of embedding the secret directly in a pipeline file.
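A sketch of the keystore approach, assuming an entry name of ES_API_KEY (any valid keystore key name works; skip create if a keystore already exists under /etc/logstash):
$ sudo /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash create
$ sudo /usr/share/logstash/bin/logstash-keystore --path.settings /etc/logstash add ES_API_KEY
The add command prompts for the value; paste the id:api_key pair there. The pipeline then references the entry instead of the literal secret:
api_key => "${ES_API_KEY}"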
- Test the Logstash pipeline configuration for syntax errors.
$ sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash --path.data /tmp/logstash-configtest --config.test_and_exit
Configuration OK
- Restart the Logstash service to apply the updated pipeline.
$ sudo systemctl restart logstash
Restarting Logstash can briefly pause ingestion while pipelines stop and start.
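To confirm the pipeline came back up with the new output settings, watch the service log and, assuming the default API binding on localhost:9600, query the Logstash node stats API:
$ sudo journalctl -u logstash -f
$ curl -s "http://localhost:9600/_node/stats/pipelines?pretty"
Authorization problems against the cluster typically surface here as 401 or 403 errors logged by the elasticsearch output.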
- Verify documents are arriving in Elasticsearch under the expected index name.
$ curl -s --cacert /etc/logstash/certs/es-http-ca.crt -u elastic:password "https://node-01-secure:9200/logstash-api-key-*/_count?pretty"
{
  "count" : 2,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  }
}
The API key used by Logstash can be write-only; use a credential with read privileges for this verification query.
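Rather than reusing the elastic superuser, a separate read-scoped API key can serve for this check. A sketch, with logstash-api-key-verify as an assumed name; substitute the encoded value from its creation response into the Authorization header:
$ curl -s --cacert /etc/logstash/certs/es-http-ca.crt -u elastic:password -H "Content-Type: application/json" -X POST "https://node-01-secure:9200/_security/api_key?pretty" -d '{ "name": "logstash-api-key-verify", "role_descriptors": { "logstash_reader": { "index": [ { "names": ["logstash-api-key-*"], "privileges": ["read"] } ] } } }'
$ curl -s --cacert /etc/logstash/certs/es-http-ca.crt -H "Authorization: ApiKey <encoded-from-response>" "https://node-01-secure:9200/logstash-api-key-*/_count?pretty"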
