Integrate Filebeat with Kafka


Kafka can consume messages published by Filebeat based on the Kafka output configuration in the filebeat.yml file.

Filebeat Kafka Output Configuration

The filebeat.yml file requires the fields below to connect to Kafka and publish messages to the configured topic. Kafka will create the topic dynamically if it does not already exist, provided automatic topic creation is enabled on the broker.

output.kafka:
  # The list of Kafka broker addresses from which to fetch the cluster metadata.
  # The cluster metadata contains the actual Kafka brokers events are published to.
  hosts: ["localhost:9092"]

  # The Kafka topic used for produced events. The setting can be a format string.
  topic: Topic-Name

  # Authentication details. Password is required if username is set.
  #username: ''
  #password: ''
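For completeness, here is a minimal end-to-end filebeat.yml sketch that pairs a log input with the Kafka output shown above. It assumes a recent Filebeat release (6.x or later, where inputs are declared under filebeat.inputs); the log path, broker address, and topic name are examples only, so adjust them to your environment.

filebeat.inputs:
  # Example log input; the path is a placeholder, point it at your application logs.
  - type: log
    enabled: true
    paths:
      - /var/log/app/*.log

output.kafka:
  # Broker list used to fetch cluster metadata; events are published to the brokers returned in that metadata.
  hosts: ["localhost:9092"]
  # Example topic name; created automatically if the broker allows automatic topic creation.
  topic: Topic-Name
  # Uncomment if the cluster requires authentication.
  #username: ''
  #password: ''

Once Filebeat is started with this configuration, delivery can be verified by reading the topic with Kafka's console consumer or any other Kafka client.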

For more information about Filebeat Kafka output configuration options, refer to the links below.

Read More on Kafka

Integration

Integrate Filebeat, Kafka, Logstash, Elasticsearch and Kibana
