Logstash can consume input from Kafka, parse the data, and publish the parsed output back to Kafka for streaming to other applications.
Kafka Input Configuration in Logstash
Below is a basic configuration for Logstash to consume messages from Kafka. For more information about the Logstash Kafka input plugin, refer to the Elastic documentation.
```
input {
  kafka {
    bootstrap_servers => "KafkaServer:9092"
    topics => ["TopicName"]
    codec => json {}
  }
}
```
bootstrap_servers: Default value is "localhost:9092". Takes a list of server connections in the form host1:port1,host2:port2, used to establish the initial connection to the cluster. If one server is down, the client will connect to another server from the list.
topics: List of topics to subscribe to, from which messages will be consumed.
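As a sketch, the input above can be extended with commonly used plugin options such as group_id and consumer_threads; the host names and topic here are placeholders:

```
input {
  kafka {
    # Comma-separated list; the client falls back to another host if one is down
    bootstrap_servers => "kafka1:9092,kafka2:9092"
    topics => ["TopicName"]
    # Consumers sharing a group_id split the topic's partitions between them
    group_id => "logstash-consumer"
    # Number of consumer threads; ideally matches the topic's partition count
    consumer_threads => 2
    codec => json {}
  }
}
```

Running multiple Logstash instances with the same group_id lets them share the consumption load across partitions.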
Kafka Output Configuration in Logstash
Below is a basic configuration for Logstash to publish messages to Kafka. For more information about the Logstash Kafka output plugin, refer to the Elastic documentation.
```
output {
  kafka {
    bootstrap_servers => "localhost:9092"
    topic_id => "TopicName"
  }
}
```
bootstrap_servers: Default value is "localhost:9092". Takes a list of server connections in the form host1:port1,host2:port2. The producer uses this list only to fetch metadata (topics, partitions, and replicas); the socket connections for sending the actual data are established based on the broker information returned in the metadata.
topic_id: The topic to which messages will be published.
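A slightly fuller output sketch, assuming JSON-encoded events and a two-broker cluster (the host names are placeholders):

```
output {
  kafka {
    # Used only for the initial metadata lookup; data connections follow
    # the broker addresses returned in the metadata
    bootstrap_servers => "kafka1:9092,kafka2:9092"
    topic_id => "TopicName"
    # Serialize each event as JSON before publishing
    codec => json {}
  }
}
```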
Read More on Kafka
- Kafka Introduction and Architecture
- Kafka Server Properties Configuration
- Setup Kafka Cluster for Single Server/Broker
- Setup Kafka Cluster for Multi/Distributed Servers/Brokers
- Integrate Java with Kafka
- Integrate Filebeat with Kafka
- Kafka and Zookeeper Common Issues
Integration
Integrate Filebeat, Kafka, Logstash, Elasticsearch and Kibana