Cannot receive data with Kafka #54
Hi, glad to hear you're enjoying the image. Here's something that works on my side. Disclaimer: Kafka newbie here. My setup: a Dockerfile extending the base ELK image, installing Logstash's Kafka input plugin and adding a config file with a Kafka input and an Elasticsearch output. I then built the extended ELK image.
So far, business as usual. Now, at this point, starting the container (command below, but don't run it just yet) wouldn't get any events from Kafka through to Elasticsearch.
The funny thing is that – as you pointed out – Logstash doesn't complain about anything (even when increasing log verbosity via the container's environment variables). Anyway, after tearing my hair out for a bit and some searching around, I found this: http://stackoverflow.com/questions/30606447/kafka-consumer-fetching-metadata-for-topics-failed. So I added an advertised.host.name entry to Kafka's server.properties.
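The exact line didn't survive extraction; based on the linked answer and the later comment about KAFKA_ADVERTISED_HOST_NAME, it was presumably of this form (the IP address is a placeholder):

```
# server.properties: advertise an address that is reachable from inside the ELK container
advertised.host.name=192.168.2.101
```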
From that point, running a Kafka consumer from within the container worked:
I then deleted the container and started a clean container from the image that was created previously:
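The run command itself was stripped; a sketch of what it plausibly looked like follows (the interface name, port mappings, image tag, and container/host names are all assumptions, not the original command):

```shell
# First line: capture the IP address of the host's docker0 interface
DOCKER_IP=$(ip -4 addr show docker0 | grep -Po 'inet \K[\d.]+')
# Start a clean container from the extended image, mapping the usual ELK ports
docker run -p 5601:5601 -p 9200:9200 -p 5044:5044 \
  --add-host kafka:"$DOCKER_IP" -it --name elk-kafka my-extended-elk
```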
(The first line puts the IP address of the host's Docker interface in an environment variable that the run command then uses.) At this point, still nothing apparent in Logstash… but behind the scenes, something definitely happened, as the index had been created in Elasticsearch.
So Logstash's Kafka input plugin is reading the events. Interestingly, sending non-JSON-formatted events to Kafka generates an error in Logstash's logs, but the event still shows up in Elasticsearch's index and is visible in Kibana:
Browsing to http://X.X.X.X:9200/_search?pretty (Elasticsearch) shows:
At this point, I'm confident that the extended ELK image is behaving properly and playing nicely with ZK and Kafka, but I don't know what kind of input/configuration it would need for Logstash to pass along the events it retrieves from Kafka to Elasticsearch. Again, not a Kafka expert, so I can't really help more than that, but the above might help you investigate further. Alternatively, perhaps the Elastic community (https://discuss.elastic.co/) will be able to help you work out how to configure Logstash to process your event data from Kafka. If you do manage to figure it out, I'd be most interested if you could drop a line here to let me know what you did. Edit: words.
Started from a fresh VM + container + instance of ZK/Kafka with the config from my previous comment, and it turns out that everything is actually working properly… provided that proper JSON is fed into Kafka. Anyway, still no logs from Logstash when everything's OK, and Elasticsearch creates the index as needed. Logs displayed by the ELK container show up as:
Browsing to http://X.X.X.X:9200/_search?pretty (Elasticsearch) shows:
So it looks good to me; let me know how it goes on your side.
Thank you very much for your effort! After I changed my KAFKA_ADVERTISED_HOST_NAME from localhost to 192.168.2.101, Logstash was able to receive data sent as a non-JSON string: the index was created and the messages are available in Kibana. When I sent JSON data I got an error at first, but after cleaning up the containers everything is fine and my data is available in ES.
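For reference, feeding one well-formed JSON event into the topic with Kafka's console producer would look something like this (the broker address, topic name, script path, and payload are assumptions for illustration):

```shell
# Hypothetical: publish a single valid JSON event to the topic Logstash reads from
echo '{"message": "hello from kafka"}' | \
  /opt/kafka/bin/kafka-console-producer.sh --broker-list 192.168.2.101:9092 --topic test
```

The key point from the thread is simply that the payload must be valid JSON (double-quoted keys and values), otherwise Logstash logs a parse error.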
Great to hear that, thanks for the update.
I don't see any KAFKA_ADVERTISED_HOST_NAME property. Is it the equivalent of the bootstrap_servers property?
@raiusa see #70 (comment)
I had a similar problem with the Kafka input plugin for Logstash 7.4.0. There is no zk_connect field in this version; instead, to connect to my Kafka server, I was using bootstrap_servers, and I ended up in this very thread after some searching. On searching further, I got the answer I needed: changing bootstrap_servers in my config to point at the address that Docker Desktop exposes to containers fixed it. Note: this is specific to development with Docker Desktop for Mac. PS: I have created a GitHub repo that contains all the files I used to push data from a local Kafka queue to the ELK stack installed with sebp/elk. It may be helpful.
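The exact config didn't survive extraction; given the note about Docker Desktop for Mac, the fix was presumably along these lines (host.docker.internal is Docker Desktop's special DNS name for the host; the port and topic name are assumptions):

```
input {
  kafka {
    bootstrap_servers => "host.docker.internal:9092"
    topics => ["test"]
  }
}
```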
Hi, I'm using this container (great work, btw) to receive JSON data from Kafka. For that, I installed the kafka-input-plugin as mentioned in the docs, and the plugin is registered, as I can see with
logstash-plugin list
My config file looks like this:
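(The config block itself was lost in extraction; given the zk_connect mention below, it was presumably shaped like this, with the topic and ZooKeeper address as placeholders – zk_connect and topic_id were the options in the older Kafka input plugin versions:)

```
input {
  kafka {
    zk_connect => "localhost:2181"
    topic_id => "test"
  }
}
output {
  elasticsearch {
    hosts => ["localhost"]
  }
}
```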
and is added with
ADD ./kafka-input.conf /etc/logstash/conf.d/kafka-input.conf
in the Dockerfile. My Kafka setup is fine, because I can send and receive data with other applications. But something in my ELK setup seems to be wrong, because I cannot receive any data: there's neither any output from Logstash in the console nor any data in Kibana, and no logstash index is created, which should be the default behavior according to the plugin docs.
The zk_connect is correct too, because otherwise I get exceptions ...
Any ideas?
Thanks in advance!