
Does fluent-plugin-kafka support Kerberos? #220

Open
openchung opened this issue Oct 16, 2018 · 14 comments
openchung commented Oct 16, 2018

I use Fluentd to send JSON logs to a SASL_SSL Cloudera Kafka cluster, but I get the following warning and sends fail. I have verified my keytab and principal with kinit.

2018-10-17 07:45:10 +0800 [warn]: temporarily failed to flush the buffer. next_retry=2018-10-17 07:47:30 +0800 error_class="GSSAPI::GssApiError" error="gss_init_sec_context did not return GSS_S_COMPLETE" plugin_id="object:3f84a1db517c"
2018-10-17 07:45:10 +0800 [warn]: suppressed same stacktrace
2018-10-17 07:45:10 +0800 fluent.warn: {"message":"Send exception occurred: gss_init_sec_context did not return GSS_S_COMPLETE"}
2018-10-17 07:45:10 +0800 fluent.warn: {"message":"Exception Backtrace : /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/gssapi-1.2.0/lib/gssapi/simple.rb:95:in `init_context'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/sasl/gssapi.rb:72:in `initialize_gssapi_context'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/sasl/gssapi.rb:25:in `authenticate!'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/sasl_authenticator.rb:51:in `authenticate!'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/connection_builder.rb:27:in `build_connection'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/broker.rb:184:in `connection'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/broker.rb:170:in `send_request'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/broker.rb:44:in `fetch_metadata'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:386:in `block in fetch_cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:381:in `each'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:381:in `fetch_cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:367:in `cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:95:in `refresh_metadata!'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/cluster.rb:50:in `add_target_topics'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:276:in `deliver_messages_with_retries'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:238:in `block in deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `call'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/instrumenter.rb:23:in `instrument'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.6.8/lib/kafka/producer.rb:231:in `deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:281:in `deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.7.9/lib/fluent/plugin/out_kafka_buffered.rb:344:in `write'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'"}
2018-10-17 07:45:10 +0800 fluent.info: {"message":"initialized kafka producer: kafka"}
2018-10-17 07:45:10 +0800 fluent.warn: {"next_retry":"2018-10-17 07:47:30 +0800","error_class":"GSSAPI::GssApiError","error":"gss_init_sec_context did not return GSS_S_COMPLETE","plugin_id":"object:3f84a1db517c","message":"temporarily failed to flush the buffer. next_retry=2018-10-17 07:47:30 +0800 error_class="GSSAPI::GssApiError" error="gss_init_sec_context did not return GSS_S_COMPLETE" plugin_id="object:3f84a1db517c""}

The following is my td-agent.conf:

<system>
  log_level debug
</system>
<source>
  @type tail
  format json
  read_lines_limit 200
  path /mnt/old/tx-failover.log*
  pos_file /opt/kafka_failover.log.pos
  tag audit.trail
</source>
<match audit.*>
  @type kafka_buffered
  brokers 192.168.5.129:9093
  client_id kafka
  buffer_type memory
  default_topic log4j
  output_data_type json
  ssl_ca_cert /opt/server.cer.pem
  #sasl__mechanism gssapi
  principal [email protected]
  keytab /opt/kafka.keytab
</match>
<match **>
  @type stdout
  output_type json
</match>

The following are my installed gems:
actionmailer (4.2.8)
actionpack (4.2.8)
actionview (4.2.8)
activejob (4.2.8)
activemodel (4.2.8)
activerecord (4.2.8)
activesupport (4.2.8)
addressable (2.5.2, 2.5.1)
arel (6.0.4)
aws-sdk (2.10.45)
aws-sdk-core (2.10.45)
aws-sdk-resources (2.10.45)
aws-sigv4 (1.0.2)
base91 (0.0.1)
bigdecimal (default: 1.2.4)
bson (4.1.1)
builder (3.2.3)
bundler (1.14.5)
celluloid (0.15.2)
cool.io (1.5.1)
diff-lcs (1.3)
draper (1.4.0)
elasticsearch (5.0.5)
elasticsearch-api (5.0.5)
elasticsearch-transport (5.0.5)
erubis (2.7.0)
excon (0.62.0)
faraday (0.13.1)
ffi (1.9.25)
fluent-logger (0.7.1)
fluent-mixin-plaintextformatter (0.2.6)
fluent-plugin-elasticsearch (1.17.2)
fluent-plugin-genhashvalue (0.04)
fluent-plugin-kafka (0.7.9, 0.6.1)
fluent-plugin-mongo (0.8.1)
fluent-plugin-rewrite-tag-filter (1.5.6)
fluent-plugin-s3 (0.8.5)
fluent-plugin-scribe (0.10.14)
fluent-plugin-td (0.10.29)
fluent-plugin-td-monitoring (0.2.3)
fluent-plugin-webhdfs (0.7.1)
fluentd (0.12.40)
fluentd-ui (0.4.4)
font-awesome-rails (4.7.0.1)
globalid (0.4.0)
gssapi (1.2.0)
haml (4.0.7)
haml-rails (0.5.3)
hike (1.2.3)
hirb (0.7.3)
http_parser.rb (0.6.0)
httpclient (2.8.2.4)
i18n (0.8.1)
io-console (default: 0.4.3)
ipaddress (0.8.3)
jbuilder (2.6.3)
jmespath (1.3.1)
jquery-rails (3.1.4)
json (default: 1.8.1)
kramdown (1.13.2)
kramdown-haml (0.0.3)
loofah (2.0.3)
ltsv (0.1.0)
mail (2.6.4)
mime-types (3.1)
mime-types-data (3.2016.0521)
mini_portile2 (2.3.0, 2.1.0)
minitest (5.10.1, default: 4.7.5)
mixlib-cli (1.7.0)
mixlib-config (2.2.4)
mixlib-log (1.7.1)
mixlib-shellout (2.2.7)
mongo (2.2.7)
msgpack (1.1.0)
multi_json (1.12.1)
multipart-post (2.0.0)
nokogiri (1.8.1)
ohai (6.20.0)
oj (2.18.1)
parallel (1.8.0)
psych (default: 2.0.5)
public_suffix (3.0.0, 2.0.5)
puma (3.8.2)
rack (1.6.5)
rack-test (0.6.3)
rails (4.2.8)
rails-deprecated_sanitizer (1.0.3)
rails-dom-testing (1.0.8)
rails-html-sanitizer (1.0.3)
railties (4.2.8)
rake (default: 10.1.0)
rdoc (default: 4.1.0)
request_store (1.3.2)
ruby-kafka (0.6.8)
ruby-progressbar (1.8.3)
rubyzip (1.2.1, 1.1.7)
sass (3.2.19)
sass-rails (4.0.5)
settingslogic (2.0.9)
sigdump (0.2.4)
sprockets (2.12.4)
sprockets-rails (2.3.3)
string-scrub (0.0.5)
sucker_punch (1.0.5)
systemu (2.5.2)
td (0.15.2)
td-client (0.8.85)
td-logger (0.3.27)
test-unit (default: 2.1.10.0)
thor (0.19.4)
thread_safe (0.3.6)
thrift (0.8.0)
tilt (1.4.1)
timers (1.1.0)
tzinfo (1.2.3)
tzinfo-data (1.2017.2)
uuidtools (2.1.5)
webhdfs (0.8.0)
yajl-ruby (1.3.0)
zip-zip (0.3)

Please help me.

openchung commented Oct 21, 2018

An update on my issue: I upgraded to fluent-plugin-kafka 0.8.0 and still get an error.
The following is the log:

2018-10-21 12:32:38 +0800 [info]: gem 'fluent-mixin-plaintextformatter' version '0.2.6'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-elasticsearch' version '1.17.2'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-genhashvalue' version '0.04'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-kafka' version '0.8.0'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-mongo' version '0.8.1'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-rewrite-tag-filter' version '1.5.6'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-s3' version '0.8.5'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-scribe' version '0.10.14'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-td' version '0.10.29'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-td-monitoring' version '0.2.3'
2018-10-21 12:32:38 +0800 [info]: gem 'fluent-plugin-webhdfs' version '0.7.1'
2018-10-21 12:32:38 +0800 [info]: gem 'fluentd' version '0.12.40'
2018-10-21 12:32:38 +0800 [info]: adding match pattern="audit.*" type="kafka_buffered"
2018-10-21 12:32:38 +0800 [trace]: registered output plugin 'kafka_buffered'
2018-10-21 12:32:38 +0800 [info]: brokers has been set directly: ["192.168.5.129"]
2018-10-21 12:32:38 +0800 [info]: adding match pattern="**" type="stdout"
2018-10-21 12:32:38 +0800 [info]: adding source type="tail"
2018-10-21 12:32:38 +0800 [info]: using configuration file:

<system>
  log_level trace
</system>
<source>
  @type tail
  format json
  read_lines_limit 200
  path /mnt/old/tx-failover.log
  pos_file /opt/kafka_failover.log.pos
  tag audit.trail
</source>
<match audit.*>
  @type kafka_buffered
  brokers 192.168.5.129
  client_id kafka
  buffer_type memory
  default_topic log4j
  output_data_type json
  ssl_ca_cert /opt/ca_cert.pem
  get_kafka_client_log true
  principal kafka/[email protected]
  keytab /opt/kafka.keytab
</match>
<match **>
  @type stdout
  output_type json
</match>

2018-10-21 12:32:38 +0800 [info]: initialized kafka producer: kafka
2018-10-21 12:32:38 +0800 [info]: following tail of /mnt/old/tx-failover.log
2018-10-21 12:32:49 +0800 [info]: detected rotation of /mnt/old/tx-failover.log; waiting 5 seconds
2018-10-21 12:32:49 +0800 [info]: following tail of /mnt/old/tx-failover.log
2018-10-21 12:32:49 +0800 fluent.info: {"message":"detected rotation of /mnt/old/tx-failover.log; waiting 5 seconds"}
2018-10-21 12:32:49 +0800 fluent.info: {"message":"following tail of /mnt/old/tx-failover.log"}
2018-10-21 12:33:38 +0800 [trace]: message will send to log4j with partition_key: , partition: , message_key: and value: {"widget":{"debug":"on","window":{"title":"Sample Konfabulator Widget","name":"main_window","width":500,"height":500},"image":{"src":"Images/Sun.png","name":"sun1","hOffset":250,"vOffset":250,"alignment":"center"},"text":{"data":"Click Here","size":36,"style":"bold","name":"text1","hOffset":250,"vOffset":100,"alignment":"center","onMouseUp":"sun1.opacity = (sun1.opacity / 100) * 90;"}}}.
2018-10-21 12:33:38 +0800 [trace]: message will send to log4j with partition_key: , partition: , message_key: and value: {"widget":{"debug":"on","window":{"title":"Sample Konfabulator Widget","name":"main_window","width":500,"height":500},"image":{"src":"Images/Sun.png","name":"sun1","hOffset":250,"vOffset":250,"alignment":"center"},"text":{"data":"Click Here","size":36,"style":"bold","name":"text1","hOffset":250,"vOffset":100,"alignment":"center","onMouseUp":"sun1.opacity = (sun1.opacity / 100) * 90;"}}}.
2018-10-21 12:33:38 +0800 [trace]: message will send to log4j with partition_key: , partition: , message_key: and value: {"widget":{"debug":"on","window":{"title":"Sample Konfabulator Widget","name":"main_window","width":500,"height":500},"image":{"src":"Images/Sun.png","name":"sun1","hOffset":250,"vOffset":250,"alignment":"center"},"text":{"data":"Click Here","size":36,"style":"bold","name":"text1","hOffset":250,"vOffset":100,"alignment":"center","onMouseUp":"sun1.opacity = (sun1.opacity / 100) * 90;"}}}.
2018-10-21 12:33:38 +0800 [debug]: 3 messages send.
2018-10-21 12:33:38 +0800 [info]: New topics added to target list: log4j
2018-10-21 12:33:38 +0800 [info]: Fetching cluster metadata from kafka://192.168.5.129:9092
2018-10-21 12:33:38 +0800 [debug]: Opening connection to 192.168.5.129:9092 with client id kafka...
2018-10-21 12:33:38 +0800 fluent.trace: {"message":"message will send to log4j with partition_key: , partition: , message_key: and value: {\"widget\":{\"debug\":\"on\",\"window\":{\"title\":\"Sample Konfabulator Widget\",\"name\":\"main_window\",\"width\":500,\"height\":500},\"image\":{\"src\":\"Images/Sun.png\",\"name\":\"sun1\",\"hOffset\":250,\"vOffset\":250,\"alignment\":\"center\"},\"text\":{\"data\":\"Click Here\",\"size\":36,\"style\":\"bold\",\"name\":\"text1\",\"hOffset\":250,\"vOffset\":100,\"alignment\":\"center\",\"onMouseUp\":\"sun1.opacity = (sun1.opacity / 100) * 90;\"}}}."}
2018-10-21 12:33:38 +0800 fluent.trace: {"message":"message will send to log4j with partition_key: , partition: , message_key: and value: {\"widget\":{\"debug\":\"on\",\"window\":{\"title\":\"Sample Konfabulator Widget\",\"name\":\"main_window\",\"width\":500,\"height\":500},\"image\":{\"src\":\"Images/Sun.png\",\"name\":\"sun1\",\"hOffset\":250,\"vOffset\":250,\"alignment\":\"center\"},\"text\":{\"data\":\"Click Here\",\"size\":36,\"style\":\"bold\",\"name\":\"text1\",\"hOffset\":250,\"vOffset\":100,\"alignment\":\"center\",\"onMouseUp\":\"sun1.opacity = (sun1.opacity / 100) * 90;\"}}}."}
2018-10-21 12:33:38 +0800 fluent.trace: {"message":"message will send to log4j with partition_key: , partition: , message_key: and value: {\"widget\":{\"debug\":\"on\",\"window\":{\"title\":\"Sample Konfabulator Widget\",\"name\":\"main_window\",\"width\":500,\"height\":500},\"image\":{\"src\":\"Images/Sun.png\",\"name\":\"sun1\",\"hOffset\":250,\"vOffset\":250,\"alignment\":\"center\"},\"text\":{\"data\":\"Click Here\",\"size\":36,\"style\":\"bold\",\"name\":\"text1\",\"hOffset\":250,\"vOffset\":100,\"alignment\":\"center\",\"onMouseUp\":\"sun1.opacity = (sun1.opacity / 100) * 90;\"}}}."}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"3 messages send."}
2018-10-21 12:33:38 +0800 fluent.info: {"message":"New topics added to target list: log4j"}
2018-10-21 12:33:38 +0800 fluent.info: {"message":"Fetching cluster metadata from kafka://192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Opening connection to 192.168.5.129:9092 with client id kafka..."}
2018-10-21 12:33:38 +0800 [debug]: Sending sasl_handshake API request 1 to 192.168.5.129:9092
2018-10-21 12:33:38 +0800 [debug]: Waiting for response 1 from 192.168.5.129:9092
2018-10-21 12:33:38 +0800 [debug]: Received response 1 from 192.168.5.129:9092
2018-10-21 12:33:38 +0800 [debug]: GSSAPI: Initializing context with 192.168.5.129:9092, principal kafka/[email protected]
2018-10-21 12:33:38 +0800 [debug]: Sending topic_metadata API request 2 to 192.168.5.129:9092
2018-10-21 12:33:38 +0800 [debug]: Waiting for response 2 from 192.168.5.129:9092
2018-10-21 12:33:38 +0800 [debug]: Closing socket to 192.168.5.129:9092
2018-10-21 12:33:38 +0800 [debug]: Closing socket to 192.168.5.129:9092
2018-10-21 12:33:38 +0800 [error]: Failed to fetch metadata from kafka://192.168.5.129:9092: Connection error EOFError: end of file reached
2018-10-21 12:33:38 +0800 [warn]: Send exception occurred: Could not connect to any of the seed brokers:
- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached
2018-10-21 12:33:38 +0800 [warn]: Exception Backtrace : /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:396:in `fetch_cluster_info'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:350:in `cluster_info'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:98:in `refresh_metadata!'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:52:in `add_target_topics'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:357:in `deliver_messages_with_retries'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:247:in `block in deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `call'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `instrument'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:240:in `deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:285:in `deliver_messages'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:348:in `write'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'
/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'
2018-10-21 12:33:38 +0800 [info]: initialized kafka producer: kafka
2018-10-21 12:33:38 +0800 [warn]: temporarily failed to flush the buffer. next_retry=2018-10-21 12:33:39 +0800 error_class="Kafka::ConnectionError" error="Could not connect to any of the seed brokers:\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached" plugin_id="object:3f8168cb0bc8"
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:396:in `fetch_cluster_info'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:350:in `cluster_info'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:98:in `refresh_metadata!'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:52:in `add_target_topics'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:357:in `deliver_messages_with_retries'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:247:in `block in deliver_messages'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `call'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `instrument'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:240:in `deliver_messages'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:285:in `deliver_messages'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:348:in `write'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'
2018-10-21 12:33:38 +0800 [warn]: /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Sending sasl_handshake API request 1 to 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Waiting for response 1 from 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Received response 1 from 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"GSSAPI: Initializing context with 192.168.5.129:9092, principal kafka/[email protected]"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Sending topic_metadata API request 2 to 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Waiting for response 2 from 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Closing socket to 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.debug: {"message":"Closing socket to 192.168.5.129:9092"}
2018-10-21 12:33:38 +0800 fluent.error: {"message":"Failed to fetch metadata from kafka://192.168.5.129:9092: Connection error EOFError: end of file reached"}
2018-10-21 12:33:38 +0800 fluent.warn: {"message":"Send exception occurred: Could not connect to any of the seed brokers:\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached"}
2018-10-21 12:33:38 +0800 fluent.warn: {"message":"Exception Backtrace : /opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:396:in `fetch_cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:350:in `cluster_info'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:98:in `refresh_metadata!'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/cluster.rb:52:in `add_target_topics'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:357:in `deliver_messages_with_retries'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:247:in `block in deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `call'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/instrumenter.rb:23:in `instrument'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/ruby-kafka-0.7.3/lib/kafka/producer.rb:240:in `deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:285:in `deliver_messages'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluent-plugin-kafka-0.8.0/lib/fluent/plugin/out_kafka_buffered.rb:348:in `write'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:354:in `write_chunk'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/buffer.rb:333:in `pop'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:342:in `try_flush'\n/opt/td-agent/embedded/lib/ruby/gems/2.1.0/gems/fluentd-0.12.40/lib/fluent/output.rb:149:in `run'"}
2018-10-21 12:33:38 +0800 fluent.info: {"message":"initialized kafka producer: kafka"}
2018-10-21 12:33:38 +0800 fluent.warn: {"next_retry":"2018-10-21 12:33:39 +0800","error_class":"Kafka::ConnectionError","error":"Could not connect to any of the seed brokers:\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached","plugin_id":"object:3f8168cb0bc8","message":"temporarily failed to flush the buffer. next_retry=2018-10-21 12:33:39 +0800 error_class=\"Kafka::ConnectionError\" error=\"Could not connect to any of the seed brokers:\\n- kafka://192.168.5.129:9092: Connection error EOFError: end of file reached\" plugin_id=\"object:3f8168cb0bc8\""}

The following are my local gems:
*** LOCAL GEMS ***

actionmailer (4.2.8)
actionpack (4.2.8)
actionview (4.2.8)
activejob (4.2.8)
activemodel (4.2.8)
activerecord (4.2.8)
activesupport (4.2.8)
addressable (2.5.2, 2.5.1)
arel (6.0.4)
aws-sdk (2.10.45)
aws-sdk-core (2.10.45)
aws-sdk-resources (2.10.45)
aws-sigv4 (1.0.2)
base91 (0.0.1)
bigdecimal (default: 1.2.4)
bson (4.1.1)
builder (3.2.3)
bundler (1.14.5)
celluloid (0.15.2)
cool.io (1.5.1)
diff-lcs (1.3)
digest-crc (0.4.1)
draper (1.4.0)
elasticsearch (5.0.5)
elasticsearch-api (5.0.5)
elasticsearch-transport (5.0.5)
erubis (2.7.0)
excon (0.62.0)
faraday (0.13.1)
ffi (1.9.25)
fluent-logger (0.7.1)
fluent-mixin-plaintextformatter (0.2.6)
fluent-plugin-elasticsearch (1.17.2)
fluent-plugin-genhashvalue (0.04)
fluent-plugin-kafka (0.8.0)
fluent-plugin-mongo (0.8.1)
fluent-plugin-rewrite-tag-filter (1.5.6)
fluent-plugin-s3 (0.8.5)
fluent-plugin-scribe (0.10.14)
fluent-plugin-td (0.10.29)
fluent-plugin-td-monitoring (0.2.3)
fluent-plugin-webhdfs (0.7.1)
fluentd (0.12.40)
fluentd-ui (0.4.4)
font-awesome-rails (4.7.0.1)
globalid (0.4.0)
gssapi (1.2.0)
haml (4.0.7)
haml-rails (0.5.3)
hike (1.2.3)
hirb (0.7.3)
http_parser.rb (0.6.0)
httpclient (2.8.2.4)
i18n (0.8.1)
io-console (default: 0.4.3)
ipaddress (0.8.3)
jbuilder (2.6.3)
jmespath (1.3.1)
jquery-rails (3.1.4)
json (default: 1.8.1)
kramdown (1.13.2)
kramdown-haml (0.0.3)
loofah (2.0.3)
ltsv (0.1.0)
mail (2.6.4)
mime-types (3.1)
mime-types-data (3.2016.0521)
mini_portile2 (2.3.0, 2.1.0)
minitest (5.10.1, default: 4.7.5)
mixlib-cli (1.7.0)
mixlib-config (2.2.4)
mixlib-log (1.7.1)
mixlib-shellout (2.2.7)
mongo (2.2.7)
msgpack (1.1.0)
multi_json (1.12.1)
multipart-post (2.0.0)
nokogiri (1.8.1)
ohai (6.20.0)
oj (2.18.1)
parallel (1.8.0)
psych (default: 2.0.5)
public_suffix (3.0.0, 2.0.5)
puma (3.8.2)
rack (1.6.5)
rack-test (0.6.3)
rails (4.2.8)
rails-deprecated_sanitizer (1.0.3)
rails-dom-testing (1.0.8)
rails-html-sanitizer (1.0.3)
railties (4.2.8)
rake (default: 10.1.0)
rdoc (default: 4.1.0)
request_store (1.3.2)
ruby-kafka (0.7.3)
ruby-progressbar (1.8.3)
rubyzip (1.2.1, 1.1.7)
sass (3.2.19)
sass-rails (4.0.5)
settingslogic (2.0.9)
sigdump (0.2.4)
sprockets (2.12.4)
sprockets-rails (2.3.3)
string-scrub (0.0.5)
sucker_punch (1.0.5)
systemu (2.5.2)
td (0.15.2)
td-client (0.8.85)
td-logger (0.3.27)
test-unit (default: 2.1.10.0)
thor (0.19.4)
thread_safe (0.3.6)
thrift (0.8.0)
tilt (1.4.1)
timers (1.1.0)
tzinfo (1.2.3)
tzinfo-data (1.2017.2)
uuidtools (2.1.5)
webhdfs (0.8.0)
yajl-ruby (1.3.0)
zip-zip (0.3)

Please help me. Thanks a lot.

@repeatedly
Member

Does anyone know how to fix this problem?
The Kerberos feature was contributed by a user, and it depends on ruby-kafka's Kerberos support.
So Kerberos support should work in a basic setup, but I'm not sure what is wrong.


openchung commented Oct 23, 2018

I have asked the question in ruby-kafka's issue tracker: #670. We have an urgent need for help.

@mihir2402

@openchung I am facing the same issue with Kerberos over GSSAPI. Was anyone able to fix it?

@dborysenko

Exact same issue in my environment.
Is there a success story of working GSSAPI?


frencopei commented Mar 26, 2020

Same issue in my environment, with keytab and principal but no krb5.conf.

I think the config is not complete; you won't find the KDC server address without krb5.conf. Any success story?
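For reference, the KDC address normally comes from /etc/krb5.conf (or the file pointed to by the KRB5_CONFIG environment variable), so the GSSAPI library needs it even when the keytab and principal are correct. A minimal sketch; the realm matches the one used earlier in this thread, but `kdc1.dispatch.com` is a hypothetical KDC hostname:

```ini
# /etc/krb5.conf — minimal sketch; kdc1.dispatch.com is a hypothetical KDC host
[libdefaults]
    default_realm = DISPATCH.COM

[realms]
    DISPATCH.COM = {
        kdc = kdc1.dispatch.com
        admin_server = kdc1.dispatch.com
    }

[domain_realm]
    .dispatch.com = DISPATCH.COM
    dispatch.com = DISPATCH.COM
```

If kinit works on the same host, a usable krb5.conf is already in place and this is probably not the missing piece there.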

@rymonroe

rymonroe commented Jun 3, 2020

I have the same issue; I've confirmed the keytab/principal with kinit. Did anyone get this working?

@xidiandb

> I think the config is not complete; you won't find the KDC server address without krb5.conf. Any success story?

Solved?

@github-actions

github-actions bot commented Jul 6, 2021

This issue has been automatically marked as stale because it has been open 90 days with no activity. Remove stale label or comment or this issue will be closed in 30 days

@github-actions github-actions bot added the stale label Jul 6, 2021
@kenhys kenhys removed the stale label Jul 7, 2021
@github-actions

github-actions bot commented Oct 5, 2021

This issue has been automatically marked as stale because it has been open 90 days with no activity. Remove stale label or comment or this issue will be closed in 30 days

@github-actions github-actions bot added the stale label Oct 5, 2021
@ashie ashie added bug and removed stale labels Oct 5, 2021
@victorpalmeida

I have the same problem. I've provided a principal and keytab, and I can validate with kinit that both are valid; it returns a Kerberos ticket. Any news or a success story?

@swananddh

swananddh commented Dec 9, 2021

@ashie could you please let us know whether this issue is resolved? If any workaround is available for the above issue, that would be helpful.

@Thor77

Thor77 commented Dec 9, 2021

As long as the issue in the Ruby library (zendesk/ruby-kafka#670) is not solved, this one can't be solved either :/

@dborysenko

I'd suggest using rdkafka instead. It works fine with Kerberos authentication in Fluentd.
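For anyone landing here later: fluent-plugin-kafka also ships rdkafka-based output types backed by librdkafka, which implements GSSAPI itself. A sketch of what that might look like; the `sasl.*` and `ssl.*` names are real librdkafka configuration properties, but treat the exact plugin parameters and required plugin/Fluentd versions as assumptions to verify against the plugin README:

```
<match audit.*>
  @type rdkafka2
  brokers 192.168.5.129:9093
  default_topic log4j
  # librdkafka properties passed through to the client
  # (names follow librdkafka's CONFIGURATION.md; paths reuse this thread's examples)
  rdkafka_options {"security.protocol":"sasl_ssl","sasl.mechanisms":"GSSAPI","sasl.kerberos.service.name":"kafka","sasl.kerberos.principal":"kafka/[email protected]","sasl.kerberos.keytab":"/opt/kafka.keytab","ssl.ca.location":"/opt/server.cer.pem"}
</match>
```

Because librdkafka handles the Kerberos handshake, this path does not depend on the gssapi gem or on ruby-kafka's GSSAPI support at all.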
