Replies: 2 comments 2 replies
-
Hello @mack-cope, can you set the log level to DEBUG in your application and post those logs here, please? Thanks
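For reference, since the poster's logs show log4net with a root logger and a ConsoleAppender, raising the level to DEBUG would look roughly like this (a sketch of a log4net XML config fragment, not the poster's actual file):

```xml
<log4net>
  <root>
    <!-- DEBUG captures everything the appender's Threshold allows through -->
    <level value="DEBUG" />
    <appender-ref ref="ConsoleAppender" />
  </root>
</log4net>
```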
-
Hi @mack-cope, could you activate librdkafka debug logs using this property: config.Debug = "generic,broker,metadata,security,admin,consumer,cgrp,topic"? Please attach these logs to the ticket. I suspect a misconfiguration when connecting to your cluster.
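In Streamiz, that property sits on the stream configuration object alongside the broker and security settings. A minimal sketch (the application id and bootstrap server are taken from the logs below; serde choices and any other settings are assumptions):

```csharp
using Streamiz.Kafka.Net;
using Streamiz.Kafka.Net.SerDes;

// Sketch: enabling librdkafka debug output on a Streamiz StreamConfig.
// The Debug value is forwarded to librdkafka's `debug` client property.
var config = new StreamConfig<StringSerDes, ByteArraySerDes>
{
    ApplicationId = "event-data-stream-processor-kafka-dotnet",
    BootstrapServers = "bootstrapserver:9092",
    Debug = "generic,broker,metadata,security,admin,consumer,cgrp,topic"
};
```

Note that the logs below show `debug: consumer`, i.e. only consumer-level debugging was enabled; the broader value above also surfaces broker, security, and metadata events, which is what helps diagnose connection problems.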
-
Hello,
My team has created a stream processor that we have validated locally, but when we run it against our non-prod Kafka cluster we see no errors, yet also no output. We have observed the following:
- Local Environment
- Non-Prod Environment
We have given the application the ACL permissions we believe are necessary, as specified here. Based on those permissions, it should be able to create its internal topics.
Any ideas on what we might be missing? Has anyone seen this type of issue?
Thanks!
Logs:
log4net: Configuration update mode [Merge].
log4net: Logger [root] Level string is [DEBUG].
log4net: Logger [root] level set to [name="DEBUG",value=30000].
log4net: Loading Appender [ConsoleAppender] type: [log4net.Appender.ConsoleAppender]
log4net: Setting Property [Threshold] to Level value [DEBUG]
log4net: Converter [message] Option [] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [newline] Option [] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Setting Property [ConversionPattern] to String value [%d [%t] %-5p %c - %m%n]
log4net: Converter [d] Option [] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [literal] Option [ [] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [t] Option [] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [literal] Option [] ] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [p] Option [] Format [min=5,max=2147483647,leftAlign=True]
log4net: Converter [literal] Option [ ] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [c] Option [] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [literal] Option [ - ] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [m] Option [] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Converter [n] Option [] Format [min=-1,max=2147483647,leftAlign=False]
log4net: Setting Property [Layout] to object [log4net.Layout.PatternLayout]
log4net: Created Appender [ConsoleAppender]
log4net: Adding appender named [ConsoleAppender] to logger [root].
log4net: Hierarchy Threshold []
2022-03-18 14:12:40 DEBUG - Hosting starting
2022-03-18 14:12:40 DEBUG - Starting Application...
2022-03-18 14:12:40 DEBUG - Validating topics...
2022-03-18 14:12:40 DEBUG - Starting Stream Processor.
2022-03-18 14:12:40,981 [1] INFO Streamiz.Kafka.Net.KafkaStream - stream-application[event-data-stream-processor-kafka-dotnet] Start creation of the stream application with this configuration:
Stream property:
client.id: efw.dev.eventdatakstreamprocessor
num.stream.threads: 1
default.key.serdes: Streamiz.Kafka.Net.SerDes.StringSerDes
default.value.serdes: Streamiz.Kafka.Net.SerDes.ByteArraySerDes
default.timestamp.extractor: Streamiz.Kafka.Net.Processors.Internal.FailOnInvalidTimestamp
commit.interval.ms: 30000
processing.guarantee: AT_LEAST_ONCE
transaction.timeout: 00:00:10
poll.ms: 100
max.poll.records: 500
max.poll.restoring.records: 1000
max.task.idle.ms: 0
buffered.records.per.partition: 1000
follow.metadata: False
state.dir: /tmp/streamiz-kafka-net
replication.factor: 1
windowstore.changelog.additional.retention.ms: 86400000
offset.checkpoint.manager:
application.id: event-data-stream-processor-kafka-dotnet
Client property:
bootstrap.servers: bootstrapserver:9092
debug: consumer
security.protocol: ssl
ssl.ca.location: /etc/kafkaclientcert/cacert.crt
ssl.certificate.location: /etc/kafkaclientcert/kafkaclient.pem
ssl.key.location: /etc/kafkaclientcert/kafkaclient.key
ssl.key.password: ********
Consumer property:
max.poll.interval.ms: 300000
enable.auto.commit: False
enable.auto.offset.store: False
partition.assignment.strategy: cooperative-sticky
auto.offset.reset: earliest
Producer property:
None
Admin client property:
None
2022-03-18 14:12:41,316 [1] INFO Streamiz.Kafka.Net.Processors.StreamThread - stream-thread[event-data-stream-processor-kafka-dotnet-stream-thread-0] Creating shared producer client
2022-03-18 14:12:41,412 [1] INFO Streamiz.Kafka.Net.Processors.StreamThread - stream-thread[event-data-stream-processor-kafka-dotnet-stream-thread-0] Creating consumer client
2022-03-18 14:12:41,438 [.NET ThreadPool Worker] INFO Streamiz.Kafka.Net.KafkaStream - stream-application[event-data-stream-processor-kafka-dotnet] State transition from CREATED to REBALANCING
2022-03-18 14:12:41,482 [.NET ThreadPool Worker] INFO Streamiz.Kafka.Net.KafkaStream - stream-application[event-data-stream-processor-kafka-dotnet] Starting Streams client with this topology : Topologies:
Sub-topology: 0
Source: KSTREAM-SOURCE-0000000000 (topics: [salesforce.dev.productiondetails.event])
--> KSTREAM-PEEK-0000000001
Processor: KSTREAM-PEEK-0000000001 (stores: [])
--> KSTREAM-SINK-0000000002
<-- KSTREAM-SOURCE-0000000000
Sink: KSTREAM-SINK-0000000002 (topic: businesstopic.eventdata.dev.sem.enterprise.aggregate)
<-- KSTREAM-PEEK-0000000001
2022-03-18 14:12:42 INFORMATION - Application started. Press Ctrl+C to shut down.
2022-03-18 14:12:42 INFORMATION - Hosting environment: Production
2022-03-18 14:12:42 INFORMATION - Content root path: /app/KafkaStreamsDotnetEventDataStreamProcessor/
2022-03-18 14:12:42 DEBUG - Hosting started
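For reference, the `Sub-topology: 0` printed in the startup log above (source → peek → sink, no state stores) corresponds roughly to a Streamiz builder like this. Topic names are taken from the log; the peek body is a hypothetical placeholder, not the team's actual code:

```csharp
using System;
using Streamiz.Kafka.Net;

// Sketch of the topology shown in the log: a stateless pass-through
// with a Peek for side-effect logging between source and sink.
var builder = new StreamBuilder();

builder.Stream<string, byte[]>("salesforce.dev.productiondetails.event")
       .Peek((k, v) => Console.WriteLine($"key={k}, {v?.Length ?? 0} bytes"))
       .To("businesstopic.eventdata.dev.sem.enterprise.aggregate");

var topology = builder.Build();
```

Since this topology has no stores or repartitioning, it creates no internal topics; "no errors, no output" with only `debug: consumer` enabled usually points at the consumer never receiving partitions or data, which the fuller librdkafka debug categories suggested above would reveal.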