Become Confluent Certified with updated CCDAK exam questions and correct answers
You want to sink data from a Kafka topic to S3 using Kafka Connect. There are 10 brokers in the cluster, and the topic has 2 partitions with a replication factor of 3. How many tasks will you configure for the S3 connector?
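The key fact for this question is that a sink connector can run at most one useful task per topic partition, so the broker count and replication factor are irrelevant here. A minimal sketch of such a connector config, assuming the Confluent S3 sink connector and hypothetical topic/bucket names:

```properties
name=s3-sink
connector.class=io.confluent.connect.s3.S3SinkConnector
# With 2 partitions, only 2 tasks can do useful work,
# regardless of the 10 brokers in the cluster.
tasks.max=2
topics=my-topic
s3.bucket.name=my-bucket
# (hypothetical names; storage and format settings omitted)
```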
When auto.create.topics.enable is set to true in Kafka configuration, what are the circumstances under which a Kafka broker automatically creates a topic? (select three)
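For context, auto-creation is triggered by a client touching a nonexistent topic (for example, a metadata request from a producer send or a consumer fetch), and the new topic uses broker-side defaults. A hedged broker-config sketch showing the settings involved:

```properties
auto.create.topics.enable=true
# Auto-created topics pick up these broker defaults:
num.partitions=1
default.replication.factor=1
```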
while (true) {
    ConsumerRecords<String, String> records = consumer.poll(100);
    try {
        consumer.commitSync();
    } catch (CommitFailedException e) {
        log.error("commit failed", e);
    }
    for (ConsumerRecord<String, String> record : records) {
        System.out.printf("topic = %s, partition = %s, offset = %d, customer = %s, country = %s%n",
                record.topic(), record.partition(),
                record.offset(), record.key(), record.value());
    }
}
What kind of delivery guarantee does this consumer offer?
A producer is sending messages with a null key to a topic with 6 partitions using the DefaultPartitioner. Where will the messages be stored?
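With a null key, the older (pre-2.4) DefaultPartitioner spreads records round-robin across all six partitions (Kafka 2.4+ instead uses sticky partitioning, filling one partition per batch). A minimal standalone sketch of the round-robin idea; NullKeyRoundRobin is a hypothetical illustration, not a Kafka API:

```java
public class NullKeyRoundRobin {
    private int counter = 0;

    // Round-robin partition choice for null-key records,
    // mimicking the pre-2.4 DefaultPartitioner behavior.
    public int partition(int numPartitions) {
        return (counter++) % numPartitions;
    }

    public static void main(String[] args) {
        NullKeyRoundRobin p = new NullKeyRoundRobin();
        for (int i = 0; i < 8; i++) {
            System.out.print(p.partition(6) + " ");
        }
        // prints: 0 1 2 3 4 5 0 1
        System.out.println();
    }
}
```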
© Copyrights DumpsCertify 2026. All Rights Reserved