
CCDAK Confluent Certified Developer for Apache Kafka Certification Examination Questions and Answers

Question 4

You need to correctly join data from two Kafka topics.

Which two scenarios will allow for co-partitioning?

(Select two.)

Options:

A. Both topics have the same number of partitions.
B. Both topics have the same key and partitioning strategy.
C. Both topics have the same value schema.
D. Both topics have the same retention time.
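For intuition, here is a minimal Python sketch of why co-partitioning requires the same partition count and the same keying strategy on both topics. `zlib.crc32` is only a stand-in for Kafka's murmur2 partitioner, and the topic sizes are invented for illustration:

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    # crc32 stands in for Kafka's murmur2 hash (assumption for illustration).
    return zlib.crc32(key.encode()) % num_partitions

def co_partitioned(partitions_a: int, partitions_b: int) -> bool:
    # A key-based join lines up only if both topics have the same
    # partition count (and, implicitly, the same key and partitioner).
    return partitions_a == partitions_b

# With equal partition counts, a given key maps to the same partition
# number on both sides of the join.
assert co_partitioned(6, 6)
assert partition_for("user-42", 6) == partition_for("user-42", 6)

# With different counts, the same key can land on different partition
# numbers, so the join inputs no longer line up.
assert not co_partitioned(6, 8)
```

Value schema and retention time play no role in where a record lands, which is why those options do not affect co-partitioning.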

Question 5

You need to collect logs from a host and write them to a Kafka topic named 'logs-topic'. You decide to use Kafka Connect File Source connector for this task.

What is the preferred deployment mode for this connector?

Options:

A. Standalone mode
B. Distributed mode
C. Parallel mode
D. SingleCluster mode
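For reference, a File Source connector in standalone mode is driven by a small properties file; the connector name and file path below are hypothetical:

```properties
# file-source.properties -- hypothetical names/paths for illustration
name=local-file-source
connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
tasks.max=1
file=/var/log/app.log
topic=logs-topic
```

Standalone mode runs a single worker from the command line (e.g. `connect-standalone.sh worker.properties file-source.properties`), which fits reading a local file on one host; distributed mode workers have no shared view of a single host's filesystem.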

Question 6

You have a topic with four partitions. The application reads from it using two consumers in a single consumer group.

Processing is CPU-bound, and lag is increasing.

What should you do?

Options:

A. Add more consumers to increase the level of parallelism of the processing.
B. Add more partitions to the topic to increase the level of parallelism of the processing.
C. Increase the max.poll.records property of consumers.
D. Decrease the max.poll.records property of consumers.
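The reasoning can be sketched in a few lines of Python; the numbers mirror the scenario above:

```python
def active_consumers(partitions: int, consumers: int) -> int:
    # Each partition is assigned to at most one consumer in a group,
    # so useful parallelism is capped by the partition count.
    return min(partitions, consumers)

# Four partitions, two consumers: there is headroom, so adding
# consumers (up to four) increases CPU-bound processing parallelism.
assert active_consumers(4, 2) == 2
assert active_consumers(4, 4) == 4

# Beyond four consumers, the extras sit idle with no partition assigned.
assert active_consumers(4, 6) == 4
```

Tuning `max.poll.records` changes batch sizes per poll, not the number of CPUs doing the work, so it does not address CPU-bound lag.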

Question 7

Your application is consuming from a topic using a consumer configured with a deserializer.

It needs to be resilient to badly formatted records ("poison pills"). You surround the poll() call with a try/catch for RecordDeserializationException.

You need to log the bad record, skip it, and continue processing.

Which action should you take in the catch block?

Options:

A. Log the bad record, no other action needed.
B. Log the bad record and seek the consumer to the offset of the next record.
C. Log the bad record and call the consumer.skip() method.
D. Throw a runtime exception to trigger a restart of the application.
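Here is a runnable Python sketch of the seek-past-the-poison-pill pattern. The consumer class is a toy stand-in, not the real Kafka client API; the key idea is that after a deserialization failure the consumer's position has not advanced, so you must seek to the failed offset plus one before polling again:

```python
class DeserializationError(Exception):
    """Toy analogue of RecordDeserializationException."""
    def __init__(self, partition: int, offset: int):
        self.partition, self.offset = partition, offset

class FakeConsumer:
    """Toy stand-in for a Kafka consumer (illustration only)."""
    def __init__(self, records):
        self.records = records      # b"BAD" simulates a poison pill
        self.position = 0
    def poll(self):
        if self.position >= len(self.records):
            return None
        rec = self.records[self.position]
        if rec == b"BAD":
            # Position does NOT advance on failure -- just like the real client.
            raise DeserializationError(partition=0, offset=self.position)
        self.position += 1
        return rec
    def seek(self, partition: int, offset: int):
        self.position = offset

consumer = FakeConsumer([b"ok-1", b"BAD", b"ok-2"])
processed, skipped = [], []
while True:
    try:
        rec = consumer.poll()
    except DeserializationError as e:
        skipped.append(e.offset)                   # log the bad record...
        consumer.seek(e.partition, e.offset + 1)   # ...then seek past it
        continue
    if rec is None:
        break
    processed.append(rec)

assert processed == [b"ok-1", b"ok-2"] and skipped == [1]
```

Without the seek, the next poll would hit the same bad record forever; there is no `consumer.skip()` method in the Kafka consumer API.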

Question 8

You need to consume messages from Kafka using the command-line interface (CLI).

Which command should you use?

Options:

A. kafka-console-consumer
B. kafka-consumer
C. kafka-get-messages
D. kafka-consume
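Typical usage of the console consumer looks like the following; the broker address and topic name are placeholders:

```shell
# Read a topic from the beginning; requires a reachable broker.
kafka-console-consumer --bootstrap-server localhost:9092 \
  --topic logs-topic --from-beginning
```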

Question 9

A stream processing application is consuming from a topic with five partitions. You run three instances of the application. Each instance has num.stream.threads=5.

You need to identify the number of stream tasks that will be created and how many will actively consume messages from the input topic.

Options:

A. 5 created, 1 actively consuming
B. 5 created, 5 actively consuming
C. 15 created, 5 actively consuming
D. 15 created, 15 actively consuming
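The task arithmetic can be sketched in Python, assuming a single sub-topology (Kafka Streams creates one task per input partition for each sub-topology, regardless of how many threads are configured):

```python
def stream_tasks(input_partitions: int) -> int:
    # One task per input partition for a single sub-topology.
    return input_partitions

partitions = 5
instances, threads_per_instance = 3, 5

tasks = stream_tasks(partitions)                   # tasks created
total_threads = instances * threads_per_instance   # 15 threads available
active = min(tasks, total_threads)                 # only 5 threads get a task

assert (tasks, total_threads, active) == (5, 15, 5)
```

Threads are an upper bound on parallelism, not a multiplier on tasks: the ten surplus threads simply have nothing assigned.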

Question 10

Which statement is true about topic compaction?

A. When a client produces a new event with an existing key, the old value is overwritten with the new value in the compacted log segment.
B. When a client produces a new event with an existing key, the broker immediately deletes the offset of the existing event.
C. Topic compaction does not remove old events; instead, when clients consume events from a compacted topic, they store events in a hashmap that maintains the latest value.
D. Compaction will keep exactly one message per key after compaction of inactive log segments.
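A minimal Python sketch of what compaction eventually retains. Note that real compaction runs lazily on inactive (closed) log segments, never on the active segment, which is why "immediately" is the wrong framing in the options above:

```python
def compact(log):
    """Keep the latest value per key, preserving the survivors' relative order."""
    latest = {}
    for offset, (key, value) in enumerate(log):
        latest[key] = (offset, value)  # later offsets win
    return [(k, v) for k, (_, v) in
            sorted(latest.items(), key=lambda kv: kv[1][0])]

log = [("k1", "v1"), ("k2", "v2"), ("k1", "v3")]
# After compaction only the newest value per key remains; k1's old value is gone.
assert compact(log) == [("k2", "v2"), ("k1", "v3")]
```

Compaction happens broker-side; consumers do not need to maintain their own latest-value hashmap to get this behavior.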

Question 11

An application is consuming messages from Kafka.

The application logs show that partitions are frequently being reassigned within the consumer group.

Which two factors may be contributing to this?

(Select two.)

Options:

A. There is a slow consumer processing application.
B. The number of partitions does not match the number of application instances.
C. There is a storage issue on the broker.
D. An instance of the application is crashing and being restarted.
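One common trigger can be sketched as a simple check. The 300000 ms default for `max.poll.interval.ms` is the real consumer default; the processing times below are invented:

```python
def likely_to_rebalance(processing_ms: int,
                        max_poll_interval_ms: int = 300_000) -> bool:
    # If a consumer takes longer than max.poll.interval.ms between poll()
    # calls (slow processing), the group coordinator considers it dead and
    # reassigns its partitions. A crashing/restarting instance triggers
    # the same reassignment each time it leaves and rejoins the group.
    return processing_ms > max_poll_interval_ms

assert likely_to_rebalance(600_000)    # slow consumer -> frequent rebalances
assert not likely_to_rebalance(1_000)  # fast consumer -> stable assignment
```

A partition/instance count mismatch only causes uneven assignment, not repeated reassignment, and broker storage issues do not by themselves churn group membership.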

Question 12

You create a topic named stream-logs with:

    A replication factor of 3

    Four partitions

    Messages that are plain logs without a key

How will messages be distributed across partitions?

Options:

A. The first message will always be written to partition 0.
B. Messages will be distributed round-robin among all the topic partitions.
C. All messages will be written to the same log segment.
D. Messages will be distributed among all the topic partitions with strict ordering.
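A sketch of classic round-robin placement for keyless messages. (Newer Java clients default to a "sticky" partitioner that batches keyless records per partition, but it still balances writes across partitions over time.)

```python
from itertools import cycle

def assign_round_robin(messages, num_partitions):
    # With no key, there is no hash to compute; the classic default
    # partitioner simply cycles through the partitions.
    rr = cycle(range(num_partitions))
    return [(msg, next(rr)) for msg in messages]

placed = assign_round_robin(["log-a", "log-b", "log-c", "log-d", "log-e"], 4)
# Five keyless messages over four partitions: 0, 1, 2, 3, then wrap to 0.
assert [p for _, p in placed] == [0, 1, 2, 3, 0]
```

Ordering is only guaranteed within a single partition, so spreading keyless messages across four partitions gives no strict global ordering.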

Question 13

A stream processing application is tracking user activity in online shopping carts.

You want to identify periods of user inactivity.

Which type of Kafka Streams window should you use?

Options:

A. Sliding
B. Tumbling
C. Hopping
D. Session
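A minimal Python sketch of session windowing by inactivity gap. The 30-second gap and the click timestamps are invented; unlike tumbling or hopping windows, session window boundaries are defined by the data itself:

```python
def session_windows(event_times, gap):
    """Group sorted event timestamps into sessions split by inactivity > gap."""
    windows, current = [], [event_times[0]]
    for t in event_times[1:]:
        if t - current[-1] <= gap:
            current.append(t)
        else:                    # inactivity gap exceeded -> close the session
            windows.append(current)
            current = [t]
    windows.append(current)
    return windows

# Two bursts of cart activity separated by a quiet period longer than 30s.
clicks = [0, 10, 25, 120, 130]
assert session_windows(clicks, gap=30) == [[0, 10, 25], [120, 130]]
```

The gap between sessions (25 to 120 above) is exactly the "period of user inactivity" the application wants to identify.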

Question 14

Clients that connect to a Kafka cluster are required to specify one or more brokers in the bootstrap.servers parameter.

What is the primary advantage of specifying more than one broker?

Options:

A. It provides redundancy in making the initial connection to the Kafka cluster.
B. It forces clients to enumerate every single broker in the cluster.
C. It is the mechanism to distribute a topic’s partitions across multiple brokers.
D. It provides the ability to wake up dormant brokers.
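The redundancy argument can be sketched as a simple fallback loop; the broker names are placeholders:

```python
def first_reachable(bootstrap_servers, reachable):
    # The client needs only ONE reachable bootstrap broker; from it, the
    # client fetches full cluster metadata (all brokers, all partitions).
    # Listing several servers is redundancy for that first connection,
    # not a requirement to enumerate every broker in the cluster.
    for server in bootstrap_servers:
        if server in reachable:
            return server
    raise ConnectionError("no bootstrap broker reachable")

servers = ["broker1:9092", "broker2:9092", "broker3:9092"]
# Even with broker1 down, the initial connection succeeds via broker2.
assert first_reachable(servers, reachable={"broker2:9092", "broker3:9092"}) \
    == "broker2:9092"
```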

Question 15

Your Kafka cluster has five brokers. The topic t1 on the cluster has:

    Two partitions

    Replication factor = 4

    min.insync.replicas = 3

You need strong durability guarantees for messages written to topic t1. You configure a producer with acks=all, and all the replicas for t1 are in-sync. How many brokers need to acknowledge a message before it is considered committed?

Options:

A. 2
B. 3
C. 4
D. 5
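The arithmetic can be sketched in Python. With acks=all, the leader waits for the entire current in-sync replica set; min.insync.replicas is only the floor below which writes are rejected, not the number of acknowledgments collected:

```python
def acks_required(replication_factor: int, in_sync_replicas: int,
                  min_insync: int = 3, acks: str = "all") -> int:
    assert in_sync_replicas <= replication_factor
    if in_sync_replicas < min_insync:
        # Too few in-sync replicas: acks=all writes are rejected outright.
        raise RuntimeError("NotEnoughReplicas")
    if acks == "all":
        # The leader waits for every replica currently in the ISR.
        return in_sync_replicas
    return int(acks)  # acks=0 or acks=1

# Topic t1: replication factor 4, all 4 replicas in sync, acks=all.
assert acks_required(replication_factor=4, in_sync_replicas=4) == 4
```

The five-broker cluster size is a distractor: only the four brokers holding replicas of a partition participate in acknowledging its writes.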

Question 16

Match the topic configuration setting with the reason the setting affects topic durability.

(You are given settings like unclean.leader.election.enable=false, replication.factor, min.insync.replicas=2)

[Exhibit image: CCDAK Question 16]

Options:

Question 17

You create a producer that writes messages about bank account transactions from tens of thousands of different customers into a topic.

    Your consumers must process these messages with low latency and minimize consumer lag

    Processing takes ~6x longer than producing

    Transactions for each bank account must be processed in order

Which strategy should you use?

Options:

A. Use the timestamp of the message's arrival as its key.
B. Use the bank account number found in the message as the message key.
C. Use a combination of the bank account number and the transaction timestamp as the message key.
D. Use a unique identifier such as a universally unique identifier (UUID) as the message key.
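A sketch of why keying by account number preserves per-account order; `zlib.crc32` stands in for Kafka's murmur2 hash, and the account IDs and partition count are invented:

```python
import zlib

def partition_for(key: str, num_partitions: int) -> int:
    # crc32 stands in for Kafka's murmur2 hash (illustration only).
    return zlib.crc32(key.encode()) % num_partitions

# Keying by account number routes every transaction for one account to the
# same partition, so per-account ordering is preserved, while tens of
# thousands of distinct accounts still spread across all partitions for
# the parallelism needed to keep lag low.
p_first = partition_for("acct-1001", 12)
p_second = partition_for("acct-1001", 12)
assert p_first == p_second  # same account -> same partition -> in order
```

A timestamp, UUID, or account-plus-timestamp key scatters one account's transactions across partitions, destroying the per-account ordering guarantee.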

Question 18

Match each configuration parameter with the correct deployment step in installing a Kafka connector.

[Exhibit image: CCDAK Question 18]

Options:

Exam Code: CCDAK
Exam Name: Confluent Certified Developer for Apache Kafka Certification Examination
Last Update: Jul 12, 2025
Questions: 61
