#jackdaw
2020-09-24
bringe20:09:52

Hi, I'm trying to set auto.register.schemas to false for my kafka producer, but I'm having trouble figuring out where that setting is supposed to go. If I set it in my producer config, I get this warning in the logs:

WARN org.apache.kafka.clients.producer.ProducerConfig - The configuration 'auto.register.schemas' was supplied but isn't a known config.

bringe20:09:10

And I verified that schemas are being auto registered in this case

gklijs20:09:17

It’s the correct place to put it; the warning is just annoying/misleading.

bringe20:09:38

But I checked, and the schemas are in fact being auto-registered. I made a new topic with no schema set, produced to it with my producer, and it registered the schema, which I don't want to happen.

bringe20:09:51

So it seems this config value is ignored

gklijs20:09:15

You might also need to set AbstractKafkaSchemaSerDeConfig/USE_LATEST_VERSION to true. It’s a bit weird, but that’s how they implemented it.
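Something like this, as a sketch (the plain config-key strings shown are what those static fields stand for in the Confluent serializers):

```clojure
;; Sketch: the two settings together, using the raw config-key strings
;; that the Confluent static fields resolve to.
{"auto.register.schemas" false  ; don't register schemas from the producer
 "use.latest.version"    true}  ; serialize with the latest registered schema
```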

bringe20:09:30

Ohh right, okay

gklijs20:09:36

It doesn’t make much sense to me, because of course I want to use the latest available schema if I’m not registering one; which other one would be used? Spent a lot of time before figuring that out..

bringe20:09:39

Yeah, seems like this should be handled if auto register is false

bringe20:09:33

This is my config

(def producer-config
  {ProducerConfig/BOOTSTRAP_SERVERS_CONFIG "localhost:9092"
   ProducerConfig/ENABLE_IDEMPOTENCE_CONFIG true
   ProducerConfig/CLIENT_ID_CONFIG "commander"
   AbstractKafkaAvroSerDeConfig/AUTO_REGISTER_SCHEMAS false})
Which is passed to jackdaw.client/producer

gklijs20:09:30

Is jackdaw using the Avro serializers by default? At least with the base Java client you also need ProducerConfig/KEY_SERIALIZER_CLASS_CONFIG, ProducerConfig/VALUE_SERIALIZER_CLASS_CONFIG, and AbstractKafkaSchemaSerDeConfig/SCHEMA_REGISTRY_URL_CONFIG
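Roughly, a base-Java-client-style producer config with those entries might look like this (a sketch; the class names are the standard Kafka/Confluent serializers, and the localhost URLs are illustrative assumptions):

```clojure
;; Sketch of a plain Java-client-style producer config (illustrative values).
{"bootstrap.servers"      "localhost:9092"
 "key.serializer"         "org.apache.kafka.common.serialization.StringSerializer"
 "value.serializer"       "io.confluent.kafka.serializers.KafkaAvroSerializer"
 "schema.registry.url"    "http://localhost:8081"  ; assumed local registry
 "auto.register.schemas"  false
 "use.latest.version"     true}
```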

bringe20:09:31

I'm not sure, but I use serde-resolver like this

(def resolve-serde (serde-resolver :schema-registry-url ""))

(defstate test-producer
  "For testing sending data using an avro serde."
  :start (jc/producer producer-config
                      {:key-serde (string-serde)
                       :value-serde (resolve-serde
                                     {:serde-keyword :jackdaw.serdes.avro.confluent/serde
                                      :key? false
                                      :schema-filename "test-value-v1.json"})})
  :stop (.close test-producer))

bringe20:09:37

And that seems to work

bringe20:09:53

So, I see that static field you mentioned, AbstractKafkaSchemaSerDeConfig/SCHEMA_REGISTRY_URL_CONFIG, but when I try to import it, it says it does not exist o_O

bringe20:09:02

(:import [org.apache.kafka.clients.producer ProducerConfig]
         [io.confluent.kafka.serializers AbstractKafkaAvroSerDeConfig AbstractKafkaSchemaSerDeConfig])

bringe20:09:10

java.lang.ClassNotFoundException: io.confluent.kafka.serializers.AbstractKafkaSchemaSerDeConfig

bringe20:09:49

I'm on confluent platform 5.5.1 :thinking_face:

bringe20:09:03

I can just use the string, but that's odd
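For example, a sketch of the same config with raw key strings, so the missing class never needs importing (the string keys are what the static fields resolve to):

```clojure
;; Same producer config using raw config-key strings instead of static fields.
(def producer-config
  {"bootstrap.servers"     "localhost:9092"
   "enable.idempotence"    true
   "client.id"             "commander"
   "auto.register.schemas" false
   "use.latest.version"    true})
```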

gklijs20:09:04

The examples use Confluent Platform 5.5.1, so maybe you need to check whether the resolved dependency is really 5.5.1.. And 6.0.0 could be released any day now, or whenever it’s done.

👍 3
bringe01:09:18

Thanks for the help