
The way Avro works in the Confluent ecosystem is that an identifier representing the schema is registered with the Schema Registry and embedded in each message by the producer's Avro serializer. Consumers then use that identifier to look up the exact schema the producer used and deserialize the remainder of the message with it. That's why the schema you're providing is being ignored.
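The framing described above can be sketched language-agnostically. This is a minimal Python sketch of Confluent's standard wire format: one zero "magic" byte, a 4-byte big-endian schema ID (the identifier registered with the Schema Registry), then the Avro-encoded body:

```python
import struct

def parse_confluent_frame(msg: bytes):
    """Split a Confluent-framed Kafka message into (schema_id, avro_payload).

    Wire format: 1 magic byte (0x00), a 4-byte big-endian schema ID,
    then the Avro binary body. The consumer resolves the ID against the
    Schema Registry to get the writer's schema before decoding the body.
    """
    if not msg or msg[0] != 0:
        raise ValueError("not a Confluent-framed message")
    (schema_id,) = struct.unpack(">I", msg[1:5])
    return schema_id, msg[5:]

# Frame with schema ID 42 and a dummy payload standing in for Avro bytes:
frame = b"\x00" + struct.pack(">I", 42) + b"avro-bytes"
print(parse_confluent_frame(frame))  # (42, b'avro-bytes')
```

This is why the schema passed to the consumer-side serde is not what drives decoding: the writer's schema is recovered from the ID in the frame itself.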


@nicolas.estrada938 I think the problem is that you need to pass the serde options to the consumer constructor rather than to the call to subscribe.

Nicolas Estrada14:12:14

Yes, I finally got it to work by passing the :key-serde and :value-serde to the consumers... however, I fail to grasp how one can subscribe to multiple topics with multiple Avro value schemas using only one consumer. Thanks for your time 🙂


Well, yeah, the jackdaw Avro serde API makes that a bit tricky. I'd probably just use Confluent's generic Avro serde and then unpack the GenericData. Not the most efficient approach (it incurs a JSON serialization/deserialization round trip) but it plays a bit nicer with the Kafka/Confluent ecosystem imo.
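The "one consumer, many topics" idea above can be sketched as follows (a hypothetical Python sketch; the topic names and handlers are illustrative). A single generic deserializer decodes every record, and the application routes each decoded value to a per-topic handler:

```python
# Sketch: one consumer subscribed to several topics, one *generic* value
# deserializer. After decoding, records are dispatched by topic name to
# a handler that knows that topic's record shape.
def make_dispatcher(handlers):
    """Return a function routing (topic, decoded value) to a handler."""
    def dispatch(topic, value):
        handler = handlers.get(topic)
        if handler is None:
            raise KeyError(f"no handler registered for topic {topic!r}")
        return handler(value)
    return dispatch

# Hypothetical topics and record shapes:
dispatch = make_dispatcher({
    "users":  lambda v: ("user",  v["name"]),
    "orders": lambda v: ("order", v["id"]),
})
print(dispatch("users", {"name": "ada"}))  # ('user', 'ada')
```

The generic serde removes the need for one schema-specific serde per topic; the cost is that the application, not the serde, interprets each record's structure.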

Nicolas Estrada23:12:24

I checked out your repo and I guess it's a sensible tradeoff between performance and maintainability (i.e. not having to update your codec alongside the JSON one). However, have you considered using a different JSON library instead of clojure/data.json? I'm pretty sure that could help with serialization performance especially.
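The cost being traded here is roughly one JSON encode/decode round trip per record. A minimal, stdlib-only illustration of measuring that cost (the record shape is made up; absolute numbers depend on the machine and the JSON library in use):

```python
import json
import timeit

# A smallish record standing in for one decoded Avro message:
record = {"name": "ada", "id": 42, "tags": ["a", "b", "c"]}

def round_trip():
    """One JSON serialize/deserialize pass, the per-record overhead."""
    return json.loads(json.dumps(record))

# Time 10k round trips; a faster JSON implementation would be swapped in
# at this call site to shrink this number.
elapsed = timeit.timeit(round_trip, number=10_000)
print(f"{elapsed:.3f}s for 10k round trips")
```

A benchmark like this is the quickest way to check whether swapping the JSON implementation is worth the dependency.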


Oh yeah, I'd definitely accept a PR that switched to a better json impl 🙂


I just had a quick shot at working on this and it seems that tools.deps has changed in incompatible ways since the last time I touched the project. The build still passes in CircleCI, but that's because it pins a specific version of tools.deps. Will try to get it working again with the latest version.


Avro can be such a hassle (I maintain a Rust crate for Avro with Schema Registry support).


(i'm guilty of writing large chunks of each 😂)

Nicolas Estrada14:12:04

Cool I'll check it out. Thanks! 🙂