2022-06-17
Channels
- # announcements (4)
- # babashka (22)
- # beginners (50)
- # biff (2)
- # calva (4)
- # cider (18)
- # clj-kondo (12)
- # cljs-dev (3)
- # clojars (2)
- # clojure (66)
- # clojure-austin (1)
- # clojure-belgium (11)
- # clojure-europe (90)
- # clojure-hungary (5)
- # clojure-norway (6)
- # clojure-switzerland (1)
- # clojure-uk (4)
- # clojurescript (19)
- # datascript (2)
- # datomic (41)
- # fulcro (4)
- # gratitude (2)
- # helix (20)
- # jackdaw (2)
- # jobs (9)
- # jobs-discuss (38)
- # kaocha (1)
- # minecraft (8)
- # off-topic (11)
- # polylith (21)
- # rdf (2)
- # remote-jobs (1)
- # sci (1)
- # shadow-cljs (12)
- # specter (7)
- # tools-deps (16)
Hey all 🙂 Does Jackdaw have support for recursively defined Avro schemas? There is an example in the Avro spec of a linked list:
{
  "type": "record",
  "name": "LongList",
  "aliases": ["LinkedLongs"],                      // old name for this
  "fields" : [
    {"name": "value", "type": "long"},             // each element has a long
    {"name": "next", "type": ["null", "LongList"]} // optional next element
  ]
}
If I try something similar I get a stack overflow. I have hacked a quick repro for the record type into jackdaw.serdes.avro-test; I think I will fork and open an issue. In a similar vein, is there any option to swap out Jackdaw's Avro schema handling for the standard Apache/Confluent interop equivalent, or is it integral to Jackdaw's operation?
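For reference, the recursive schema itself is valid and parses fine with the stock Apache Avro Java parser, so the overflow appears to come from Jackdaw's schema-coercion layer rather than Avro. A minimal Clojure interop sketch (the namespace name is hypothetical; only standard org.apache.avro classes are used, no Jackdaw API):

```clojure
(ns avro-recursion-sketch
  (:import (org.apache.avro Schema$Parser)))

;; The LongList schema from the Avro spec as plain JSON
;; (the // comments removed, since the parser expects strict JSON).
(def long-list-json
  "{\"type\": \"record\",
    \"name\": \"LongList\",
    \"aliases\": [\"LinkedLongs\"],
    \"fields\": [
      {\"name\": \"value\", \"type\": \"long\"},
      {\"name\": \"next\", \"type\": [\"null\", \"LongList\"]}
    ]}")

;; Parse with the standard Avro parser; the self-reference resolves
;; because the parser tracks named types it has already seen.
(def long-list-schema
  (.parse (Schema$Parser.) ^String long-list-json))

(comment
  (.getFullName long-list-schema)
  ;; => "LongList"

  ;; The "next" field is a union whose second branch is the record itself:
  (-> long-list-schema (.getField "next") (.schema) (.getTypes)))
```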