This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2016-12-21
Channels
- # beginners (201)
- # boot (125)
- # cider (3)
- # cljs-dev (21)
- # cljsrn (165)
- # clojars (8)
- # clojure (332)
- # clojure-belgium (1)
- # clojure-gamedev (8)
- # clojure-russia (75)
- # clojure-spec (25)
- # clojure-uk (96)
- # clojurebridge (2)
- # clojurescript (130)
- # code-reviews (16)
- # cursive (26)
- # datomic (20)
- # devops (6)
- # emacs (6)
- # hoplon (90)
- # jobs (9)
- # luminus (2)
- # off-topic (4)
- # om (65)
- # onyx (5)
- # pedestal (4)
- # protorepl (6)
- # re-frame (34)
- # reagent (12)
- # ring (4)
- # ring-swagger (7)
- # specter (2)
- # test-check (8)
- # untangled (2)
- # vim (1)
- # yada (6)
I sent a malformed (truncated) JSON message on Kafka, and onyx-kafka retried aggressively for what seemed like forever. Is it the user's responsibility to never fail when deserializing a segment? Can I configure the plugin to skip malformed inputs?
@yonatanel: yeah, that is the user's responsibility. You can either put a lifecycle in that will kill the job if it crashes with a deserialisation exception, or catch exceptions in the deserialiser and drop the message, but it's up to you to decide how to handle it
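Both options described above can be sketched roughly as follows. This is a hedged sketch, not a definitive implementation: it assumes Onyx's `:lifecycle/handle-exception` hook and a JSON deserializer function wired into the onyx-kafka catalog entry; the namespace, task name, and `safe-deserialize` helper are illustrative, and the exact plugin option names should be checked against your onyx-kafka version.

```clojure
;; Sketch only: assumes Onyx's :lifecycle/handle-exception hook and a
;; user-supplied deserializer fn for onyx-kafka. Names are illustrative.
(ns my.app.kafka
  (:require [cheshire.core :as json]))

;; Option 1: make the deserializer tolerant, so malformed input is
;; dropped instead of crashing the task.
(defn safe-deserialize [^bytes bs]
  (try
    (json/parse-string (String. bs "UTF-8") true)
    (catch Exception _
      ;; nil (or a sentinel map) marks a segment you can filter out
      ;; downstream with a flow condition
      nil)))

;; Option 2: a lifecycle that kills the job when a task throws,
;; e.g. on a deserialisation exception.
(defn kill-on-error [event lifecycle lifecycle-name exception]
  :kill)

(def error-calls
  {:lifecycle/handle-exception kill-on-error})

(def lifecycles
  [{:lifecycle/task :read-messages        ;; illustrative task name
    :lifecycle/calls :my.app.kafka/error-calls}])
```

The trade-off is the one the answer names: option 1 silently drops bad records (you may want to log or dead-letter them instead of returning nil), while option 2 fails loudly and stops the job.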
I have a relatively basic question. How does :done work within a workflow? The general idea is that it causes the flow to terminate, but how exactly does that work when flow predicates and multiple outputs are involved? Or does :done really only apply to take-segments! and core.async?
It’s used as a signal to the job that the inputs have been fully read. Once each input has read the :done sentinel, it finishes flushing its segments until all of them are acked. Once all of the inputs have completely flushed and acked, the job is considered complete and it stops.
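For the core.async case the question mentions, that sequence can be sketched like this. This is an assumption-laden sketch based on the standard onyx core.async plugin helpers: the channel names `in-chan` and `out-chan` are illustrative, and the exact `take-segments!` arity may differ between Onyx versions.

```clojure
;; Sketch, assuming the onyx.plugin.core-async helpers; channel names
;; (in-chan, out-chan) are illustrative and would be bound elsewhere.
(require '[clojure.core.async :refer [>!! close!]]
         '[onyx.plugin.core-async :refer [take-segments!]])

;; Feed input segments, then the :done sentinel, then close the channel.
(doseq [segment [{:n 1} {:n 2} {:n 3}]]
  (>!! in-chan segment))
(>!! in-chan :done)   ;; signals to the job that input is fully read
(close! in-chan)

;; take-segments! blocks until :done appears on the output channel,
;; i.e. until the job has flushed and acked everything.
(def results (take-segments! out-chan))
```

So :done is not a flow-condition concern: flow predicates route ordinary segments, while the sentinel propagates completion regardless of which outputs a given segment was routed to.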