This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2017-06-06
Channels
- # aleph (15)
- # beginners (40)
- # boot (14)
- # cider (90)
- # cljs-dev (132)
- # cljsrn (25)
- # clojars (7)
- # clojure (188)
- # clojure-chicago (4)
- # clojure-dusseldorf (1)
- # clojure-greece (9)
- # clojure-italy (43)
- # clojure-russia (16)
- # clojure-sg (7)
- # clojure-spec (39)
- # clojure-uk (81)
- # clojurescript (170)
- # component (5)
- # core-async (7)
- # cursive (49)
- # data-science (65)
- # datascript (3)
- # datomic (27)
- # graphql (3)
- # hoplon (4)
- # instaparse (56)
- # klipse (129)
- # leiningen (1)
- # lumo (28)
- # off-topic (4)
- # om (15)
- # onyx (54)
- # overtone (7)
- # pedestal (7)
- # re-frame (9)
- # reagent (72)
- # ring (33)
- # ring-swagger (2)
- # spacemacs (1)
- # untangled (19)
- # vim (2)
- # yada (12)
anyone know if it is possible to pass a schema as a fn param for a task? or a record? {... :my/param Foo :onyx/params [:my/param]}
I keep getting a java.lang.UnsupportedOperationException: nth not supported on this type: Symbol
I didn't see any limitations in the docs, but maybe I'm looking in the wrong place
Yes, I’d have thought that would work. What does your function look like?
(defn validate-segment [schema segment] (assoc segment ::valid? (boolean (not (s/check schema segment)))))
where schema would be Foo in the example I posted above
that looks right
what’s the stack trace look like
it might be coming from somewhere I don’t expect it
I'm not getting a stack trace; this is being thrown from some tests, and that's the only output I'm getting from cider-test-run-test.
I haven't really started digging into it yet. I just wanted to make sure I wasn't chasing something that definitely wouldn't work before I started digging in.
Sure. I don’t see why it wouldn’t work, but I might be missing something
can you try setting :my/param to 3 and then checking whether the 3 flows through?
strange though: if I change the param to a scalar, everything works; it only fails when I pass in a schema def or record. I took a peek at the code that builds the partial and I don't see anything obvious that would prevent that from working
ha, yeah I did try that and it does work.
very weird
I'll do some more substantial debugging and post anything surprising in here
thanks for the help
no worries, good luck.
maybe it just serialized your Foo schema as a symbol
rather than serializing the schema (which may not even work via a task map)
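If that's the case, one workaround is to keep the task map EDN-serializable by passing a keyword and resolving it to the real schema inside the task function. A minimal sketch, where the `schemas` registry and the `Foo` schema are hypothetical names, not part of Onyx:

```clojure
;; Sketch of a workaround: keep the task map EDN-serializable by passing a
;; keyword, and look up the actual schema inside the task function.
;; `Foo` and `schemas` are hypothetical names, not part of Onyx.
(require '[schema.core :as s])

(def Foo {:name s/Str})

(def schemas {:schemas/foo Foo})

(defn validate-segment [schema-key segment]
  (let [schema (get schemas schema-key)]
    (assoc segment ::valid? (boolean (not (s/check schema segment))))))

;; the task map entry would then be:
;; {:my/param    :schemas/foo
;;  :onyx/params [:my/param]}
```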
hm. given the error I'm getting that makes sense.
hmm i'm trying to figure out why the zookeeper instance that onyx is using is constantly failing with OOM errors -- it appears as if onyx is putting quite a lot of state in zookeeper.
what kind of requirements can I expect? i'm currently -Xmx'ing zookeeper to 8GB, and apparently that's not enough
(before I dive into the GC logs a bit deeper to figure out the state of the java heap before it crashes, I'm trying to figure out what exactly would be reasonable and what not)
We've never given zookeeper instances more than a few gigs. Are your job definitions very big/contain a lot of data? Are you using the zookeeper state backend or s3?
Right, that would make a lot of sense then. There's no gc mechanism for state currently, and the zookeeper state backend is really only intended for minimal state or test environments
@lmergen Another thing to consider is that if you’re running in a container, the JVM defaults to 1/4 of the host memory, not the container memory limit.
@gardnervickers yeah i'm aware of that, that's why i'm running -Xms and -Xmx
Ah cool ok, just making sure as it’s been the cause of problems for quite a few people in the past.
i do believe in the theory that this is probably ABS snapshotting large window function state into zookeeper
@lmergen That should do the trick. We’re stashing fairly large windows in S3 in production.
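For reference, switching checkpoint storage from ZooKeeper to S3 is a peer-config change along these lines; the key names should be verified against the docs for the Onyx version in use, and the bucket and region here are placeholders:

```clojure
;; Peer-config sketch for S3 checkpoint storage. Verify these keys against
;; your Onyx version's docs; the bucket and region are placeholders.
{:onyx.peer/storage :s3
 :onyx.peer/storage.s3.bucket "my-checkpoint-bucket"
 :onyx.peer/storage.s3.region "us-east-1"}
```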
I have a question about error handling in Onyx
My job has three different tasks. If an error occurs in any of those tasks, I want to send the error message to an error task that I have
Do I need to have mappings in :workflow for each mapping from my tasks to my error task?
@stephenmhopper yes, you’ll need edges from each of your tasks to the error tasks, then you can use flow conditions to selectively emit to them
The flow condition map looks something like this:
{:flow/from source-task
:flow/to [:error-task]
:flow/predicate ::constantly-true
:flow/short-circuit? true
:flow/thrown-exception? true}
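The ::constantly-true predicate referenced there is just an ordinary flow-condition predicate that always matches; a sketch:

```clojure
;; A flow-condition predicate takes the event map, the old segment, the
;; thrown exception (populated when :flow/thrown-exception? is true), and
;; the new segment(s). This one matches every segment.
(defn constantly-true [event old-segment ex-obj all-new]
  true)
```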
Right now, I have the mappings defined in the :workflow, but all of the messages are being handled by my error task. Do I need to update the predicate function to check and see if it’s an error before I do anything?
Yeah, currently you need one flow condition to route to the error task, which checks whether it’s an exception
and then you need another to route to the other task the rest of the time
Unfortunately flow conditions do not make this as easy as they should. We have an outstanding issue to improve it.
hm, okay, it looks like it’s still not quite working
I’ve updated the :flow/predicate to be ::handle-error?, where handle-error? is:
(defn handle-error? [event old-segment ex-obj all-new]
(instance? java.lang.Exception ex-obj))
But all of the segments are getting through regardless of what that function returns. Any ideas on what I’m missing?
ah, okay, I added the other flow-condition thing like you mentioned (basically (complement handle-error?)) and now it works
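Putting the two halves together, the working pair of flow conditions looks roughly like this (the task names :my-task, :next-task, and :error-task are placeholders):

```clojure
;; Two complementary flow conditions: exceptions route to the error task,
;; everything else continues downstream. Task names are placeholders.
(defn handle-error? [event old-segment ex-obj all-new]
  (instance? java.lang.Exception ex-obj))

(def not-error? (complement handle-error?))

(def flow-conditions
  [{:flow/from :my-task
    :flow/to [:error-task]
    :flow/predicate ::handle-error?
    :flow/short-circuit? true
    :flow/thrown-exception? true}
   {:flow/from :my-task
    :flow/to [:next-task]
    :flow/predicate ::not-error?}])
```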
Cool. Glad to hear it. It’s not so intuitive - we could do a lot better here.
Yeah, thanks for the assistance. So, going forward, should I just plan on making two flow conditions any time I need flow conditions--1 for doing the thing I want, and the other for doing the complement of the thing I want?
For now, yes. We’ve pulled it into a helper function that we’ll be adding to flow conditions
Cool, thank you!