2015-12-04
Channels
- # admin-announcements (6)
- # alda (1)
- # announcements (3)
- # aws (4)
- # beginners (233)
- # boot (82)
- # cider (11)
- # cljsjs (21)
- # cljsrn (7)
- # clojure (262)
- # clojure-japan (1)
- # clojure-russia (129)
- # clojure-sg (2)
- # clojure-taiwan (2)
- # clojurecup (23)
- # clojurescript (162)
- # clojurex (3)
- # core-async (18)
- # cursive (81)
- # datavis (183)
- # datomic (23)
- # emacs (2)
- # funcool (25)
- # ldnclj (82)
- # lein-figwheel (3)
- # om (196)
- # onyx (74)
- # parinfer (7)
- # portland-or (12)
- # re-frame (60)
- # reagent (48)
- # slack-help (1)
- # yada (9)
@michaeldrogalis: nice - is that focusing more on the EL, or is there going to be something for the T bit too?
@mccraigmccraig: Will add more T as needed. Need to be careful not to try to rebuild Onyx into a tool, so erring on the side of less transformational power.
I see questions about this sql->datomic all the time.
@lucasbradstreet: sorry to bother you, but I'm trying to debug some metrics issues we're having. The :complete-latency metrics don't seem to be sending at all, though all the others are. This seems to be the case for the Timbre appender too (not just the custom writer I'm working on). These never seem to happen: https://github.com/onyx-platform/onyx-metrics/blob/0.8.x/src/onyx/lifecycle/metrics/metrics.clj#L104-L136 . I'm baffled, because I'm doing an (info ...) in there to check, and they are outputting metrics; it's like they are being dropped by the channel before they can be taken off the other side. I tried limiting the metrics to just one task, because I figured maybe the dropping buffer was the culprit, but still nothing comes through. I'm quite baffled. 😐
Are you using the latest version?
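(A minimal core.async sketch of the dropping-buffer behavior suspected above; this is illustrative only, not the onyx-metrics internals. Puts onto a channel backed by a full dropping buffer still succeed, but the values are silently discarded.)

```clojure
(require '[clojure.core.async :as a])

;; Illustrative only: a channel backed by a dropping buffer accepts
;; puts when full but silently drops them, with no error anywhere.
(let [ch (a/chan (a/dropping-buffer 2))]
  (a/put! ch :m1)
  (a/put! ch :m2)
  (a/put! ch :m3)          ; buffer full: :m3 is dropped without error
  (println (a/<!! ch))     ; => :m1
  (println (a/<!! ch))     ; => :m2
  (println (a/poll! ch)))  ; => nil — :m3 never arrived
```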
Actually I lie
It was throughput
Are the metrics on an input task?
And only that metric?
Do you get retries?
That is super weird
I assume your test is long enough to run for 10 seconds?
I'm out grabbing some food but can help when I get back
It seems to pass the event... err, test:
https://github.com/onyx-platform/onyx-metrics/blob/0.8.x/test/onyx/metrics/send_test.clj
That checks whether events with the complete latency tags were sent
Hmm. Are you getting non zero retries?
Maybe it's not completing any messages, so it's always nil and never puts the event.
Ah yes, the log entries you posted have :value nil.
Hmm. Although we do still seem to put them on the channel
Oh yeah, but there are entries that aren't nil, and they still don't get sent. Also, there doesn't appear to be anything stopping it from putting if the value is nil? Actually, the read-log (an input task) doesn't seem to be working at all: it outputs throughput and batch once, and after that it doesn't do anything.
Yeah, that was a false lead.
Try running the send test with your logging still in there
(the above values are being logged from the send-fn thread, but notice there's still no complete latency)
Yeah, it should be with the retry rate.
Actually, the retry rate should be within the when that checks if it's an input task.
Though it should still log every second.
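(A rough structural sketch of what's being described; the function and key names are assumed for illustration, not copied from onyx-metrics. Input-task-only metrics live inside the input? check, and note that complete latency still gets put on the channel even when its value is nil, matching the log entries above.)

```clojure
(require '[clojure.core.async :as a])

;; Assumed shape, for illustration only.
(defn sample-metrics! [ch {:keys [input? throughput retries complete-latency]}]
  (a/>!! ch {:metric :throughput :value throughput})
  (when input?
    (a/>!! ch {:metric :retry-rate :value retries})   ; logged every second
    (a/>!! ch {:metric :complete-latency              ; may carry :value nil
               :value complete-latency})))
```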
Checking the logs with grep, not eyeballing, right?
It's a real puzzler
Nothing in onyx.log?
That's OK because the test only lasts around 10s.
Retry is sampled every second. Complete latency every 10s
My current suspicion is that it is being output in your main code, but you only see it 1/10th as often as a retry, so you may be missing it.
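(A self-contained illustration of the sampling cadence just described, with stubbed sample fns; the loop structure is assumed, not the actual onyx-metrics implementation. Retries appear every tick, complete latency only every tenth, which is why it's easy to miss when eyeballing.)

```clojure
;; Stubs for illustration only.
(defn sample-retries [] (rand-int 5))
(defn sample-complete-latency [] (rand 100.0))

;; Retries logged every second; complete latency only every 10th tick.
(doseq [tick (range 30)]
  (Thread/sleep 1000)
  (println :retry-rate (sample-retries))
  (when (zero? (mod tick 10))
    (println :complete-latency (sample-complete-latency))))
```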
You're right, the problem seems to be in my code. It looks like it's caused by a silent assertion error, which doesn't pop up anywhere, in one of my fns that transforms the metric into a statsd-compatible one.
OK, I was going to say you may want to be careful to catch exceptions in your future thread.
But you said it didn't happen with the Timbre appender, which confused matters :)
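(A minimal demonstration of the failure mode under discussion: an exception thrown inside a Clojure future is held until the future is dereferenced, so an assertion error in a fire-and-forget metrics thread vanishes silently unless you catch and log it yourself.)

```clojure
;; Dies silently: the AssertionError is only surfaced on deref,
;; which a fire-and-forget metrics thread never does.
(future
  (assert false "metric transform failed"))

;; Catching Throwable inside the future makes the failure visible.
(future
  (try
    (assert false "metric transform failed")
    (catch Throwable t
      (println "metrics sender died:" (.getMessage t)))))
```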
Yeah, I missed it because it only happens 1/10th of the time, like you said. Lesson learned: always grep 😛 or log so that only that metric can appear.
Glad it’s working, and looking forward to the PR :D
Happy to PR if you want to add built-in metrics for Datadog-flavoured statsd? Not sure if you guys want that 😛 If you don't, we'd probably release it as a standalone onyx-metrics plugin.
@greywolve: I think the thing to do is omit sending the metric. Most of these dashboards have configuration that lets you pick what to display when a metric is missing for a time interval. Correct me if I'm wrong though.
I would say “no value” for completion latency, not 0
It really depends on the metrics you’re calculating.
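(A hypothetical sketch of the suggestion above; send-statsd! is an assumed helper, not a real API. The point is the guard: skip the send entirely when there's no completion-latency sample, rather than reporting a misleading 0.)

```clojure
;; send-statsd! is hypothetical; only the guard matters here.
;; When there's no sample this interval, send nothing and let the
;; dashboard render the gap, instead of reporting a misleading 0.
(defn report-complete-latency! [send-statsd! latency]
  (when (some? latency)
    (send-statsd! "onyx.complete-latency" latency)))
```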