This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2022-02-18
Channels
- # announcements (5)
- # aws (4)
- # babashka (30)
- # beginners (90)
- # calva (31)
- # clj-on-windows (16)
- # clojure (110)
- # clojure-dev (10)
- # clojure-europe (26)
- # clojure-nl (1)
- # clojure-norway (20)
- # clojure-spec (25)
- # clojure-uk (15)
- # clojured (2)
- # clojurescript (12)
- # code-reviews (2)
- # community-development (3)
- # conjure (14)
- # datomic (15)
- # defnpodcast (2)
- # events (1)
- # fulcro (17)
- # graalvm (8)
- # gratitude (1)
- # introduce-yourself (2)
- # jobs-discuss (7)
- # kaocha (6)
- # lsp (9)
- # luminus (5)
- # nextjournal (7)
- # observability (9)
- # off-topic (71)
- # portal (5)
- # practicalli (1)
- # rdf (21)
- # re-frame (15)
- # releases (1)
- # shadow-cljs (24)
- # testing (7)
- # tools-build (13)
- # tools-deps (14)
- # xtdb (7)
For my understanding: when I see that the data part of log messages like (log/error "Foo" {:bar "baz"}) is not parsed in Elasticsearch but shows up only as a string, does that mean the setup is wrong? Should I work on the parsing config in e.g. Logstash? Or is this just the way it is?
I never used Logstash, but I have worked with fluentd, and yes - you have to set up a parser on the Logstash side, and/or emit JSON from your application logs
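For the Logstash-side option, a minimal sketch of a filter using Logstash's `json` filter plugin, assuming the application writes one JSON object per log line into the `message` field (field names here are illustrative):

```
filter {
  json {
    source => "message"
  }
}
```

With this in place, the keys of the data map (e.g. `:bar`) arrive in Elasticsearch as individual fields instead of one flat string.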
ok, actually it seems we are using fluentd at my new job as well...will look into that. Thanks!
we're prepping a POC with http://vector.dev
good to know...I would love to use ulog for application logs anyway, so that there is no need to go through fluentd or something similar anymore
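Assuming the "ulog" mentioned here is μ/log (`com.brunobonacci/mulog`), a minimal sketch of emitting structured events directly from the application, so log data stays a map end to end instead of being flattened into a string (the event name and keys below are illustrative):

```clojure
(ns example.logging
  (:require [com.brunobonacci.mulog :as u]))

;; start a publisher; μ/log also ships publishers that send events
;; straight to backends such as Elasticsearch, bypassing fluentd
(u/start-publisher! {:type :console})

;; events are plain data: an event name followed by key-value pairs
(u/log ::order-placed :bar "baz")
```

Because events are published as data, a downstream store can index each key as its own field without any parsing step.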