2018-10-24
Channels
- # announcements (1)
- # aws (2)
- # beginners (147)
- # boot (19)
- # cider (57)
- # clara (52)
- # cljdoc (18)
- # cljs-dev (14)
- # cljsrn (4)
- # clojure (176)
- # clojure-conj (9)
- # clojure-dev (9)
- # clojure-germany (2)
- # clojure-italy (4)
- # clojure-spec (13)
- # clojure-uk (56)
- # clojurescript (72)
- # code-reviews (11)
- # cursive (17)
- # data-science (1)
- # datomic (52)
- # duct (26)
- # emacs (6)
- # events (9)
- # figwheel (1)
- # figwheel-main (21)
- # fulcro (132)
- # funcool (1)
- # graphql (3)
- # jobs-discuss (42)
- # leiningen (3)
- # luminus (45)
- # mount (10)
- # off-topic (2)
- # re-frame (17)
- # reagent (12)
- # reitit (20)
- # ring-swagger (7)
- # rum (3)
- # shadow-cljs (256)
- # slack-help (15)
- # sql (7)
- # tools-deps (50)
- # uncomplicate (1)
- # yada (9)
Morning
morning
Good morning
Quick question... Does anyone here have any experience with using Transit and Postgres's JSON support? As in, does anyone know how compatible they might be? Google doesn't seem to find me anything about Postgres, JSON AND Transit together, just Postgres and JSON. This may mean no one has ever used these three things together, but that seems implausible...
@maleghast are you intending to use Transit to write to Postgres, or are you doing some data processing in Clojure first? Postgres+JSON seems pretty straightforward http://www.postgresqltutorial.com/postgresql-json/ although you may want to search for issues specifically for Postgres and JSON. Transit does a good job of preserving the JSON structure, so I assume there should be no issues (good tests may show otherwise though)
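For reference, a minimal sketch of a Transit round trip to a JSON string, assuming com.cognitect/transit-clj is on the classpath - the data is made up, and the output shown in the comments is roughly what Transit-over-JSON produces:
```clojure
(require '[cognitect.transit :as transit])
(import '[java.io ByteArrayOutputStream ByteArrayInputStream])

(defn write-transit-str
  "Encode a Clojure value as a Transit-over-JSON string."
  [value]
  (let [out (ByteArrayOutputStream.)]
    (transit/write (transit/writer out :json) value)
    (.toString out "UTF-8")))

(defn read-transit-str
  "Decode a Transit-over-JSON string back into a Clojure value."
  [s]
  (transit/read (transit/reader (ByteArrayInputStream. (.getBytes s "UTF-8")) :json)))

(write-transit-str {:id 1 :tags #{:a :b}})
;; => something like ["^ ","~:id",1,"~:tags",["~#set",["~:a","~:b"]]]
;; maps become arrays and keywords become tagged strings, which is worth
;; knowing before pointing jsonb operators at the stored value

(read-transit-str (write-transit-str {:id 1 :tags #{:a :b}}))
;; => {:id 1, :tags #{:a :b}}
```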
måning
Be careful with pg and json - the perf profile of using it for operational stuff has become a bottleneck on one of the teams I'm on. Granted, I'm no expert, so maybe there's a way of fixing that with up-front design... which was not done by the original implementers in this case
Also, morning from France, heh
> Be careful with pg and json yeah I would be worried about that (although the data guy who advocated it was trying hard to convince me that the newer versions handle it well, so… maybe it's version dependent?)
What we found is that it's very easy to trigger full scans if your data is badly modelled, so it really depends on the shape of the data and the access pattern, I think?
you can index jsonb fields to avoid that, can't you?
> were people attempting to filter or order based on a sub-field of the json object? This combined with a polymorphic data shape iirc
@mccraigmccraig yes, you can now create indices on fields within JSON in Postgres
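A sketch of the kind of jsonb indexing being referred to, run from Clojure via clojure.java.jdbc; the db-spec, the `events` table and its `doc` jsonb column are hypothetical:
```clojure
(require '[clojure.java.jdbc :as jdbc])

(def db-spec {:dbtype "postgresql" :dbname "example" :user "example"})

;; GIN index over the whole document - supports containment queries with @>
(jdbc/execute! db-spec
  ["CREATE INDEX events_doc_gin ON events USING GIN (doc jsonb_path_ops)"])

;; expression index on one sub-field - supports equality on doc->>'type'
(jdbc/execute! db-spec
  ["CREATE INDEX events_doc_type ON events ((doc->>'type'))"])

;; the query has to match the indexed operator/expression, otherwise you are
;; back to the full scans mentioned above
(jdbc/query db-spec
  ["SELECT id FROM events WHERE doc @> ?::jsonb" "{\"type\":\"login\"}"])
```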
Ah, I am using it as a document store that I _can_ query deep into the structure if I have to, but not for operational purposes. I use other fields in the table to store ids and other indexable stuff for selection queries - a kind of hybrid approach, so that I don’t have to be frightened of the downsides of a pure document database
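Roughly what that hybrid layout could look like - the table and column names are made up for illustration, again via clojure.java.jdbc:
```clojure
(require '[clojure.java.jdbc :as jdbc])

(def db-spec {:dbtype "postgresql" :dbname "example" :user "example"})

;; plain, indexable columns carry the selection criteria; the full document
;; sits alongside them in a jsonb column
(jdbc/execute! db-spec
  ["CREATE TABLE documents (
      id          uuid PRIMARY KEY,
      external_id text NOT NULL,
      created_at  timestamptz NOT NULL DEFAULT now(),
      doc         jsonb NOT NULL)"])

(jdbc/execute! db-spec
  ["CREATE INDEX documents_external_id ON documents (external_id)"])

;; select on the plain columns, then dig into doc as needed
(jdbc/query db-spec
  ["SELECT doc FROM documents WHERE external_id = ?" "order-1234"])
```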
But all of this is genuinely interesting and worthwhile “watch out for” stuff too, make no mistake ;-)
What was your use case?
no laptop - nice! i currently have to reject holiday locations which don't have decent wifi or 4g 😬
Is a holiday still a holiday if ur taking ur laptop/whatever ur connecting to the internet with :thinking_face:
@jr0cket - Thanks! Turns out that we can use Transit for our use-case but thanks for the answer :-)
Transit is pretty cool. I do wonder how easy it would be to store EDN in Postgres natively. I assume this would need some Postgres experience
@jr0cket - Well... You could store EDN in a TEXT field and take my approach of using other columns for values that you might want to query on.
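A sketch of that TEXT-column idea, with a hypothetical `docs` table; pr-str and clojure.edn handle the round trip, and Postgres only ever sees an opaque string:
```clojure
(require '[clojure.java.jdbc :as jdbc]
         '[clojure.edn :as edn])

(def db-spec {:dbtype "postgresql" :dbname "example" :user "example"})

(defn store-edn!
  "Write a Clojure value into the text column as an EDN string."
  [value]
  (jdbc/insert! db-spec :docs {:edn (pr-str value)}))

(defn load-edn
  "Read a row back and parse the EDN string into a Clojure value."
  [id]
  (some-> (jdbc/query db-spec ["SELECT edn FROM docs WHERE id = ?" id])
          first
          :edn
          edn/read-string))

;; no deep querying or jsonb-style indexing here - selection has to go
;; through the other columns, as described above
(store-edn! {:user/id 42 :user/tags #{:admin}})
```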
But the PostgreSQL implementation of JSONB is not EDN compatible... What would be GREAT is "someone" creating a PostgreSQL extension to treat EDN as the same kind of resource as JSON when you use the JSONB field type.
And it's all open source, right? So we just need to find someone smart enough and then - gravy!
with "the PostgreSQL implementation of JSONB is not EDN compatible" dyu mean that EDN's encoding in JSON makes JSONB indexes not very useful, or that you just can't store EDN->JSON in JSONB columns for some reason @maleghast?
The former - as far as I can tell, EDN's encoding means that the advantages of JSONB columns, like the indexing etc., will not work.
I think that JSONB columns would actually reject EDN that is not well-formed JSON, however
i would expect that - was just wondering if there were limits imposed on the JSON stored by JSONB, but you didn't mean that
Of course I've not tested that, but I think that the column-type validates the inbound JSON, in order to allow the deep-querying and indexing, and therefore most EDN would not pass the validation.
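One quick way to test that assumption is to cast string literals to jsonb - the db-spec is hypothetical, and the error text is roughly what Postgres reports:
```clojure
(require '[clojure.java.jdbc :as jdbc])

(def db-spec {:dbtype "postgresql" :dbname "example" :user "example"})

;; well-formed JSON is accepted by the jsonb cast
(jdbc/query db-spec ["SELECT '{\"a\": 1}'::jsonb AS doc"])

;; raw EDN is not valid JSON, so the cast throws something like
;; PSQLException: invalid input syntax for type json
(jdbc/query db-spec ["SELECT '{:a 1}'::jsonb AS doc"])
```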
oh, hold on, i'm tired and confused... i'm confusing transit (which has a json encoding) with edn (which is its own encoding) ... ignore everything i have said for the last 10 mins
@mccraigmccraig - OK, sure 🙂
Morning 🙂
Clojure workshops update: we are planning to run the following workshops: "Building Clojure apps with Kafka" and "clojure.spec from the ground up". Thoughts welcome. https://www.meetup.com/London-Clojurians/events/254693569/
The workshop presenter is using Kafka for financial services. Funding Circle sound like they are doing some interesting things with Kafka too, using multiple topics as part of a data transformation pipeline.
i'd have liked to go, but weekends are sacrosanct family time for me atm
I'll be having a mini breakdown on the Sunday that my slides aren't good enough, I'm not funny enough and my ego is too big for the room and all that stuff. Just a normal Sunday. 🙂
take friday off work... it's the only option 😁
you look smarter than everyone else, so no one will notice you only have the one suit