This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2020-03-09
Channels
- # aws (51)
- # beginners (57)
- # calva (10)
- # chlorine-clover (7)
- # cider (20)
- # clj-kondo (55)
- # clojure (43)
- # clojure-europe (9)
- # clojure-italy (1)
- # clojure-nl (5)
- # clojure-spec (8)
- # clojure-uk (71)
- # clojurescript (33)
- # core-async (22)
- # cursive (20)
- # datomic (3)
- # emacs (8)
- # figwheel-main (8)
- # fulcro (13)
- # garden (2)
- # graalvm (60)
- # graphql (26)
- # jobs (6)
- # joker (6)
- # kaocha (2)
- # lambdaisland (5)
- # malli (36)
- # off-topic (9)
- # portkey (15)
- # re-frame (3)
- # reagent (25)
- # remote-jobs (4)
- # spacemacs (3)
- # sql (111)
- # tree-sitter (29)
- # uncomplicate (3)
- # xtdb (2)
https://github.com/seancorfield/next-jdbc/blob/master/doc/getting-started.md#plan--reducing-result-sets /cc @dharrigan 🙂
(postgres/query
  ["SELECT * FROM shops WHERE uuid = ?::uuid" uuid]
  {:qualifier "shop"})
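For context, the :qualifier option here presumably passes through to clojure.java.jdbc's option of the same name, which prefixes every result key with a fixed namespace. A minimal sketch, assuming a `db` spec and the shops table (the `postgres/query` wrapper above is this user's own helper):

```clojure
;; sketch, assuming a `db` spec and a `shops` table;
;; :qualifier is a real clojure.java.jdbc query option
(require '[clojure.java.jdbc :as j])

;; without :qualifier, rows come back as {:id 1, :engine "foo"}
;; with {:qualifier "shop"}, every key gets that namespace:
(j/query db ["SELECT id, engine FROM shops"] {:qualifier "shop"})
;; rows look like {:shop/id 1, :shop/engine "foo"}
```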
Do you work with :qualifier? I am trying to decide whether I want it or not.
It looks like it improves the readability of the data, but on the other hand things like GraphQL don't support namespaced keywords, so it makes things more complex at the same time.
What is your experience? Is it worth using? Namespaced data structures in the system vs unnamespaced?
{:user {:id 1
        :email ""}
 :shops [{:id 1
          :engine "foo"}]}
vs
{:user/id 1
 :user/email ""
 :shops [{:user/id 1
          :shop/id 1
          :shop/engine "foo"}]}
or something like that, but for example only the shop is returned
The example is bad, but more and more I am thinking about this… When does it make sense to use :qualifier?
oh there is a new library: https://github.com/seancorfield/next-jdbc
No, for example, if you want to support JSONB there is a nice section here on how to get next jdbc to work with it: https://cljdoc.org/d/seancorfield/next.jdbc/1.0.395/doc/getting-started/tips-tricks
(extend-protocol prepare/SettableParameter
  clojure.lang.IPersistentMap
  (set-parameter [m s i]
    (.setObject s i (->pgobject m))))
  clojure.lang.IPersistentVector
  (set-parameter [v s i]
    (.setObject s i (->pgobject v))))
vs
(extend-protocol prepare/SettableParameter
  clojure.lang.IPersistentMap
  (set-parameter [m s i]
    (.setObject s i (->pgobject m)))
  clojure.lang.IPersistentVector
  (set-parameter [v s i]
    (.setObject s i (->pgobject v))))
There is a bug in the doc, additional )
@kwladyka Thanks for spotting that! Fixed on master.
still thinking about these namespaced keywords like :user/id in the context of GraphQL etc. What do you think about this?
(get-by-id :shops #uuid "00000000-0000-0000-0000-000000000000" :uuid {})
Is there a way to make it work without #uuid?
Well yes, but what I mean is that namespaced keywords from PostgreSQL don't work with other systems which don't expect namespaced keywords. Mainly in APIs, I guess. How do you deal with this?
(-> (shop-db/get-shop-by-uuid uuid)
    (clojure.walk/stringify-keys)
    (clojure.walk/keywordize-keys))
each time?
by passing this {:builder-fn rs/as-unqualified-lower-maps} into the opts of the jdbc invocation
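A minimal sketch of that, assuming a `db` datasource and the shops table from the earlier example; `rs/as-unqualified-lower-maps` is a real next.jdbc result-set builder:

```clojure
(require '[next.jdbc :as jdbc]
         '[next.jdbc.result-set :as rs])

;; as-unqualified-lower-maps builds rows with plain, lower-case keys
;; instead of table-qualified ones
(jdbc/execute! db
               ["SELECT id, engine FROM shops WHERE id = ?" 1]
               {:builder-fn rs/as-unqualified-lower-maps})
;; rows look like {:id 1, :engine "foo"}
;; rather than    {:shops/id 1, :shops/engine "foo"}
```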
yes, but maybe I'm missing how this should be solved in an API like GraphQL. Maybe this is fine, but I have to see this from a different perspective. Or I just shouldn't use namespaced keywords 🙂
You can strip the namespace programmatically if you wish: (into {} (map (fn [[k v]] [(keyword (name k)) v])) m)
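Wrapped in a helper, that one-liner looks like this (the `unqualify` name is just for illustration):

```clojure
;; strip the namespace from every key in a map
(defn unqualify [m]
  (into {} (map (fn [[k v]] [(keyword (name k)) v])) m))

(unqualify {:shop/id 1 :shop/engine "foo"})
;; => {:id 1, :engine "foo"}
```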
Re: qualified keys in SQL results. I try to use the qualified keys as much as possible, unless I have very specific situations where I explicitly do not want them. For APIs, data will come in unqualified (usually) and that can be validated and inserted as-is -- the logic in both c.j.j and next.jdbc uses just the name portion of keywords for inserts etc. For returning output, I generally try to coerce the SQL results to plain hash maps anyway because it's rare (in my experience) that you'll be passing the entire result set back to whatever client is consuming that data (outside your code), so that explicit coercion can do the unqualify operation. I think JSON libraries in general will also strip the qualifier when converting from hash maps to JSON but I can't say I've paid much attention to that.
It's worth noting that if you use plan (in next.jdbc, or reducible-query in c.j.j) then you will likely be accessing columns via their plain label anyway and not actually producing full-on Clojure hash maps, so you can avoid the whole issue. See the link to the updated docs I posted above.
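A sketch of that plan approach, assuming the same `db` datasource; inside the reduction, columns are read by their plain label and no qualified hash maps are ever built:

```clojure
(require '[next.jdbc :as jdbc])

;; plan returns a reducible; each "row" supports keyword access
;; by plain column label without realizing a full hash map
(into []
      (map :engine)          ; plain label, not :shops/engine
      (jdbc/plan db ["SELECT * FROM shops"]))
```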
not sure what the conclusion is: do you use :foo/bar when you get data from PostgreSQL or another database, and before you add it to the response change it to :foo_bar or something like that?
:foo/bar for column bar from table foo -- letting next.jdbc handle that automatically.
yes, but if you have to return this data in GraphQL or another API, what will you do?
(`clojure.java.jdbc` handles this differently: it doesn't qualify by default, and the :qualifier option is a fixed string rather than the auto-added table name as in next.jdbc)
Returning data -- exactly as I described above.
> For returning output, I generally try to coerce the SQL results to plain hash maps anyway because it's rare (in my experience) that you'll be passing the entire result set back to whatever client is consuming that data (outside your code) so that explicit coercion can do the unqualify operation.
For example, I might have a setting column in the DB that is an int and I might want to return it as one of a set of string values.
For GraphQL a pretty simple approach is to just use queries that return JSON (works fine in postgres and SQL Server)
@kwladyka GraphQL doesn't have namespaces, so you would just label stuff by their fieldName in GraphQL
no, i'm not sure why you're considering namespaced keywords at all if you need to return data to GraphQL
I draw a pretty hard line between data in my domain (where I would use namespace-qualified keys) and data I return to clients outside (which I would use plain string keys in general).
I don't view it as "complexity" since I think there should always be a translation layer from your domain data to your external data.
I would not expect your regular result set data to be structured the same as your GraphQL result -- in general.
(result sets are flat vectors of hash maps -- GraphQL results are typically more structured)
Also if all your data is one relational DB, it is better to just transform the query AST to a SQL query directly, instead of relying on resolvers
> translation layer from your domain data to your external data.
sure, but I feel in many cases I can return data as it is
> transform the query AST to a SQL query directly, instead of relying on resolvers
Can you show me code? I don't know what it means.
yeah, so in short I confused myself about whether I should use namespaced keywords or not. It was looking like a cool idea, but mmm, I don't know.
Basically for a GraphQL query like this:
users(first: 10) {
  id
  posts { title }
}
Just run a SQL query like this (SQL Server syntax, it is slightly less elegant for postgres, but basic idea the same) :
select top 10 id,
  (select title
   from posts p
   where p.user_id = a.id
   for json path) as posts
from users a
for json path
@kwladyka Here is an example of what the AST looks like: https://github.com/walmartlabs/lacinia/blob/master/test/com/walmartlabs/lacinia/parser/query_test.clj#L52-L57
> yeah so in short words I confused myself if I should use namespaced keywords or not. It was looking like a cool idea, but mmm I don’t know.
Like I said: within the domain, I try to use namespaced keywords because it plays nicely with Spec and makes it easier to see what entity :name or :id belongs to (because it would be :member/name, :product/name, :invoice/id, etc). When I'm returning data from an API, that (flat) data will be transformed into whatever is more appropriate for the client -- most likely a nested structure -- and that transformation will include converting (qualified) keywords to strings.
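A small illustration of the Spec point, with made-up specs for the entity names mentioned; the qualifier both documents which entity an attribute belongs to and serves as the global registry key:

```clojure
(require '[clojure.spec.alpha :as s])

;; qualified keywords register globally, one spec per entity attribute
(s/def :member/name string?)
(s/def :product/name string?)
(s/def :invoice/id pos-int?)

(s/valid? :member/name "Alice") ;; => true
(s/valid? :invoice/id 0)        ;; => false
```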
If you have clear system boundaries (and you should) then this is a no-brainer.
Oh, maybe I will ask it this way: what advantage did you achieve by using namespaced keywords in practice in your systems? What pros does it give that make it worth adding the complexity of converting namespaced keywords into fields for GraphQL or another API?
I just answered that 🙂
Spec, clarity of domain entities. Those are why Clojure has qualified keywords.
ok, this is the answer I wanted. So you are saying it is still worth doing and you are happy with this choice.
I think it is the right choice in general, yes. Apart from anything else, it reminds you that you have a system boundary that you are passing data across.
(-> (shop-db/get-shop-by-uuid uuid)
    (clojure.walk/stringify-keys)
    (clojure.walk/keywordize-keys))
Do you convert data similarly to the above, or did you find a more sophisticated way?
Why not use a JSON library? I assume you're trying to produce JSON to return?
but I see you probably don't use GraphQL, so this is the point where you don't feel my concern
We use Lacinia at work -- well, my teammate uses it, I haven't had to deal with that code much yet.
thank you both @seancorfield @isak for the advice
My teammate's code uses namespace-qualified keywords very heavily. I'm checking with him whether we do anything specific to work with Lacinia in that regard...
@kwladyka He says "it looks like I manually map all the namespace keywords to keywords that literally match what is in the schema, so for example :storage/message-count -> :message_count" -- which is exactly the pattern I was recommending above: an explicit transformation layer between the domain and the API return values.
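That pattern could be sketched as an explicit rename map; the keys here are hypothetical, following the :storage/message-count example:

```clojure
;; explicit translation layer: domain keys -> GraphQL schema fields
(def domain->schema
  {:storage/message-count :message_count
   :storage/user-id       :user_id})

(defn ->graphql [m]
  (into {} (map (fn [[k v]] [(get domain->schema k k) v])) m))

(->graphql {:storage/message-count 42})
;; => {:message_count 42}
```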
That's what I (mostly) do as well. I remap the keys that I pick from my SQL queries into JSON keys (we work in JSON a lot!)
I see that jsonista allows you to explicitly map qualified names to unqualified names.
(I was looking for that in Cheshire but didn't see it)
(! 774)-> clj -Sdeps '{:deps {metosin/jsonista {:mvn/version "RELEASE"}}}'
Clojure 1.10.1
user=> (require '[jsonista.core :as j])
nil
user=> (def mapper (j/object-mapper {:encode-key-fn name :decode-key-fn keyword}))
#'user/mapper
user=> (j/write-value-as-string {:hello/world 1} mapper)
"{\"world\":1}"
user=> (j/read-value *1 mapper)
{:world 1}
Yeah, I use Cheshire - simply because of inertia - when I was (still am!!!) learning Clojure, Cheshire was my first hit 🙂
Ah, yeah, Cheshire can do it too...
(! 775)-> clj -Sdeps '{:deps {cheshire {:mvn/version "RELEASE"}}}'
Clojure 1.10.1
user=> (require '[cheshire.core :as j])
nil
user=> (j/generate-string {:foo/bar 42} {:key-fn name})
"{\"bar\":42}"
user=> (j/parse-string *1 true)
{:bar 42}
sorry, probably I miss something big here:
(defn fixture-db-transaction [tests]
  (jdbc/with-transaction [tx db-psql/db {:rollback-only true}]
    (with-redefs [db-psql/db tx]
      (tests))))
it doesn’t work even if I do (.rollback tx)
(use-fixtures :each utils/fixture-db-transaction)
why does the data stay in the DB? Doesn't it work like it replaces db-psql/db with tx and rolls back at the end of the tests? why not?
(next.jdbc/transact db-psql/db (fn* [tx] (with-redefs [db-psql/db tx] (tests))) {:rollback-only true})
Your use of a def is what's problematic there since it is not dynamic -- so it will be compiled into the code that uses it.
If you made it a ^:dynamic var and used binding instead of with-redefs, it would work I think.
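A sketch of that suggestion, keeping the names from the original fixture and assuming db-psql/db can be made dynamic:

```clojure
;; in the db-psql namespace, make the var dynamic:
;; (def ^:dynamic db (make-datasource ...))   ; make-datasource is illustrative

(require '[next.jdbc :as jdbc])

(defn fixture-db-transaction [tests]
  (jdbc/with-transaction [tx db-psql/db {:rollback-only true}]
    ;; binding works because db-psql/db is now ^:dynamic; the code
    ;; under test must also read the var at call time, not capture
    ;; its value at compile time
    (binding [db-psql/db tx]
      (tests))))
```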
expectations.clojure.test gives examples: https://cljdoc.org/d/expectations/clojure-test/1.2.1/doc/fixtures-focused-execution
(the same applies to clojure.test)