2018-01-22
The specs for resolvers and streamers specify fn?, which fails for vars; that's kind of a drag.
I suspect you are concerned about code reloading. That's a bit of a challenge to do right.
My workflow is to start and stop systems often. My field resolvers are often components, where code reloading doesn't work without a system rebuild anyway.
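One workaround sketch for the fn? spec issue above (resolver names here are hypothetical, not from the conversation): wrap the var in an anonymous fn, which satisfies fn? and still picks up re-definitions because the var is dereferenced on each call.
;; Hypothetical resolver fn; the attachment below is the interesting part.
(defn fetch-user
  [context args value]
  {:id (:id args)})

;; Passing #'fetch-user directly fails the fn? spec, but a wrapping
;; fn satisfies it and still sees REPL re-definitions:
(def resolvers
  {:query/user (fn [context args value]
                 (fetch-user context args value))})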
I've been passing all the component bits in the context, so my resolvers are mostly functions
That works, but it often means you have an Ubercomponent that knows about too much of the rest of the system.
The context can be used to pass such dependencies in trivial systems, but I see the component approach as the one for larger systems.
I wouldn't be surprised if this ended up there, but at the moment I'm more or less porting another API over to this, and that other API already has all the bits the GraphQL API needs at hand to pass in.
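A hedged sketch of the component approach described above: the resolver is built from its dependencies when the system starts, so nothing extra has to ride along in the context (all names and the data access here are illustrative).
;; Construct the resolver from its dependencies at system-start time.
(defn user-resolver
  [db]                                      ; e.g. injected by the system/component
  (fn [context args value]
    (get @db (:id args))))                  ; illustrative data access

;; At system assembly time:
(def resolvers
  {:query/user (user-resolver (atom {"1" {:id "1" :name "Ada"}}))})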
Hey, a bit confused by this page in the Lacinia docs: http://lacinia.readthedocs.io/en/latest/custom-scalars.html#attaching-scalar-transformers
> As with field resolvers, the pair of transformers for each scalar have no place in an EDN file as they are functions. Instead, the transformers can be attached after reading the schema from an EDN file, using the function com.walmartlabs.lacinia.util/attach-scalar-transformers.
I tried defining a scalar with {:parse :my.special/keyword}
but I got a spec error:
In: [0 :scalars :JavaDate 1 :parse] val: :stillsuit.scalars/parse-edn fails spec: :com.walmartlabs.lacinia.schema/parse at: [:args :schema :scalars 1 :parse] predicate: spec?
Seems like Lacinia could support keywords for scalars, and expect them to be specs, and wrap them as appropriate.
I am calling attach-scalars, yeah. Hmm, possibly not in the right order though
It would be handy to have functionality similar to the resolver factories there, BTW, so I could use {:parse [:my/resolver :bigdec]}
or whatnot
Looks something like this:
(defn ^:private load-schema
  [component]
  (-> "schema.edn"
      io/resource
      slurp
      edn/read-string
      (util/attach-scalar-transformers
       {:timestamp-parse utils/timestamp-parse
        :timestamp-serialize utils/timestamp-serialize
        :uuid-parse utils/uuid-parse
        :uuid-serialize utils/uuid-serialize})
      (util/attach-resolvers (:resolvers component))
      schema/compile))
(def timestamp-parse
  (as-conformer (fn [^String v]
                  (ZonedDateTime/parse v))))

(def timestamp-serialize
  (as-conformer (fn [^ZonedDateTime v]
                  (.format v DateTimeFormatter/ISO_DATE_TIME))))

(def uuid-parse
  (as-conformer (fn [^String v]
                  (UUID/fromString v))))

(def uuid-serialize
  (as-conformer (fn [^UUID v]
                  (str v))))
:scalars
{:Timestamp
 {:parse :timestamp-parse
  :serialize :timestamp-serialize}
 :UUID
 {:parse :uuid-parse
  :serialize :uuid-serialize}}
Thanks, that's helpful. I'll mess with it some more