#pathom
2022-04-11
Joe R. Smith 00:04:07

Hello! I could use a little direction figuring out how to get a "boundary-interface"-defined pathom endpoint to support introspection and tracing with Pathom-Viz. The only docs I can find for pathom+pathom-viz suggest using the pathom-viz-connector function connect-env, which starts a webserver. I want to use it against a route defined in my Pedestal service.

wilkerlucio 01:04:02

hello Joe, if you expose Pathom via some HTTP handler using the boundary interface at the border, that's all you need to connect to it via the HTTP connection feature in Pathom Viz
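
For context, a minimal sketch of such a border handler (Ring-style; a Pedestal interceptor version is analogous). The resolvers var and the handler wiring here are illustrative, not from this thread:

(require '[cognitect.transit :as transit]
         '[com.wsscode.pathom3.connect.indexes :as pci]
         '[com.wsscode.pathom3.interface.eql :as p.eql])

;; build the boundary interface once from the registered env
;; (`resolvers` stands for your own resolver collection)
(def pathom (p.eql/boundary-interface (pci/register resolvers)))

;; Ring-style handler: decode the transit EQL request, run it,
;; and encode the result back as transit
(defn pathom-handler [{:keys [body]}]
  (let [request (transit/read (transit/reader body :json))
        result  (pathom request)
        out     (java.io.ByteArrayOutputStream.)]
    (transit/write (transit/writer out :json) result)
    {:status  200
     :headers {"Content-Type" "application/transit+json"}
     :body    (.toString out)}))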

wilkerlucio 01:04:09

there is a + button on the top right of the app

wilkerlucio 01:04:34

the connector is more for development time; to connect to something in staging/prod you can use the HTTP path

Joe R. Smith 03:04:19

Thanks Wilker-- should tracing and introspection work?

Joe R. Smith 14:04:44

It looks like my issue is that my transit body interceptor isn't able to serialize resolvers correctly during the index fetch. I'm seeing errors like this:

{:type    java.lang.RuntimeException
 :message "java.lang.Exception: Not supported: class io.crescentinvest.api.pathom$user_count__56506"
 :at      [com.cognitect.transit.impl.WriterFactory$1 write "WriterFactory.java" 65]}
do I need to register a transit write handler / use a special transit writer?

wilkerlucio 14:04:51

yes, there is a transit namespace on pathom with read and write handlers for the custom map types
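
For reference, threading custom handlers into a transit writer looks roughly like this; pathom-write-handlers below stands in for whatever handler map that namespace exports (check it for the real var name):

(require '[cognitect.transit :as transit])

;; pathom-write-handlers: placeholder for the write-handler map
;; exported by Pathom's transit namespace
(defn make-writer [out pathom-write-handlers]
  (transit/writer out :json {:handlers pathom-write-handlers}))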

Joe R. Smith 14:04:35

thanks-- figured out the write handlers, no more errors, but "refresh indexes" doesn't show any introspection results.

Joe R. Smith 14:04:41

I'll check out that tutorial, thanks

Joe R. Smith 14:04:20

ahh, probably not encoding metadata 🙂
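
For anyone hitting the same wall: transit-clj can carry metadata through serialization via its :transform option; a minimal sketch, assuming a recent transit-clj (which ships transit/write-meta for exactly this):

(require '[cognitect.transit :as transit])

;; write-meta is a transform bundled with transit-clj that serializes
;; each value's metadata alongside the value itself
(defn meta-aware-writer [out handlers]
  (transit/writer out :json {:handlers  handlers
                             :transform transit/write-meta}))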

Ben Grabow 21:04:45

Has anyone explored the idea of describing/validating the shape of inputs/outputs of resolvers using clojure spec/malli/etc? It looks like I could define my own plugin that applies to resolver calls (https://pathom3.wsscode.com/docs/plugins#pcrwrap-resolve), looks at the input/output keys, looks up specs in a registry, and validates the inputs/outputs for each resolver. Is there any prior art in this space?

Ben Grabow 21:04:34

If I don't validate the input/output values at runtime, then I could publish whatever schema I want and my users would be left guessing what happens at runtime. Giving my users some peace of mind that the values do match the schema I publish would be a big win. The second level would be to advertise each key's schema in Pathom Viz.

wilkerlucio 21:04:18

yes, I played with that, I made a small library I called eql-schemas, which basically does the validation of a DS based on the shape + specs: https://gist.github.com/wilkerlucio/71d2ac918e6dc116576b64b1f506c58d

Ben Grabow 21:04:35

Do you have any thoughts on the best approach? I can think of two ways to guarantee coverage of the entire process:
• Spec the query, and spec the output of each resolver
• Spec the input of each resolver, and the output of each resolver
The second way will involve a lot of duplicated work as data flows out of one resolver (validated) and into another resolver (validated again). It does not require parsing the EQL query though, and works with smart maps.

Ben Grabow 21:04:13

Another question: Did you ever try creating a Pathom plug-in for this?

Ben Grabow 21:04:09

Which extension point would you use?

wilkerlucio 21:04:13

I never have; I would avoid it in production because of the overhead, but it could be useful to assist during development. Making a plugin for it is quite easy: using wrap-resolve you can check both the input/output of resolvers and apply it there
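
A bare skeleton of that extension point, for reference; check-input! and check-output! are placeholders for whatever spec/malli validation you plug in (Ben's full version appears further down):

(require '[com.wsscode.pathom3.connect.runner :as pcr]
         '[com.wsscode.pathom3.plugin :as p.plugin])

(def validation-plugin
  {::p.plugin/id `validation-plugin ; plugin ids are symbols
   ::pcr/wrap-resolve
   (fn [resolve]
     (fn [env input]
       (check-input! env input)          ; placeholder validation
       (let [output (resolve env input)]
         (check-output! env output)      ; placeholder validation
         output)))})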

🙏 1
wilkerlucio 21:04:21

note the library I sent you uses a different optional flag (the reason in my case is that I used it outside Pathom); you can either change the eql-schemas code to use the same optional attribute as Pathom, or convert it in the plugin (to allow for optional things)

wilkerlucio 22:04:43

(or in short, replace every ::optional? there with ::pco/optional?)

Björn Ebbinghaus 10:04:19

@UANMXF34G I have a plugin for this. I add specs to mutations.

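;; Likely requires for this snippet (aliases assumed from a typical
;; Pathom 2 setup; `log` is whatever logging ns the project uses):
;;   [com.wsscode.pathom.core :as p]
;;   [com.wsscode.pathom.connect :as pc]
;;   [clojure.spec.alpha :as s]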
(def spec-plugin
  {::p/wrap-mutate
   (fn [mutate]
     (fn [env sym params]
       (if-let [spec (get-in env [::pc/indexes ::pc/index-mutations sym ::s/params])]
         (if (s/valid? spec params)
           (mutate env sym params)
           (do
             (log/debug (s/explain spec params))
             ;; TODO Errors are data too!
             (throw (ex-info "Failed validation!" (s/explain-data spec params)))))
         (mutate env sym params))))})
https://github.com/hhucn/decide3/blob/master/src/main/decide/server_components/pathom.clj#L49
And then:
(defmutation add-participant [env {user-id ::user/id, slug ::process/slug}]
  {::pc/params [::process/slug ::user/id]
   ::pc/output [::process/slug]
   ::s/params (s/keys :req [::process/slug ::user/id])})
It is relatively old and more of an experiment that stuck. You can probably improve it.

🙏 1
Ben Grabow 13:04:55

Here's what I came up with:

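;; Requires assumed by this snippet:
;;   [malli.core :as mc] [malli.error :as me]
;;   [taoensso.timbre :as timbre]
;;   [com.wsscode.pathom3.plugin :as p.plugin]
;;   [com.wsscode.pathom3.connect.runner :as pcr]
;;   [com.wsscode.pathom3.connect.indexes :as pci]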
(defn key-errors
  [schema-registry [k v]]
  (let [schema (get schema-registry k)]
    (if schema
      (and
        (not (mc/validate schema v))
        (me/humanize (mc/explain schema v)))
      (timbre/warnf "Missing schema for key %s" k))))

(defn collect-key-errors
  [schema-registry m]
  (->> (for [[k v] m
             :let [errors (key-errors schema-registry [k v])]]
         (when errors [k errors]))
       (remove nil?)
       (into {})
       not-empty))

(def validate-resolver-inputs
  {::p.plugin/id ::validate-resolver-inputs
   ::pcr/wrap-resolve
   (fn wrap-validate-resolver-inputs
     [inner-fn]
     (fn wrapped-validate-resolver-inputs [env input]
       (let [registry (:metosin.malli/registry env)]
         (if-let [key-errors (collect-key-errors registry input)]
           (throw (ex-info
                    (format "Error validating resolver input: %s" {:input-keys (keys input)
                                                                   :key-errors key-errors})
                    {:input-keys (keys input)
                     :key-errors key-errors}))
           (let [output (inner-fn env input)]
             (if-let [key-errors (collect-key-errors registry output)]
               (throw (ex-info
                        (format "Error validating resolver output: %s" {:output-keys (keys output)
                                                                        :key-errors  key-errors})
                        {:output-keys (keys output)
                         :key-errors  key-errors}))
               output))))))})

(def pathom-index
  (-> {:metosin.malli/registry {:some.ns/attr [:map [:foo :string] [:bar :nat-int]]}}
      (p.plugin/register [validate-resolver-inputs])
      (pci/register resolvers)))
My assumption is the schema registry will be keyed on pathom attribute keys. The defresolver form doesn't need anything extra in it for this to work, but this design also assumes that all resolvers will want their inputs and outputs validated. I like the idea of attaching metadata to the resolver/mutation directly to say which parts should be validated. That would be useful for more performance-sensitive usages, or when some very specific attrs are causing problems or have mission-critical shape.

Ben Grabow 13:04:57

> ;; TODO Errors are data too!
I love this. I'm going to start using this comment in my code too. 😆

❤️ 1
wilkerlucio 14:04:24

nice 🙂 just one thing: ::p.plugin/id is expected to be a symbol. I should add some validation to check it, but if you run with Guardrails on it may complain (because the spec for the plugin id is a symbol)
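
For readers skimming: the fix in the snippet above is just the id value; a minimal sketch:

(require '[com.wsscode.pathom3.plugin :as p.plugin])

;; plugin ids are symbols; a backquote yields a namespaced symbol
(def example-plugin
  {::p.plugin/id `example-plugin})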

Ben Grabow 20:04:13

Interesting! I noticed it was a symbol in the docs but I'm curious why symbols are a better fit here than keywords.