#om
2016-12-18
drcode 23:12:37

@dzannotti : The tricky part is that a single om.next query may require calls to multiple RESTful endpoints. The way I address this is to put an om-next-style "server query parser", paradoxically, in the client (alongside the separate client query parser). When I return a value from a server parser read/mutate function, I return it as {:value (go ...)}, which wraps any RESTful async call in a core.async go block. Then, when a parent read/mutate function resolves its children, it first "collapses" the child go blocks into the parent with a function called async-query-result:

;; Note: still alpha code
;; Assumes the usual core.async requires, e.g.
;;   (:require [cljs.core.async :as as :refer [<! >!]])
;;   (:require-macros [cljs.core.async.macros :refer [go]])
(defn async-query-result
  "Collapses a parser result whose values are channels (or maps
   wrapping a channel under :result) into one channel that yields
   a single fully-resolved result map."
  [result]
  (let [val-chan (as/chan)]
    ;; Drain each [key value] pair, parking until each child's
    ;; async (e.g. RESTful) result arrives.
    (go (doseq [[k v] result]
          (>! val-chan [k (if (map? v)
                            (<! (:result v))
                            (<! v))]))
        (as/close! val-chan))
    ;; Reassemble the resolved pairs into a plain map; the go block
    ;; returned here is itself a channel a parent can compose.
    (go (<! (as/reduce (fn [acc [k v]]
                         (assoc acc k v))
                       {}
                       val-chan)))))
Now, in a parent read/mutate, you can do (async-query-result (parser ...CHILDQUERY...)) and the go blocks in the children get composed into the parent go block. This is all easy to do, since the parsing system in om.next is just functions, without any hidden magic. In short, this lets me run recursive parsing operations "lazily": if any point of the query parsing requires an async query against a RESTful endpoint, it just waits for the call to complete. I have "client" read/mutate functions that update optimistically in the normal way, and a set of "server" read/mutate functions (also running on the client, since with a RESTful endpoint there is no server to run our code) that use this lazy strategy and then update the UI with the additional information as soon as all the necessary RESTful requests have completed.
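For concreteness, a minimal sketch of what such "server" read functions might look like under this convention. All of the names here (server-read, fetch-json, :user/profile, :app/dashboard) are hypothetical illustrations, not from the original message; only the {:value (go ...)} shape and the async-query-result composition come from the discussion above:

```clojure
;; Hypothetical multimethod dispatching on the query key,
;; in the style of an om.next parser read function.
(defmulti server-read (fn [_env k _params] k))

;; A leaf read: wrap the (assumed) async REST call fetch-json
;; in a go block and return the channel under :value.
(defmethod server-read :user/profile
  [_env _k _params]
  {:value (go (<! (fetch-json "/api/profile")))})

;; A parent read: recursively parse the child query, then collapse
;; the children's go blocks into this parent's single go block.
;; async-query-result itself returns a channel, so the same
;; {:value <channel>} convention holds at every level.
(defmethod server-read :app/dashboard
  [{:keys [parser query] :as env} _k _params]
  {:value (async-query-result (parser env query))})
```

The recursion bottoms out at leaf reads, so the top-level caller ends up with one channel that yields the whole result map once every RESTful request has completed.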