This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2021-09-17
Channels
- # announcements (17)
- # aws (2)
- # babashka (21)
- # beginners (67)
- # calva (19)
- # cider (29)
- # clara (3)
- # clj-kondo (6)
- # cljsrn (10)
- # clojure (140)
- # clojure-europe (164)
- # clojure-nl (3)
- # clojure-uk (8)
- # clojurescript (62)
- # conjure (7)
- # core-async (24)
- # cursive (21)
- # datomic (5)
- # docker (40)
- # emacs (14)
- # fulcro (25)
- # gratitude (1)
- # honeysql (6)
- # introduce-yourself (1)
- # jobs (1)
- # jobs-discuss (32)
- # juxt (7)
- # lsp (13)
- # minecraft (2)
- # off-topic (49)
- # pathom (24)
- # practicalli (8)
- # re-frame (18)
- # react (23)
- # remote-jobs (6)
- # reveal (2)
- # shadow-cljs (75)
- # tools-deps (7)
Wrote down some thoughts on our internal subscription layer this morning and thought maybe it’s interesting to share here as well. It’s more questions than answers really but I’m curious if it prompts any reactions/thoughts from other folks working on this type of stuff.
Probably a third approach worth mentioning is using a database type thing on the client that allows for efficient queries combining data (e.g. entitydb or autonormal) but I think that still leaves a few things open, in particular around the time stuff.
the watch vs. delta vs. database thing I think you're right you can kind of boil all of them down to events
one thing I've been thinking about for autonormal is adding delta capabilities to it. since it knows all of the entities you're adding to the map in order to normalize them, I think you can reliably obtain a "which entities changed" delta. so instead of listening to the whole db, you could write a wrapper that listens just to the entities you care about
the bones of this are already in there with add-report
and pull-report
it just needs to be exercised to see if it's reliable
ofc if your data source already has these delta events then that might be more efficient, assuming the events are at the level of granularity you want
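A rough sketch of what that wrapper could look like (hedging: I'm assuming `add-report` returns something like the new db plus the set of idents the add touched, which may not be its exact shape):

```clojure
(require '[autonormal.core :as a])

;; ident -> set of callback fns
(defonce listeners (atom {}))

(defn listen!
  "Register `f` to be called whenever the entity at `ident` changes."
  [ident f]
  (swap! listeners update ident (fnil conj #{}) f))

(defn add-and-notify!
  "Swap new data into `db-atom`, then notify only the listeners
  whose entities were actually touched by this add, instead of
  everyone watching the whole db."
  [db-atom data]
  ;; assumed return shape: {:db <new-db> :entities #{<idents>}}
  (let [{:keys [db entities]} (a/add-report @db-atom data)]
    (reset! db-atom db)
    (doseq [ident entities
            f     (get @listeners ident)]
      (f (get-in db ident)))))
```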
w.r.t. time I think your idea of setting a timer is best. you can have an object that tracks end_dt and clears and recreates the timer when end_dt changes
I could imagine something like:
;; assumes the React hooks are in scope, e.g.
;; (:require ["react" :refer [useEffect useRef]])
(defn use-timer
  [end-dt on-end]
  (let [cb (useRef on-end)]
    ;; ensure ref is updated after each render so we always
    ;; have the latest on-end fn
    (useEffect #(set! (.-current cb) on-end))
    (useEffect
      (fn []
        (let [interval (js/setInterval
                         #(when (>= (js/Date.now) end-dt)
                            ((.-current cb)))
                         ;; check every 100ms
                         100)]
          ;; return a cleanup fn so the old interval is cleared
          ;; whenever end-dt changes or the component unmounts
          #(js/clearInterval interval)))
      #js [end-dt])))
might work. copped some of this from https://overreacted.io/making-setinterval-declarative-with-react-hooks/
Yeah that’s an approach we use in a few places but I’d almost like to handle it outside of react, closer to the layer that transforms data for usage in views
(which is totally possible ofc)
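for completeness, usage in a component would be something like this (component and handler are hypothetical, just to show the shape):

```clojure
(defn sale-banner [{:keys [end-dt]}]
  ;; fire a refetch/refresh once the deadline passes;
  ;; the interval is torn down and recreated if end-dt changes
  (use-timer end-dt #(js/console.log "sale ended, refetch!"))
  ;; ... render the banner ...
  )
```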
@lilactown with autonormal, I was wondering: is there a functionality to identify records with “slots” like in entitydb or idx?
(-> (a/db [])
(a/add {:users {:current {:user/id 1 :user/name "martinklepsch" :user/likes #{"programming"}}}}))
;; => {:users {:current [:user/id 1]}
;;     :user/id {1 {:user/id 1 :user/name "martinklepsch" :user/likes #{"programming"}}}}
pull queries on :users :current will look up the [:user/id 1]
ident in the db to descend, e.g.
(a/pull *1 [{:users [{:current [:user/name]}]}])
;; => {:users {:current {:user/name "martinklepsch"}}}
> since it knows all of the entities you're adding to the map in order to normalize them, I think you can reliably obtain a "which entities changed" delta. so instead of listening to the whole db, you could write a wrapper that listens just to the entities you care about

I think this would be useful for @roman01la’s thoughts around "kafka for the frontend" too. I've personally run into re-frame performance issues where I really just needed a delta & to update indexes myself rather than rebuilding them from scratch.
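to make that concrete, the kind of thing I mean by updating an index from a delta (all names hypothetical, just plain Clojure):

```clojure
(defn update-name-index
  "Patch `index` (a {:user/name -> ident} map) using only the
  changed idents, instead of rebuilding it by scanning every
  entity in `db` from scratch."
  [index db changed-idents]
  (reduce
   (fn [idx ident]
     (if-let [entity (get-in db ident)]
       ;; entity still exists: (re)point its name at its ident
       (assoc idx (:user/name entity) ident)
       ;; entity was removed: drop stale entries pointing at it
       (into {} (remove (fn [[_ v]] (= v ident)) idx))))
   index
   changed-idents))
```

with a delta of, say, 2 changed entities this does 2 map updates instead of an O(n) rebuild over the whole db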