This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2024-02-24
Channels
- # babashka (6)
- # beginners (11)
- # calva (4)
- # clojure (12)
- # clojure-madison (2)
- # clojure-norway (25)
- # clojure-spec (8)
- # clojure-sweden (1)
- # clojurescript (17)
- # datalevin (1)
- # datomic (8)
- # events (5)
- # ghostwheel (1)
- # hyperfiddle (16)
- # off-topic (16)
- # pedestal (1)
- # reagent (6)
- # reitit (1)
- # releases (3)
- # scittle (1)
- # shadow-cljs (5)
- # specter (2)
- # squint (4)
Thanks to Datomic's incredible indelible property, I have written many forensic analysis functions roughly like this:
• Query the history database for the transactions affecting a specific entity of interest.
• Iterate over the tx values, constructing an as-of database and then pulling the entity of interest.
Here's an example function:
;; assumes d aliases datomic.client.api (the arg-map arities of q and pull)
(defn token-history
  [{conn :datomic/connection :as system} token-id]
  (let [db  (d/db conn)
        h   (d/history db)
        ;; find every transaction that touched the token entity
        txs (map first (d/q {:query '{:find  [?tx]
                                      :in    [$ $h ?token-id]
                                      :where [[?t-eid :st.oauth.token/id ?token-id]
                                              [$h ?t-eid _ _ ?tx]]}
                             :args  [db h token-id]}))]
    ;; pull the entity as of each transaction, in time order
    (sequence (map (fn [tx]
                     (let [db (d/as-of db tx)]
                       (d/pull db {:selector [:st.oauth.token/status
                                              :st.oauth.token/expires-at
                                              :st.oauth.token/refresh-attempts
                                              {:st.oauth.token/updated-by [:db/txInstant]}]
                                   :eid      [:st.oauth.token/id token-id]}))))
              (sort txs))))
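For example (hypothetical connection and token id), this yields one snapshot of the token per transaction that touched it, oldest first:
;; hypothetical usage
(token-history {:datomic/connection conn} "2f0c...")
;; => ({:st.oauth.token/status :pending, ...}
;;     {:st.oauth.token/status :active, ...})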
The results provide powerful insight into how my application works (and did work). Do others have a better pattern for this kind of two-layer query? I can imagine that if the as-of function were available inside a query (to dynamically construct a new source) then it could be done in one query (using the undocumented pull option to specify the source database). But I'm quite satisfied even without that optimization.
I think pull would basically be a wildcard search and provide you with a very long list of all the things.
I am working with bigdecs and want to adhere to the advice at https://docs.datomic.com/cloud/schema/schema-reference.html:
> Consistent results in query depend on the scale matching for all BigDecimal comparisons. You are strongly encouraged to use a consistent scale per attribute.
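For context, the reason this matters is that java.math.BigDecimal equality is scale-sensitive, so numerically equal values with different scales don't compare equal:
(.equals 1.50M 1.5M)    ;=> false (same value, different scale)
(.compareTo 1.50M 1.5M) ;=> 0     (numerically equal)
(.setScale 1.5M 2)      ;=> 1.50M (fix the scale up front)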
Let's say I have :order/price that I want to have a scale of 10 and :order/qty that should have a scale of 15.
The two approaches that come to mind are:
• Wrap transact in a way that allows me to coerce particular bigdec values to particular scales according to the attribute
• Be disciplined at the places in my code where I'm introducing the values and make sure to set the scale there, and enforce this with :db.attr/preds on the attribute schema (see the sketch after this list)
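A minimal sketch of the second approach, where the myapp.schema namespace and predicate name are made up for the example (:db.attr/preds must name a fully qualified predicate that returns true for valid values):
(ns myapp.schema)

;; hypothetical predicate enforcing a fixed scale of 10
(defn price-scale-ok? [^java.math.BigDecimal d]
  (= 10 (.scale d)))

;; attribute installation tx-data
(def price-attr
  [{:db/ident       :order/price
    :db/valueType   :db.type/bigdec
    :db/cardinality :db.cardinality/one
    :db.attr/preds  'myapp.schema/price-scale-ok?}])

;; callers then set the scale explicitly, e.g.
;; (.setScale 19.99M 10) ;=> 19.9900000000M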
I'm curious what people's general preferences are relating to these types of transformations, and if there are any I'm missing?
Are there any helper libraries or functions floating around to deal with such transformations/coercion?
For the general problem of coercion & transformation, I personally really like malli schemas.
As for the exact 'hook point' where you plug that in, the standard answer is probably :db.attr/preds; beyond that, there are a number of directions you can go, all a bit unorthodox. You mentioned a couple already.
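For illustration, a minimal sketch with malli, where the Price schema and the :decode/scale transformer name are made up for the example:
(require '[malli.core :as m]
         '[malli.transform :as mt])

;; a schema that coerces a BigDecimal to scale 10 on decode
(def Price
  [:fn {:decode/scale
        (fn [^java.math.BigDecimal d]
          (.setScale d 10 java.math.RoundingMode/HALF_UP))}
   decimal?])

(m/decode Price 19.99M (mt/transformer {:name :scale}))
;; => 19.9900000000M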
My personal preference here is a hack: I dry-run all transactions with d/with before sending them. I'll either do this only in DEV builds and hopefully catch all the bugs there, or even do it in production if I'm comfortable with the extra latency on transaction processing. (Dry-running also has the advantage of filtering out any transactions that are certain to fail before they go over the wire to the transactor.)
When Datomic is processing tx-data and expanding tx-fns in-process because of d/with, it'll invoke the clojure.lang.IFn stored at :fnref (in the record/map returned by d/function), and I'll actually inject malli's function instrumentation right there: https://github.com/metosin/malli/blob/master/docs/function-schemas.md That way I can write an fspec for the transaction function. Can't say I'd recommend this approach, but this is where I landed.
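A minimal sketch of just the dry-run part, assuming the peer API (datomic.api aliased as d) and a hypothetical safe-transact wrapper:
(require '[datomic.api :as d])

;; Speculatively apply tx-data against the current db value.
;; d/with throws if the tx-data is invalid, so nothing is sent
;; to the transactor unless the dry run succeeds.
(defn safe-transact [conn tx-data]
  (d/with (d/db conn) tx-data)
  @(d/transact conn tx-data))
Note that the dry run executes against a possibly stale basis, so the transactor can still reject the real transaction for reasons the dry run didn't see.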