
I'm trying to create a Nippy serializer for a datom. Is there a way to construct a Datum object with the added field? I see the Datum constructor takes values for e, a, v, and tOp but not added. And it looks like the added field is defined as final, so I'm not sure how that's set (unless reflection is used). Technically I could just deserialize into a vector but it's a tad annoying that the serializer and deserializer are not symmetric.


I guess I could create a Datum record that looks and acts like a datomic.db.Datum. Still a tad annoying that it isn't symmetric 🙃


Ohhhh, I misunderstood. This is how the added field is defined 🙂 Makes sense now!

public boolean isAssertion() {
        return Numbers.isPos(this.tOp & 1L);
}
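So `added` isn't a stored field at all; it's derived from the low bit of `tOp`. The same check could be sketched in Clojure like this (the `assertion?` name and the sample values are mine, not Datomic's):

```clojure
(defn assertion? [tOp]
  ;; the low bit of tOp flags assertion (1) vs retraction (0)
  (pos? (bit-and tOp 1)))

(assertion? 3) ;; => true  (odd tOp: assertion)
(assertion? 2) ;; => false (even tOp: retraction)
```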

For those interested...

(nippy/extend-freeze Datum ::datom
  [^Datum datom ^DataOutputStream data-output]
  (nippy/freeze-to-out! data-output
    [(.-e datom) (.-a datom) (.-v datom) (.-tOp datom)]))

(nippy/extend-thaw ::datom
  [^DataInputStream data-input]
  (let [[e a v tOp] (nippy/thaw-from-in! data-input)]
    (Datum. e a v tOp)))
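The handlers above can be exercised with a quick round trip (the e/a/v/tOp values here are made up for illustration):

```clojure
(require '[taoensso.nippy :as nippy])

;; freeze a Datum to bytes and thaw it back;
;; the custom ::datom handlers registered above do the work
(let [original (Datum. 1 10 "hello" 3)
      thawed   (nippy/thaw (nippy/freeze original))]
  [(.-e thawed) (.-a thawed) (.-v thawed) (.-tOp thawed)])
```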


Updated the README of Datomock - I hope this makes the use cases clearer, and shows how powerful datomic.api/with is
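For anyone who hasn't tried it, a minimal sketch of forking with Datomock (`fork-conn` is the Datomock API; the mem database URI is just an example):

```clojure
(require '[datomic.api :as d]
         '[datomock.core :as dm])

;; an in-memory database to fork from
(d/create-database "datomic:mem://example")
(def conn (d/connect "datomic:mem://example"))

;; the fork sees everything in conn at fork time,
;; but writes to the fork never touch conn
(def forked (dm/fork-conn conn))
@(d/transact forked [{:db/doc "only visible in the fork"}])
```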


@U053032QC related to our previous discussion


I'm really excited to try this out!


I've been using this to great effect to make tests run fast. I'm sad to lose it with cloud/peer API but c'est la vie. Thanks Val, this is a great lib


@U0510KXTU curious about how you used it - only to make tests run fast? I find the most rewarding aspects to be more workflow-related (debugging, dev etc.)


this has been a godsend for our development workflow


@U06GS6P1N I'm also using it for tests, something like

(let [root-conn (d/connect "mem")]
  (install-schema! root-conn)
  (reset! test-conn (dm/fork-conn root-conn)))

(defn test-fixtures [f]
  ;; no more installing the schema in each fixture !!!
  (reset! system/conn (dm/fork-conn @test-conn))
  (f))


@U06GS6P1N same as @U2J4FRT2T for fixtures run once but used in N tests


@denik With Datomic Cloud, all data and processing lives in your account, and so you can choose if and how it is exposed to the internet. That said, the defaults (and our development efforts) are pointed at the secure deployment of data-of-record systems.


@kenny out of curiosity, why do you need to serialize concrete Datoms?


We have a series of Onyx tasks that publish segments to a Kafka topic. In order to publish to a Kafka topic you need to have serializable data, and we need to publish the :tx-data off of the transaction report for running business logic. The options were either:
- Write our own versions of transact and transact-async that modify the returned transaction report such that :tx-data is serializable. Then require anyone who transacts data to Datomic to use that interface and know that the interface is slightly different from the Datomic one they were used to.
- Write a function that serializes :tx-data in the transaction report and require that all developers remember to call that function before returning their segment.
- Write a Datom serializer that allows devs to use the same Datomic API they've been using, without needing to remember to call a function or use a special Datomic API only when working with Onyx.


makes sense, thanks for sharing!