
Hi all, I’m hoping that this is not a FAQ that I missed somehow, but this is what I’d like to accomplish: for my unit tests, I’d like to be able to pass a simple Clojure hashmap into Datomic query functions, instead of a real Datomic connection, so that I can test my queries without actually round-tripping to a database. Is there something out there to do this? Or am I on a wrong track here?


You can actually pass in a vector of datom tuples as your DB, and the query engine can unify against them. But that's probably not what you're looking for. Why not just create an in-memory Datomic connection? Something like:

(str "datomic:mem://" (gensym))
You may also be interested in a tool like Datomock for setting up and reusing more complex test data scenarios
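
To make the two suggestions above concrete, here's a rough sketch using the peer library (`datomic.api`); the `:user/name` attribute and the data are made up for illustration:

```clojure
(require '[datomic.api :as d])

;; 1. Querying a plain collection of datom-shaped tuples -- no connection
;;    needed. The :where clauses unify directly against the vector.
(d/q '[:find ?name
       :where [?e :user/name ?name]]
     [[1 :user/name "alice"]
      [2 :user/name "bob"]])
;; => #{["alice"] ["bob"]}

;; 2. A throwaway in-memory database per test.
(let [uri (str "datomic:mem://" (gensym))]
  (d/create-database uri)
  (let [conn (d/connect uri)]
    @(d/transact conn [{:db/ident       :user/name
                        :db/valueType   :db.type/string
                        :db/cardinality :db.cardinality/one}])
    @(d/transact conn [{:user/name "alice"}])
    (d/q '[:find ?name :where [?e :user/name ?name]]
         (d/db conn))))
```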


I suppose none of this is relevant if you're using Datomic Cloud. That seems to be a primary driver for releasing it


Note that passing datom tuples only works for the peer library, not for the client library, iirc.


@U05476190 We’re using on-prem, so… I’ve indeed found datomock, which is a nice concept, but then you still need to specify both a schema and the test data; I was hoping for something even simpler 😉


@UGNFXV1FA I find this use case strange. Wouldn't you have more confidence in your tests if they ran in an environment more similar to production?


I personally find it hugely advantageous to have a full-featured implementation of Datomic in-memory, I would recommend embracing it


@U06GS6P1N Yeah we’re already experimenting with that, and maybe it’s good enough. But if those tests take 1 second each because of setup/teardown of Datomic databases, that’s too long for me. For unit tests, I prefer to keep things as lean as possible.


@UGNFXV1FA forking solves that problem


Put a populated db value in some Var, and then create a Datomock connection from it in each test, there's virtually no overhead to this
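
A sketch of that pattern, assuming Datomock's `datomock.core/mock-conn` entry point (the schema and seed data here are illustrative):

```clojure
(require '[datomic.api :as d]
         '[datomock.core :as dm])

;; Pay the setup cost once: a populated db value held in a Var.
(defonce base-db
  (let [uri (str "datomic:mem://" (gensym))]
    (d/create-database uri)
    (let [conn (d/connect uri)]
      @(d/transact conn [{:db/ident       :user/name
                          :db/valueType   :db.type/string
                          :db/cardinality :db.cardinality/one}])
      @(d/transact conn [{:user/name "alice"}])
      (d/db conn))))

;; Each test forks a fresh, isolated connection from that value.
;; Writes to a forked conn never touch base-db, so tests stay independent.
(defn fresh-conn [] (dm/mock-conn base-db))

(let [conn (fresh-conn)]
  @(d/transact conn [{:user/name "bob"}])
  (d/q '[:find ?n :where [?e :user/name ?n]] (d/db conn)))
```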


Sounds good, will definitely try, thanks! 🙂


I'm running Datomic [email protected]. Can I generate an "AWS Event"¹ on every transaction? ¹ An AWS Event is something that I can plug into Lambda/SNS/SQS


We use the transaction report queue to push data into a kinesis stream, then run lambdas on those events


Triggering side effects on DynamoDB writes is likely not what you want, since Datomic writes full blocks to storage (not one datom at a time)


@U0FHWANJK when running on multiple/scaled instances, how do you manage the tx-report-queue?


We run a single, global process which just subscribes to the queue and pushes events to kinesis
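
A minimal sketch of such a consumer using the peer library's `tx-report-queue`; `push-to-kinesis!` is a placeholder for whatever Kinesis client call you actually use:

```clojure
(require '[datomic.api :as d])

(defn push-to-kinesis! [event]
  ;; Placeholder: serialize the event and put it on your Kinesis stream here.
  (println "would push:" event))

(defn run-tx-pump!
  "Blocks forever, forwarding each transaction report to Kinesis."
  [conn]
  (let [queue (d/tx-report-queue conn)]  ; a java.util.concurrent.BlockingQueue
    (try
      (loop []
        (let [{:keys [db-after tx-data]} (.take queue)]
          (push-to-kinesis!
           {:t       (d/basis-t db-after)
            ;; datoms support keyword access to their components
            :tx-data (map (juxt :e :a :v :tx :added) tx-data)})
          (recur)))
      (finally
        (d/remove-tx-report-queue conn)))))
```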


other Datomic traffic is scaled horizontally but doesn't invoke the queue


Kinesis -> Lambda integration works reasonably well


one bonus is you can do one queue to many lambda consumers


@U0FHWANJK can you share which instance size you use for this report-queue?


subscribing to the tx report queue and putting into lambda is not a very intensive process


t3.large would be fine imo


Hmm, I don't suppose you know if something like the "transaction report queue" is available on Datomic Cloud, do you? I have often been in need of exactly what souenzzo mentioned, but instead settled for querying / sipping the transaction log on a timer


I'm not sure about cloud, have only used the above in on-prem


I'd assume it's inside the system but possibly not exposed


Clients don't have a txReportQueue indeed. Polling the Log is usually fine IMO (and having a machine dedicated solely to pushing events seems wasteful, and it's also fragile as it creates a SPoF).


I work with Datomic Cloud and Datomic on-prem (on different products). IMHO, Datomic on-prem is still way easier/more flexible than Cloud. Cloud has too many limitations: you can't edit IAM, for example, and if you do, you break any future updates.


thanks guys


One interesting construction might be using AWS Step Functions + Lambda for polling the Datomic Log into Kinesis, using the Step Functions state to keep track of where you are in consuming the Log
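
A hedged sketch of the polling half using the client API's `tx-range`; the state-tracking around it would live in Step Functions, and `forward!` is a placeholder for your Kinesis put:

```clojure
(require '[datomic.client.api :as d])

(defn poll-log!
  "Reads transactions from the Log starting at `start-t`, forwards each
   one, and returns the last t seen (store (inc last-t) back in Step
   Functions state, since :start is inclusive)."
  [conn start-t forward!]
  (reduce (fn [_ {:keys [t data]}]
            (forward! {:t t :tx-data data})
            t)
          (dec start-t)
          (d/tx-range conn {:start start-t :end nil})))
```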


Looking to try and figure out how to handle sessions/authentication with ions, is there a best practice for that in ions?


Just confirming, it's okay to pass a db created with datomic.client.api/db to datomic.client.api.async/q, correct?