#datomic
2020-09-17
Stefan09:09:30

Hi all, I’m hoping that this is not a FAQ that I missed somehow, but this is what I’d like to accomplish: for my unit tests, I’d like to be able to pass a simple Clojure hashmap into Datomic query functions, instead of a real Datomic connection, so that I can test my queries without actually round-tripping to a database. Is there something out there to do this? Or am I on a wrong track here?

pithyless09:09:24

You can actually pass in a vector of datom tuples as your DB, and the query engine will unify against them. But that's probably not what you're looking for. Why not just create an in-memory Datomic connection? Something like:

(str "datomic:mem://" (gensym))
You may also be interested in a tool like https://github.com/vvvvalvalval/datomock for setting up and reusing more complex test data scenarios
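
For illustration, a minimal sketch of both suggestions using the on-prem peer library (the :person/name attribute and the data are hypothetical examples):

(require '[datomic.api :as d])

;; 1. Query a plain collection of datom-shaped tuples directly, no db needed:
(d/q '[:find ?name
       :where [_ :person/name ?name]]
     [[1 :person/name "Ada"]
      [2 :person/name "Grace"]])
;; => #{["Ada"] ["Grace"]}

;; 2. Or stand up a throwaway in-memory database:
(def uri (str "datomic:mem://" (gensym)))
(d/create-database uri)
(def conn (d/connect uri))

@(d/transact conn [{:db/ident       :person/name
                    :db/valueType   :db.type/string
                    :db/cardinality :db.cardinality/one}])
@(d/transact conn [{:person/name "Ada"}])

(d/q '[:find ?name :where [_ :person/name ?name]] (d/db conn))
;; => #{["Ada"]}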

pithyless09:09:26

I suppose none of this is relevant if you're using Datomic Cloud. That seems to have been a primary driver for releasing https://docs.datomic.com/cloud/dev-local.html

thumbnail09:09:26

note that passing datom tuples only works for the peer library, not for the client library, IIRC.

Stefan10:09:21

@U05476190 We’re using on-prem, so… I’ve indeed found datomock, which is a nice concept, but then you still need to specify both a schema and the test data; I was hoping for something even simpler 😉

val_waeselynck17:09:00

@UGNFXV1FA I find this use case strange. Wouldn't you have more confidence in your tests if they ran in an environment more similar to production?

val_waeselynck17:09:15

I personally find it hugely advantageous to have a full-featured implementation of Datomic in-memory; I'd recommend embracing it

Stefan07:09:50

@U06GS6P1N Yeah we’re already experimenting with that, and maybe it’s good enough. But if those tests take 1 second each because of setup/teardown of Datomic databases, that’s too long for me. For unit tests, I prefer to keep things as lean as possible.

val_waeselynck13:09:39

@UGNFXV1FA forking solves that problem

val_waeselynck13:09:11

Put a populated db value in some Var, then create a Datomock connection from it in each test; there's virtually no overhead to this
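
For illustration, a minimal sketch of that pattern, assuming datomock is on the classpath (the :person/name schema, fixture data, and test body are hypothetical):

(require '[clojure.test :refer [deftest is]]
         '[datomic.api :as d]
         '[datomock.core :as dm])

;; Built once, at load time: a db value holding schema + shared fixtures.
(def base-db
  (let [uri  (str "datomic:mem://" (gensym))
        _    (d/create-database uri)
        conn (d/connect uri)]
    @(d/transact conn [{:db/ident       :person/name
                        :db/valueType   :db.type/string
                        :db/cardinality :db.cardinality/one}])
    @(d/transact conn [{:person/name "Ada"}])
    (d/db conn)))

;; Each test forks its own cheap, isolated connection from that value;
;; writes stay local to the mocked connection and never touch base-db.
(deftest fork-per-test
  (let [conn (dm/mock-conn base-db)]
    @(d/transact conn [{:person/name "Grace"}])
    (is (= #{["Ada"] ["Grace"]}
           (d/q '[:find ?n :where [_ :person/name ?n]]
                (d/db conn))))))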

Stefan13:09:33

Sounds good, will definitely try, thanks! 🙂

souenzzo15:09:55

I'm running datomic on-prem@dynamodb Can I generate a "AWS Event"¹ on every transaction? ¹ AWS Event is something that i can plugin into lambda/SNS/SQS

bhurlow14:09:38

We use the transaction report queue to push data into a Kinesis stream, then run Lambdas on those events
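
For illustration, a minimal sketch of that setup with the peer library (publish-to-kinesis! is a hypothetical stand-in for the actual Kinesis put):

(require '[datomic.api :as d])

(defn start-tx-pump!
  "Drains the tx-report-queue on a background thread, forwarding
   each committed transaction to publish-to-kinesis!."
  [conn publish-to-kinesis!]
  (let [queue (d/tx-report-queue conn)] ; a java.util.concurrent.BlockingQueue
    (future
      (while true
        ;; .take blocks until the next transaction report arrives.
        (let [{:keys [db-after tx-data]} (.take queue)]
          (publish-to-kinesis!
            {:basis-t (d/basis-t db-after)
             :datoms  (mapv (juxt :e :a :v :tx :added) tx-data)}))))))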

bhurlow14:09:21

Triggering side effects on DynamoDB writes is likely not what you want, since Datomic writes full blocks to storage (not one datom at a time)

souenzzo15:09:30

@U0FHWANJK when running on multiple/scaled instances, how do you manage the tx-report-queue?

bhurlow15:09:07

We run a single, global process which just subscribes to the queue and pushes events to kinesis

bhurlow15:09:20

other Datomic traffic is scaled horizontally but doesn't consume the queue

bhurlow15:09:52

Kinesis -> Lambda integration works reasonably well

bhurlow15:09:17

one bonus is you can fan out one queue to many Lambda consumers

souenzzo15:09:36

@U0FHWANJK can you share which instance size you use for this report-queue?

bhurlow15:09:40

subscribing to the tx report queue and pushing events to Lambda is not a very intensive process

bhurlow15:09:48

t3.large would be fine imo

:parrot: 3
joshkh15:09:31

hmm, i don't suppose you know if something like the "transaction report queue" is available on Datomic Cloud, do you? i have often been in need of exactly what souenzzo mentioned, but instead settled for querying / sipping the transaction log on a timer

bhurlow15:09:13

I'm not sure about cloud, have only used the above in on-prem

bhurlow15:09:22

I'd assume it's inside the system but possibly not exposed

val_waeselynck15:09:31

Clients don't have a txReportQueue indeed. Polling the Log is usually fine IMO (and having a machine dedicated solely to pushing events seems wasteful, and it's also fragile as it creates a SPoF).

souenzzo15:09:55

I work with Datomic Cloud and Datomic on-prem (on different products). IMHO, on-prem is still way easier and more flexible than Cloud. Cloud has too many limitations: you can't edit IAM, for example, and if you do, you break any future updates.

👍 3
joshkh15:09:07

thanks guys

val_waeselynck15:09:25

One interesting construction might be using AWS Step Functions + Lambda to poll the Datomic Log into Kinesis, using the Step Functions state to keep track of where you are in consuming the Log.

donyorm21:09:55

Trying to figure out how to handle sessions/authentication with ions. Is there a best practice for that in ions? https://forum.datomic.com/t/best-way-to-handle-session-in-ions/1630

kenny23:09:44

Just confirming, it's okay to pass a db created with datomic.client.api/db to datomic.client.api.async/q, correct?