#datomic
2018-12-04
rnandan273 05:12:01

Thanks everyone for your insights on my question on datomic vs qldb

Andreas Edvardsson 06:12:16

Am I thinking too boldly about Datomic, or is it possible to have entirely purely functional, non-trivial business logic, similar to how one could do it with a plain Clojure map? That is, passing in the state (database) at the top, then adding, updating, retracting, getting etc. in any combination, and lastly doing something similar to a "swap!" or "reset!"? Preferably I'd like to get a "new" database instance back after every step, so that I can continue to query and alter the database state until I'm happy. Also, if something goes wrong in the middle I obviously don't want anything committed to the actual database. I imagine transaction functions as in Datomic Ions ought to make this possible?

val_waeselynck 07:12:14

That's not exactly how it works, but you can essentially achieve the same power. Datomic Connections and Database values essentially correspond respectively to Clojure Agents and values. However, unlike regular Clojure values, you don't update a Database value by calling conj, assoc etc. - you do it by emitting Datoms. Transaction functions enable you to accept a 'present' Database value, optional additional arguments, and to emit an arbitrary set of Datoms which will be added to form the next Database value.
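
To make that concrete, here is a minimal sketch of what such a transaction function could look like with the Cloud client API. The `debit` name, the `:account/balance` attribute, and the balance-check logic are all made-up illustrations, not an API from this thread; in Ions the function would be deployed with your code and referenced from tx-data by its fully qualified symbol.
```
(require '[datomic.client.api :as d])

(defn debit
  "Takes the current db value plus arguments; returns tx-data, or throws to abort."
  [db account-id amount]
  (let [balance (or (ffirst
                      (d/q '[:find ?b
                             :in $ ?e
                             :where [?e :account/balance ?b]]
                           db account-id))
                    0)]
    (when (< balance amount)
      (throw (ex-info "insufficient funds" {:account account-id})))
    ;; The emitted datoms form the next database value.
    [[:db/add account-id :account/balance (- balance amount)]]))
```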

val_waeselynck 07:12:56

So the model is not that you fabricate intermediary database values and 'commit' to the last one. However, you can use speculative writes (aka db.with()) to 'preview' what Database value will be yielded by adding the Datoms you're considering.
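
For illustration, a minimal sketch of such a speculative write with the on-prem peer API (a Cloud client version appears further down the thread); `conn` and the `:user/*` attributes are assumed to exist:
```
(require '[datomic.api :as d])

(let [db      (d/db conn)
      tx-data [{:user/name "Ada" :user/email "ada@example.com"}]
      {:keys [db-after]} (d/with db tx-data)]
  ;; db-after is the value the database *would* have; nothing is committed.
  (d/q '[:find ?e .
         :where [?e :user/name "Ada"]]
       db-after))
```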

Andreas Edvardsson 12:12:54

Thank you @U06GS6P1N for the reply! It remains then to figure out a reasonable architecture that lets me output the datoms in a sensible way. :) Also, regarding unit testing, is there any way to get an in-memory equivalent to a Datomic Cloud/Ions database value that supports the same query/pull API? The idea is to transact (plain) datoms into a fresh database, and then query it using the function I'm testing. Would either the Datomic Free database or an in-memory peer with a wrapper like https://github.com/ComputeSoftware/datomic-client-memdb/blob/master/README.md do the trick? By the way, I found your awesome blog the other day; I haven't managed to digest all the Datomic-specific parts yet, but I have still learned a lot. The post about event sourcing in particular was enlightening! 👏

Chris Bidler 13:12:35

At my job we are using ‘on-prem’ Datomic run on AWS EC2 because we considerably pre-date the advent of Cloud and Ions, so we haven’t had a need to wrap the in-memory peer, but otherwise we do exactly as you describe for unit tests and CI. We have a manageable amount of test data that just lives in source control as datoms in EDN, and before a test run we stand up an in-mem DB, transact our schema to it, and transact in the test data.
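
A minimal sketch of that setup with the peer's in-memory storage; the file paths are made up, and both files are assumed to be plain EDN vectors of tx-data as described above:
```
(require '[datomic.api :as d]
         '[clojure.edn :as edn])

(defn fresh-test-conn []
  (let [uri "datomic:mem://test"]
    (d/delete-database uri)          ; throw away any previous run
    (d/create-database uri)
    (let [conn (d/connect uri)]
      @(d/transact conn (edn/read-string (slurp "test-resources/schema.edn")))
      @(d/transact conn (edn/read-string (slurp "test-resources/test-data.edn")))
      conn)))

;; In a test fixture: (d/db (fresh-test-conn)) yields a database value with
;; schema and fixtures loaded, ready for d/q and d/pull.
```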

Chris Bidler 13:12:05

A nice knock-on effect is that we also have a server-dev Boot task which stands up that same “test” in-mem DB and also starts up our backend services locally, pointed to it. One fully-operational but local backend stack, ready to attach to a REPL, please! 😄

Andreas Edvardsson 14:12:28

That's awesome! 😄 I hope that I'll be able to construct something similar.

Joe Lane 16:12:36

@UEJ28A9PH I think you can attempt that by using d/with-db and d/with, although the bookkeeping of what data is tx-data would be your responsibility.
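
A minimal sketch of that suggestion with the Cloud client API: each d/with layers speculative tx-data on top of the previous value, and collecting that tx-data for a final d/transact is the bookkeeping mentioned above. The `:order/*` attributes (with `:order/id` assumed unique), the ident values, and `conn` are illustrative assumptions.
```
(require '[datomic.client.api :as d])

(let [step-1 [{:order/id 1 :order/status :order.status/open}]
      step-2 [{:order/id 1 :order/status :order.status/shipped}]
      db0    (d/with-db conn)
      db1    (:db-after (d/with db0 {:tx-data step-1}))
      db2    (:db-after (d/with db1 {:tx-data step-2}))]
  ;; db2 reflects both steps, but nothing is durable until you decide to
  ;; run (d/transact conn {:tx-data (concat step-1 step-2)}).
  (d/q '[:find ?status .
         :where [?e :order/id 1]
                [?e :order/status ?status]]
       db2))
```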

Andreas Edvardsson 20:12:44

Thanks for the suggestion @U0CJ19XAM ! However, the scenario I asked for in the first post does not seem to be very idiomatic, so I believe I'd be better off just doing it "the right way" instead. I guess the d/with and d/with-db could be helpful in some situations though, especially in tests maybe?

Andreas Liljeqvist 16:12:14

In our SPA we do a lot of local UI bookkeeping, resulting in a couple of keys that Datomic won't accept in a transact. What is a good way to dissoc them, or to select only the relevant keys? We have a spec for what a complete entity should be.

Andreas Liljeqvist 17:12:18

It sure would, but I would rather not repeat them again

Andreas Liljeqvist 17:12:21

Also the changes are nested in some cases

Andreas Liljeqvist 17:12:17

components can also have local additions

Andreas Liljeqvist 17:12:12

I suppose the only way would be with some spec magic, but I think it is frowned upon

shaun-mahood 17:12:38

Maybe take a look at https://github.com/metosin/spec-tools - it has tools for coercion, transformations and walking nested specs (including stripping out extra keys)
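
A rough sketch of what key-stripping could look like with spec-tools, assuming its strip-extra-keys-transformer behaves as the project README describes (some versions want the map spec wrapped in st/spec). The `::thing` spec and attributes are made up:
```
(require '[clojure.spec.alpha :as s]
         '[spec-tools.core :as st])

(s/def :thing/name string?)
(s/def :thing/size int?)
(s/def ::thing (st/spec (s/keys :req [:thing/name] :opt [:thing/size])))

(st/coerce ::thing
           {:thing/name "box" :thing/size 3 :ui/selected? true}
           st/strip-extra-keys-transformer)
;; => {:thing/name "box", :thing/size 3}    ; the :ui/* key is gone
```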

Andreas Liljeqvist 17:12:24

Seems interesting, thanks!

lwhorton 18:12:48

having worked with datomic cloud for the last 3 weeks i’m ruined for postgres. i want to thank and shame everyone responsible for this. /rant

🎉 12
lwhorton 18:12:15

now i have to go find a way to reify transactions, write triggers into an audit table, and compose some horrific transactions and queries

grzm 20:12:37

When using the Datomic cloud client, (d/db ,,,) returns a :database-id value as well as the db-name. This :database-id value isn’t returned by (d/db ,,,) when running in the cloud AFAICT. Can someone confirm this? I’m looking to get a globally unique, consistent value for the database across connections. Alternative ideas welcome.

eraserhd 20:12:12

You mean, two databases are equal when they have equivalent facts?

eraserhd 20:12:38

I've used db values themselves as keys, although there could be two equivalent databases.

eraserhd 20:12:19

And just in case, since I'm trying to get my company to open source this library, what are you doing?

grzm 20:12:27

Nope, making sure I’m connected to the same database I thought I was before. We churn through databases (and Datomic Cloud stacks, for that matter) and I want to be able to confirm I’m connected to the same database I was before.

matthavener 21:12:27

@andreas862 we namespace all the ui artifact keys with :ui/, and then we just use clojure.walk to remove all the keys that (= "ui" (namespace k))
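
A minimal sketch of that clojure.walk approach: recursively drop every map entry whose key has the "ui" namespace before transacting (the sample data is made up):
```
(require '[clojure.walk :as walk])

(defn strip-ui-keys [form]
  (walk/postwalk
    (fn [x]
      (if (map? x)
        (into {} (remove (fn [[k _]]
                           (and (keyword? k) (= "ui" (namespace k))))
                         x))
        x))
    form))

(strip-ui-keys {:thing/name "box"
                :ui/selected? true
                :thing/parts [{:part/id 1 :ui/expanded? true}]})
;; => {:thing/name "box", :thing/parts [{:part/id 1}]}
```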

Andreas Liljeqvist 10:12:43

Thanks, I think that will be the simplest solution

matthavener 21:12:09

fwiw, you could also build a list of all idents the db knows about and then walk the transaction data to remove anything invalid
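
A minimal sketch of that alternative, shown with the peer API (the client API query would look the same); the helper names are made up:
```
(require '[datomic.api :as d]
         '[clojure.walk :as walk])

(defn known-attrs [db]
  (set (d/q '[:find [?ident ...]
              :where [?e :db/ident ?ident]
                     [?e :db/valueType]]   ; only attributes, not other idents
            db)))

(defn keep-known-keys [db tx-data]
  (let [attrs (conj (known-attrs db) :db/id)]  ; :db/id is not an attribute entity
    (walk/postwalk
      (fn [x]
        (if (map? x)
          (into {} (filter (fn [[k _]] (contains? attrs k)) x))
          x))
      tx-data)))
```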