

Thanks everyone for your insights on my question about Datomic vs. QLDB


Am I thinking too boldly about Datomic, or is it possible to have entirely purely functional, non-trivial business logic, similar to how one could do it with a plain Clojure map? That is, passing in the state (database) at the top, then adding, updating, retracting, getting etc. in any combination, and lastly doing something similar to a "swap!" or "reset!"? Preferably I'd like to get a "new" database instance back after every step, so that I can continue to query and alter the database state until I'm happy. Also, if something goes wrong in the middle I obviously don't want anything committed to the actual database. I imagine transaction functions, as in Datomic Ions, ought to make this possible?


That's not exactly how it works, but you can achieve essentially the same power. Datomic Connections and Database values correspond roughly to Clojure Agents and values, respectively. However, unlike regular Clojure values, you don't update a Database value by calling conj, assoc etc. - you do it by emitting Datoms. Transaction functions enable you to accept a 'present' Database value, plus optional additional arguments, and to emit an arbitrary set of Datoms which will be added to form the next Database value.
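A transaction function along these lines might look like the following sketch. Everything here is illustrative: the `:account/id` and `:account/balance` attributes are made up, and `:account/id` is assumed to be `:db.unique/identity` so that lookup refs work.

```clojure
;; Sketch of a transaction function: it receives the current database
;; value as its first argument and returns tx-data (datoms to assert),
;; never mutating anything itself. Attribute names are hypothetical,
;; and :account/id is assumed to be :db.unique/identity.
(defn transfer
  [db from-id to-id amount]
  (let [balance (fn [id]
                  (get (d/pull db [:account/balance] [:account/id id])
                       :account/balance 0))]
    [[:db/add [:account/id from-id] :account/balance (- (balance from-id) amount)]
     [:db/add [:account/id to-id]   :account/balance (+ (balance to-id) amount)]]))
```

Because the function only reads `db` and returns data, it stays pure; the transactor applies the returned datoms atomically, or not at all if anything throws.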


So the model is not that you fabricate intermediary database values and 'commit' to the last one. However, you can use speculative writes (aka `d/with`) to 'preview' the Database value that would be yielded by adding the Datoms you're considering.
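In the peer API that looks roughly like the sketch below (the Cloud client API is similar, except you first obtain a with-db via `d/with-db`); the attribute names are illustrative:

```clojure
;; Speculative write with the peer API: d/with applies tx-data to a
;; database value and returns a map whose :db-after is the would-be
;; next database value, without committing anything.
(let [db      (d/db conn)
      tx-data [{:account/id 42 :account/balance 100}] ; illustrative
      {:keys [db-after]} (d/with db tx-data)]
  ;; Query the hypothetical future database; the stored database is untouched.
  (d/q '[:find ?b .
         :where [?e :account/id 42]
                [?e :account/balance ?b]]
       db-after))
```

Since `db-after` is just another immutable database value, you can chain further `d/with` calls on it to preview several steps before deciding what to actually transact.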


Thank you @ for the reply! It remains, then, to figure out a reasonable architecture that lets me output the datoms in a sensible way. :) Also, regarding unit testing, is there any way to get an in-memory equivalent to a Datomic Cloud/Ions database value that supports the same query/pull API? The idea is to transact (plain) datoms to a clean database, and then query it using the function I'm presumably testing. Would either the Datomic Free database or an in-memory peer with a wrapper like do the trick? By the way, I found your awesome blog the other day. I have not yet managed to digest all the Datomic-specific parts, but I have still learned a lot; the post about event sourcing especially was enlightening! :clap:


At my job we are using ‘on-prem’ Datomic run on AWS EC2 because we considerably pre-date the advent of Cloud and Ions, so we haven’t had a need to wrap the in-memory peer, but otherwise we do exactly as you describe for unit tests and CI. We have a manageable amount of test data that just lives in source control as datoms in EDN, and before a test run we stand up an in-mem DB, transact our schema to it, and transact in the test data.
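Under those assumptions (peer library, schema and test data kept as EDN in source control), the setup might be sketched like this; the URI and file paths are made up:

```clojure
(require '[datomic.api :as d]
         '[clojure.edn :as edn])

;; Stand up a fresh in-memory database, transact the schema, then the
;; test data, and hand back the connection. Paths are illustrative.
(defn fresh-test-conn []
  (let [uri "datomic:mem://test"]
    (d/delete-database uri)   ; ensure a clean slate between runs
    (d/create-database uri)
    (let [conn (d/connect uri)]
      @(d/transact conn (edn/read-string (slurp "resources/schema.edn")))
      @(d/transact conn (edn/read-string (slurp "resources/test-data.edn")))
      conn)))
```

Each test (or test namespace) can call `fresh-test-conn` to get an isolated database, since in-memory databases are cheap to create and throw away.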


A nice knock-on effect is that we also have a server-dev Boot task which stands up that same “test” in-mem DB and also starts up our backend services locally, pointed to it. One fully-operational but local backend stack, ready to attach to a REPL, please! :smile:


That's awesome! :smile: I hope that I'll be able to construct something similar.


@ I think you can attempt that by using d/with-db and d/with, although the bookkeeping of what data is tx-data would be your responsibility.


Thanks for the suggestion @ ! However, the scenario I asked about in the first post does not seem to be very idiomatic, so I believe I'd be better off just doing it "the right way" instead. I guess d/with and d/with-db could be helpful in some situations though, especially in tests maybe?


In our SPA we do a lot of local UI bookkeeping, resulting in a couple of keys that won't be accepted by Datomic for transact. What is a good way to dissoc them, or to select only the relevant keys? We have a spec for what a complete entity should be.


It sure would, but I would rather not repeat them again


Also the changes are nested in some cases


components can also have local additions


I suppose the only way would be with some spec magic, but I think it is frowned upon


Maybe take a look at - it has tools for coercion, transformations and walking nested specs (including stripping out extra keys)


Seems interesting, thanks!


having worked with datomic cloud for the last 3 weeks, i’m ruined for postgres. i want to thank and shame everyone responsible for this. /rant


now i have to go find a way to reify transactions, write triggers into an audit table, and compose some horrific transactions and queries


When using the Datomic cloud client, (d/db ,,,) returns a :database-id value as well as the db-name. This :database-id value isn’t returned by (d/db ,,,) when running in the cloud AFAICT. Can someone confirm this? I’m looking to get a globally unique, consistent value for the database across connections. Alternative ideas welcome.


You mean, two databases are equal when they have equivalent facts?


I've used db values themselves as keys, although there could be two equivalent databases.


And just in case, since I'm trying to get my company to open source this library, what are you doing?


Nope, making sure I’m connected to the same database I thought I was before. We churn through databases (and Datomic Cloud stacks, for that matter) and I want to be able to confirm I’m connected to the same database I was before.


@andreas862 we namespace all the ui artifact keys with :ui/, and then we just use clojure.walk to remove all the keys that (= "ui" (namespace k))
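That approach can be sketched with `clojure.walk/postwalk`, which also handles the nested cases mentioned earlier; the entity map in the usage example is made up:

```clojure
(require '[clojure.walk :as walk])

;; Strip UI-only keys (anything namespaced :ui/*) from possibly nested
;; entity maps before handing the data to Datomic's transact.
(defn remove-ui-keys [form]
  (walk/postwalk
    (fn [x]
      (if (map? x)
        (into {} (remove (fn [[k _]]
                           (and (keyword? k) (= "ui" (namespace k))))
                         x))
        x))
    form))

(remove-ui-keys {:user/name "Ada" :ui/selected? true
                 :user/address {:address/city "Paris" :ui/expanded? true}})
;; => {:user/name "Ada", :user/address {:address/city "Paris"}}
```

`postwalk` visits the innermost maps first, so nested component maps get cleaned before their parents.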


Thanks, I think that will be the simplest solution


fwiw, you could also build a list of all idents the db knows about and then walk the transaction data to remove anything invalid