
from > "Datomic runs queries on the client, not on the server" — so is this the equivalent of the consumer pulling all recs in all 'tables' listed in the query and then querying/filtering against all those recs client-side?


^ Second to that, what would be the best hands-on way to see this happening (i.e. see all recs on the current client)?


Hey there, are there any well-known examples of REST APIs backed by Datomic with code available? I am trying to build an app and am hitting some hurdles. Would love to see an existing example.


@val_waeselynck awesome, thanks!


It's more the equivalent of the database server having to have the working set in memory


And in these days, when servers are cheap, why couldn't it just be your app that has that? :)


i'm having trouble retracting an entity using a ref lookup:

(defn unban [userIdentifier]
  (d/transact (connection/connect)
    [[:db.fn/retractEntity [:ban/user userIdentifier]]]))
with :ban/user being defined:
{:db/ident :ban/user
 :db/valueType :db.type/ref
 :db/cardinality :db.cardinality/one
 :db/unique :db.unique/value
 :db/doc "The banned user"}
it appears to work because i see the usual db-before/db-after, but the entity is still there when i query with a fresh d/connect. any ideas?


@U7Y912XB8 1- do you see the datoms being retracted in the tx-result? 2- what connection do you connect to, and when relative to calling transact?


@val_waeselynck the tx result:

{:status :ready, :val {:db-before datomic.db.Db@2e4a80a8, :db-after datomic.db.Db@937aaeac, :tx-data [#datom[13194139534372 50 #inst "2018-01-26T05:59:57.225-00:00" 13194139534372 true]], :tempids {}}}


(connection/connect) just runs (d/connect) with my db url


and when I query I am doing (d/db (d/connect ...)) to get the latest db


and i'm seeing the correct db-after id when i query again


@U7Y912XB8 this :tx-data shows that the transaction had essentially no effect - no additions nor retractions except for adding a :db/txInstant timestamp to the transaction entity.
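For anyone hitting the same symptom, here is a hypothetical sketch (on-prem peer API; `conn` and the lookup ref are assumptions, not from this thread) of checking the tx-result before trusting that the retraction happened:

```clojure
(require '[datomic.api :as d])

;; Hypothetical sketch: transact a :db.fn/retractEntity via a lookup ref
;; and count how many datoms were actually retracted.
(defn retract-entity-checked!
  "Returns the number of datoms retracted by the transaction.
  0 means the tx was effectively a no-op: the only datom added
  was the transaction's own :db/txInstant, as in the tx-result above."
  [conn lookup-ref]
  (let [{:keys [tx-data]} @(d/transact conn [[:db.fn/retractEntity lookup-ref]])]
    ;; retractions are the datoms whose :added flag is false
    (count (remove :added tx-data))))

;; (retract-entity-checked! conn [:ban/user some-user-eid])
```

A count of zero points the investigation at the lookup ref itself (e.g. whether the value you pass actually matches what is stored under the unique attribute) rather than at the connection or the query.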


Hey! When you say Solo from about 1$ / day, does that include licensing costs?


i believe so @maxt - you can view the details of the price calculator, and see it split between vendor and AWS


Thank you, I hadn't seen the calculator. Seems to be about $1/3 for the license and $2/3 for AWS for minimal Solo.


The calculator acts a little funny though, I get the same or lower quote if I change to the production fulfillment.


@maxt the AWS calculator design predates CloudFormation templates, so it does funny things. You should drive by the instance types, not the fulfillment


t2.small is only for Solo, and i3.large is only for Production


@maxt there are no “licensing costs” with Cloud, just usage markup on EC2 instances. That is how the AWS marketplace works. Solo is about $1/day, total


@stuarthalloway Wonderful, thank you!


in cloud, when calling (client-api/delete-database client {:db-name "<my-db>"}), is the db's data being excised?


as in “all resources reclaimed” — albeit not necessarily immediately


@stuarthalloway will the “deleted” data eventually be overwritten on disk or does it stay intact?


Datomic calls delete operations on the underlying stores: DDB, S3, and EFS


not sure if intentional, but just a heads up that this section doesn’t have a link at the top like the others


(catch ExceptionInfo t
  (if (= (:cognitect.anomalies/category (ex-data t)) :cognitect.anomalies/conflict)
Are anomalies coming to the peer API?
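For context, a fuller (hypothetical) version of that catch pattern, assuming a client-API `conn` and a `tx` that may conflict, might look like this; the single-retry policy is a made-up example, not from the snippet above:

```clojure
(require '[datomic.client.api :as d])
(import 'clojure.lang.ExceptionInfo)

;; Hypothetical sketch: retry a transaction once when the client API
;; signals a :cognitect.anomalies/conflict in the thrown ex-data,
;; and rethrow every other anomaly untouched.
(defn transact-with-conflict-retry
  [conn tx]
  (try
    (d/transact conn {:tx-data tx})
    (catch ExceptionInfo t
      (if (= (:cognitect.anomalies/category (ex-data t))
             :cognitect.anomalies/conflict)
        (d/transact conn {:tx-data tx}) ; naive single retry
        (throw t)))))
```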


I just wanted to take a moment to say thanks to @val_waeselynck for his Datomock library. It's been a massive improvement to my team's dev workflow, making it completely trivial to always work against the latest production data with complete confidence that we're not going to break anything. IMHO, Datomock is one of the most useful tools to come out of the Clojure community in recent times, and I suggest that anyone using Datomic via the peer API give serious thought to whether they'd find it useful. And of course, thanks as well to the Datomic team for the fundamental insights that make a tool like Datomock even possible! Speculative transactions are just an amazingly powerful tool.
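The speculative transactions praised here are the peer API's d/with. A minimal sketch (the `conn` binding and the :user/name attribute are made up for illustration):

```clojure
(require '[datomic.api :as d])

;; Minimal sketch of a speculative transaction: d/with applies the tx
;; to an in-memory value of the database, leaving the real connection
;; and storage completely untouched.
(let [db              (d/db conn)
      {db' :db-after} (d/with db [{:db/id "tmp" :user/name "alice"}])]
  ;; db' sees the speculative datoms; db and the connection do not.
  (d/q '[:find ?e . :where [?e :user/name "alice"]] db'))
```

This is the primitive Datomock builds on: it forks a connection whose "transactions" are just d/with calls against a local db value.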


Please vote here for even better speculative tx's


@U2J4FRT2T already voted for that one, it'd be excellent to have 🙂


Wow, thanks for the kind words @U077BEWNQ 🙂 it's very good to know that other people find it useful. As you said, the credit goes to the authors of Datomic who really, really got the fundamentals right


@val_waeselynck some truth to that, but BOY is it easier to use it in a dev workflow with Datomock added 🙂


In related news: I’m also getting great value out of the scope-capture lib. Val, you are doing great work


Bit off-topic, but is there a way to connect to a "datomic:" URI from a local REPL using an SSH tunnel or something like that?


Is there anything in datomic to grab the 10 biggest values in ~constant-time?


I think the answer is no… but I think in theory it’s possible.


(Also, I’m talking about on-prem)

Ben Kamphaus 22:01:18

@potetm depending on the assumptions you can make about the range of values that might be there, you could at least beat the perf of the naive case using index-range or seek-datoms. Real bottleneck you’re up against, or more of a perf golf/curiosity thing?
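A hedged sketch of the index-range idea (on-prem peer API; the :score attribute and the lower-bound guess are assumptions): :avet iteration is ascending, so you still take the tail of the seq, but a good lower-bound guess shrinks the scan well below the naive full-index walk:

```clojure
(require '[datomic.api :as d])

;; Hypothetical sketch: find the 10 biggest values of :score.
;; Instead of walking the entire :avet index and taking the last 10
;; (the naive approach), start the scan at a guessed lower bound.
(defn top-10-scores
  [db lower-bound-guess]
  (->> (d/index-range db :score lower-bound-guess nil) ; ascending from guess
       (map :v)
       (take-last 10)))

;; If the guess is too high and fewer than 10 datoms come back,
;; fall back to a lower guess (or to the full scan).
```

This beats the naive case only when the guess is good, which matches the caveat above about what assumptions you can make about the range of values.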


Perf golf for sure


Thanks @bkamphaus! Hope all is well with you!