
Quick question, which I think I know the answer to but want to sanity check.


We’re using Datomic with a DynamoDB back-end. I was just looking at AWS QuickSight, and I imagine that even though we’re using DynamoDB as the store, we wouldn’t be able to pull data directly from DynamoDB into QuickSight, because Datomic would obfuscate the data in Dynamo somehow. Is that correct?


xsyn: That's accurate. DynamoDB is just the backing store. The data itself is encoded by Datomic.


Thanks very much


Need to get generators working, pull syntax, and fix some stuff up around rule-vars


But it's a start


I wonder if Cognitect will be doing the same officially


Not sure if anyone else has started on it yet


Query generators are a fascinating idea 🙂


I'm impressed with how easy it is to translate the grammar to spec.


My goal here is to be able to test.check some query-generating code I'm going to write
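(A minimal sketch of the kind of grammar-to-spec translation being described, using the `clojure.spec.alpha` namespaces spec later shipped under; the spec names `::variable` and `::find-spec` are illustrative, not Datomic's own:)

```clojure
(require '[clojure.string :as str]
         '[clojure.spec.alpha :as s])

;; A tiny, hypothetical corner of the Datomic query grammar in spec:
;; a :find clause is the keyword :find followed by one or more ?vars.
(s/def ::variable (s/and symbol? #(str/starts-with? (name %) "?")))
(s/def ::find-spec (s/cat :find #{:find} :vars (s/+ ::variable)))

(s/conform ::find-spec [:find '?e '?name])
;; => {:find :find, :vars [?e ?name]}

;; With test.check on the classpath, (s/exercise ::find-spec)
;; yields generated :find clauses to feed into properties.
```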

Ben Kamphaus 14:05:14

No promises about the future. 🙂 Though the general background is that, as with most companies, Cognitect designs and ships things that solve problems we have or that our customers have.

Ben Kamphaus 14:05:08

You can typically assume that Datomic will take advantage of new Clojure language features (with Clojure version adoption lag) and that Datomic itself will, from time to time, motivate new language features in Clojure.


@bkamphaus: Can you elaborate on the specific problems that drove you to spec?


@potetm: The motivations for clojure.spec are described in the guide


Yeah I was curious whether there were specific projects you guys had that pushed this to the fore.


So I guess when I said "problems" I meant "projects" 🙂


(I know there's only so much you can/will divulge. I was just curious, so I thought I'd ask.)


Is it true that transactions should not have more than one million datoms in them?

Ben Kamphaus 20:05:10

@currentoor: “should not have” is pretty strong phrasing, but transactions of that size are likely to run into performance issues (best performance is at around ~40k datoms or so per transaction). Do you have transactions of that size that represent an atomic boundary in the domain?


@bkamphaus: not in the domain, I just need to migrate some old data


so this script will just run once and won't be user-facing

Ben Kamphaus 20:05:13

@currentoor: if it's not atomic in the domain I would break it up into smaller transactions and pipeline it, as per the example here:
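(A rough sketch of that batch-and-pipeline pattern. The `transact!` argument stands in for something like `#(datomic.api/transact-async conn %)` — that wiring, and the batch/window sizes, are assumptions for illustration, not the linked example:)

```clojure
(defn transact-batches
  "Split tx-data into batches of batch-size tx maps, submit each with
   transact! (which should return a derefable future), and deref up to
   `window` submissions at a time. Sketch only: error handling,
   retries, and backoff are omitted."
  [transact! tx-data batch-size window]
  (->> (partition-all batch-size tx-data)
       (map transact!)          ; one future per batch, launched lazily
       (partition-all window)   ; roughly bounds in-flight work (modulo seq chunking)
       (run! (fn [futs] (run! deref futs)))))

;; e.g. (transact-batches #(d/transact-async conn %) tx-data 1000 10)
```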


@bkamphaus: thanks I'll check it out.