#datomic
2019-01-29
stijn09:01:55

anyone seen this and knows what the meaning / cause of this error is? (datomic cloud)

stijn09:01:59

{:type clojure.lang.ExceptionInfo
   :message Next offset 3000 precedes current offset 2000
   :data {:datomic.client-spi/request-id 8e372a82-abc5-4a52-805e-fe90786c82f5, :cognitect.anomalies/category :cognitect.anomalies/fault, :cognitect.anomalies/message Next offset 3000 precedes current offset 2000, :dbs [{:database-id fa80ec7a-d124-4cf2-971e-e43c8d7e8516, :t 1885, :next-t 1886, :history false}]}
   :at [datomic.client.api.async$ares invokeStatic async.clj 56]}

joshkh11:01:56

just curious what's going on here. 🙂 when i run my (datomic) clojure project i consistently get messages that some of my deps are being downloaded from datomic's s3 releases bucket. always the same ones.

$ clj -Stree
Downloading: com/amazonaws/aws-java-sdk-cloudwatch/maven-metadata.xml from 
Downloading: com/amazonaws/aws-java-sdk-dynamodb/maven-metadata.xml from 
Downloading: com/amazonaws/aws-java-sdk-kinesis/maven-metadata.xml from 
Downloading: com/amazonaws/amazon-kinesis-client/maven-metadata.xml from 

alexmiller12:01:37

Those are the version metadata files for each of the artifacts. You should only need to download them when there are new releases (which I’d guess are about weekly right now), but they should be cached in your ~/.m2/repository

alexmiller12:01:17

Any chance that’s getting cleaned between builds?

alexmiller12:01:29

Classpaths get cached in your local dir under ./.cpcache too - that should be in front of the m2 stuff, assuming your deps.edn isn't getting updated

joshkh14:01:07

interesting. if i don't update deps.edn then i can see the cache at work: when i run clj -Stree for a second time there are no downloads. however, launching to Ions still triggers the download process.

joshkh11:01:19

i'm guessing it's because i've included the AWS Java SDK, amazonica, and ions, and that there's a version mismatch. might this affect the size of my ions push to CodeDeploy? i've seen a big increase in the overall time to deploy.

alexmiller12:01:50

Might be something with that, prob something the Datomic team could answer better than I

joshkh13:01:37

thanks, alex. in the meantime i'll play around with some exclusions and see where i end up.

Per Weijnitz13:01:45

Hi! I've recently started with Datomic, so please bear with me. Is there a way to perform a full scan of the database (with the purpose of studying the internals while learning)? I've tried things like (d/q '[:find ?e ?a ?v ?tx :where [?e ?a ?v ?tx]] (get-db)) but Datomic refuses with :db.error/insufficient-binding Insufficient binding of db clause: [?e ?a ?v ?tx] would cause full scan.

joshkh13:01:42

a few people have been looking into this for the purpose of cloning a datomic (cloud) db, at least until Cognitect provides official support (please please please). if you're interested in the inner workings and mappings then maybe d/datoms might be of interest? (seq (d/datoms (client/db) {:index :eavt}))
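For readers following along, here is a minimal sketch of walking the whole :eavt index with the Datomic Cloud client API, paginating via :offset/:limit. The connection setup is elided and all-datoms / page-size are illustrative names, not part of any API:

```clojure
;; Sketch: assumes an existing client-API db value `db`,
;; e.g. (d/db conn) from datomic.client.api.
(require '[datomic.client.api :as d])

(defn all-datoms
  "Lazily walk every datom in the :eavt index, fetching
   `page-size` datoms per request. Illustrative only."
  [db page-size]
  (letfn [(step [offset]
            (lazy-seq
             (let [page (d/datoms db {:index  :eavt
                                      :offset offset
                                      :limit  page-size})]
               (when (seq page)
                 (concat page (step (+ offset page-size)))))))]
    (step 0)))

;; usage: (take 10 (all-datoms db 1000))
```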

joshkh13:01:29

even if you could query for everything, i think it would time out

Per Weijnitz15:01:31

@U0GC1C09L d/datoms looks very useful to me in my studies, thanks! Let's hope Cognitect adds support for full scan soon!

joshkh15:01:47

no problem! full scan probably isn't what we want 🙂 maybe a clever way to copy over tables and s3 buckets (although i appreciate it's not that easy). just curious, are you using datomic cloud?

souenzzo17:01:01

add [?a :db/ident ?ident] and then you can query
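Putting that suggestion together with the earlier query: binding ?a through :db/ident gives Datomic a starting point, so the clause set no longer forces a full scan (a sketch; how you obtain db is elided):

```clojure
;; Every attribute entity has a :db/ident, so this clause binds ?a
;; before the [?e ?a ?v ?tx] pattern is resolved.
(d/q '[:find ?e ?ident ?v ?tx
       :where
       [?a :db/ident ?ident]
       [?e ?a ?v ?tx]]
     db)
```

On a non-trivial database this still touches every datom, so expect it to be slow (or to time out on Cloud); it is a learning tool, not a production query.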

Per Weijnitz08:01:50

@U2J4FRT2T @U09R86PA4 Thanks for the advice! I'll dig into it today and see what I can learn.

Per Weijnitz08:01:02

@U0GC1C09L I see, that seems practical indeed. No, I use on-prem with the dev backend. Hmm... that makes me think. Did you ask this because it may be possible to inspect the datoms directly by inspecting the database backend? (postgres table contents for example)

Per Weijnitz09:01:50

@U2J4FRT2T That does indeed work!

souenzzo11:01:45

It's like - "I can't query all data" - "Can you query only valid data?" - "Sure I can!"

eraserhd17:01:08

Is there a protocol I can implement to be able to use a data structure with d/q?

benoit17:01:38

You should be able to pass any collection of tuples as a relation to d/q. (d/q '[:find ?b :in $ ?a :where [?a :a ?b]] [['a :a 'b]] 'a)

eraserhd18:01:38

benoit: I know about that, but this is a Clara Rules session. I could pass the results of a query, I think, but the fact that d/q supports databases and vectors suggested to me that maybe there's a protocol that I can implement.
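There doesn't appear to be a public protocol for pluggable d/q sources, but one workaround is to project the session's facts into tuples and bind them as a relation. A sketch, with made-up data standing in for the result of a Clara query:

```clojure
;; `facts` stands in for e.g. (clara.rules/query session ...)
;; mapped down to plain tuples - a relation is just a
;; collection of tuples as far as d/q is concerned.
(def facts
  [[1 "Ada"] [2 "Grace"]])

(d/q '[:find ?name
       :in [[?id ?name]]
       :where [(> ?id 1)]]
     facts)
;; => #{["Grace"]}
```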

👍 5
souenzzo11:01:20

I'm interested in that too

souenzzo12:01:07

I think #datascript may be a better fit for that.

crowl19:01:25

Hi, can I pull many entities in one query with the datomic client api?

joshkh20:01:22

hey @U44C8GM7T, how do you mean? the client api returns as many different entities as matched in your :where clause. can you elaborate?

joshkh20:01:17

without knowing more this might not answer your question, but here's an example of pulling many (all) "item" entities that have a sku and are on sale, returning all of the entities' attributes:

(d/q '{:find  [(pull ?item [*])]
       :in    [$]
       :where [[?item :item/sku _]
               [?item :item/on-sale? true]]}
     db)
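For completeness, the client API also accepts an arg map with :query and :args, which is convenient when parameterizing the query (a sketch, reusing the attribute names from the example above):

```clojure
;; Same query, arg-map form: ?sale? is supplied via :args
;; instead of being hard-coded in the :where clause.
(d/q {:query '[:find (pull ?item [*])
               :in $ ?sale?
               :where
               [?item :item/sku _]
               [?item :item/on-sale? ?sale?]]
      :args [db true]})
```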