2019-05-23
Channel: # datomic
How fast/slow is datomic.api/as-of supposed to be? What affects its performance when d/pull is used? Without as-of (using the current database), my d/pull-ing of 6K entities takes about 10 secs. With as-of, it takes about 3 minutes. Is this expected, or does this indicate something else is wrong (too little memory?)?
Thanks
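For reference, a minimal sketch of the comparison being described, assuming a hypothetical peer connection conn, a seq of entity ids eids, and a made-up as-of instant:
(require '[datomic.api :as d])

(let [db       (d/db conn)                      ; current database value
      db-as-of (d/as-of db #inst "2019-01-01")] ; historical view (hypothetical instant)
  (time (doall (map #(d/pull db '[*] %) eids)))        ; ~10 secs in the report above
  (time (doall (map #(d/pull db-as-of '[*] %) eids)))) ; ~3 minutes in the report above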
@ivar.refsdal Is this for On-Prem? I don’t have any experience with On-Prem specifically but I can imagine the performance here depends on several factors such as memory, caching, client vs peer, number of datoms, etc. Do you have any additional information you can provide?
Thanks for replying! Yes, this is On-Prem and I'm using the peer library. In VisualVM I see that quite some time is being spent inside Fressian/readUTF8Chars (or something like that). Does that mean it is accessing the network? How would I count the total number of datoms? I did
(format "%,3d" (reduce (fn [cnt _] (inc cnt)) 0 (d/datoms (d/history db) :eavt)))
=> "37,605,542"
What is considered a big amount of datoms?
Edit: I'm doing a pull star on the as-of databases. Would it considerably improve performance if this were narrowed?
Thanks.
Fressian/readUTF8Chars is just decoding a string from a block of fressian-encoded values
do you maybe have any very large or numerous string values that are only seen in as-of?
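On the pull-star question above, a hypothetical illustration of narrowing the pattern; the attribute names here are made up:
(d/pull db '[*] eid)                                    ; pull star: every attribute, including any large strings
(d/pull db '[:order/id :order/status :order/total] eid) ; narrowed: only the attributes you need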
Announcing HTTP Direct for Datomic Cloud: http://blog.datomic.com/2019/05/http-direct-for-datomic-cloud.html Check out the interactive tutorial: https://docs.datomic.com/cloud/livetutorial/http-direct.html
Thanks! Are there case studies for datomic ions out there?
@marshall FYI it was not clear to me that you had to press the spacebar to start the tutorial
hopefully the first of many...
@marshall I see in the latest ion-starter the Clojure version was bumped from 1.9 to 1.10. Does that imply Datomic Cloud now supports Clojure 1.10?
is :db/fulltext supported in Datomic Cloud? i didn't see it in the schema reference docs, but i thought i'd ask just in case.
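For context, a minimal sketch of the On-Prem flag being asked about; it is set when the attribute is defined, and the attribute name here is hypothetical:
{:db/ident       :note/text
 :db/valueType   :db.type/string
 :db/cardinality :db.cardinality/one
 :db/fulltext    true}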
i think i've seen some on-prem examples where some loop function was used to stream values from the transaction log. is something like that possible with Cloud?
In fact, I am building a pump from the tx-log to ElasticSearch for full-text searching @joshkh
how might that tx-range work efficiently? are you grabbing chunks and storing the latest :t somewhere? wouldn't you have to provide it with new starts and ends?
So you’re going with the approach of making the Datom the document instead of reifying the entity into a document?
@joe.lane yeah, the advantage of raw datoms is that you never have to change your indexing code
if you were making full documents in ES, you'd have to rematerialize and re-index every time the needs changed
with raw datoms, when the needs change, you only change the query side that knows how to turn "leaf" hits into the larger entity
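A rough sketch of the "datom as document" idea under discussion; the document shape is an assumption, not anyone's actual implementation:
(defn datom->doc
  "Flatten one datom into a map suitable for indexing as an ES document."
  [d]
  {:e     (:e d)
   :a     (:a d)          ; attribute entity id (resolve to an ident if desired)
   :v     (str (:v d))    ; stringify the value so ES can full-text index it
   :tx    (:tx d)
   :added (:added d)})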
i was working on a little trick to rematerialize entities but without references or components (which can be massive). worked out pretty well when i was manually moving entities to elasticsearch.
i was imagining storing the high water mark in datomic but realised that would cause a loop 😉
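A rough sketch of the polling pump being discussed, using the peer API's d/log and d/tx-range; index-datoms!, read-checkpoint, and write-checkpoint are hypothetical, and the high-water mark is persisted outside Datomic to avoid the feedback loop mentioned above:
(require '[datomic.api :as d])

(defn pump-once
  "Index every transaction after start-t; return the new high-water mark."
  [conn start-t index-datoms!]
  (let [log (d/log conn)]
    (reduce (fn [high {:keys [t data]}]
              (index-datoms! data)  ; e.g. ship each datom to ElasticSearch
              (max high t))
            (or start-t 0)
            ;; nil start = from the beginning; nil end = through end of log
            (d/tx-range log (some-> start-t inc) nil))))

(defn run-pump [conn store index-datoms!]
  (loop [t (read-checkpoint store)]     ; hypothetical checkpoint store
    (let [t' (pump-once conn t index-datoms!)]
      (when (not= t t')
        (write-checkpoint store t'))    ; persist the mark outside Datomic
      (Thread/sleep 5000)
      (recur t'))))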