#datomic
2019-10-31
onetom05:10:46

do i see it correctly that on-prem datomic doesn't provide nested queries via a built-in q query function? the cloud version's documentation mentions this feature at: https://docs.datomic.com/cloud/query/query-data-reference.html#q

(d/q '[:find ?track ?name ?duration
       :where
       [(q '[:find (min ?duration)
             :where [_ :track/duration ?duration]]
           $) [[?duration]]]
       [?track :track/duration ?duration]
       [?track :track/name ?name]]
     db)

onetom05:10:36

ah, nvm, i hadn't realized that i have to quote the inner q's query parameter too. here is the most minimal example i could come up with (which works on an empty db too):

(d/q '[:find (pull ?e [*])
         :where
         [(q '[:find ?x . :where [(ground :db/doc) ?x]]) ?x]
         [?e :db/ident ?x]]
       (d/db conn))

onetom05:10:06

so this built-in q function is not in the on-prem docs. it should come after https://docs.datomic.com/on-prem/query.html#missing to be consistent with the cloud docs. where can i report such documentation issues?

csm06:10:53

You can use datomic.api/q within a query. It’s not a “built-in” function, but you can use it like you can use any function on your class path.

favila14:10:40

Query forms are evaluated as if in an environment with (require '[datomic.api :as d :refer [db q]])

favila14:10:11

that’s why bare “q” works and seems special

favila14:10:22

It’s really datomic.api/q
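For reference, this means the nested query from the first message can also be written with the fully qualified name, with no reliance on the implicit alias. A sketch, reusing the :track/* attributes from that example (not a fixed schema):

```clojure
;; Same nested min-duration query as above, but calling the inner query
;; var by its fully qualified name instead of the bare `q` that the
;; implicit (require '[datomic.api :refer [q]]) makes available.
(d/q '[:find ?track ?name ?duration
       :where
       [(datomic.api/q '[:find (min ?duration)
                         :where [_ :track/duration ?duration]]
                       $) [[?duration]]]
       [?track :track/duration ?duration]
       [?track :track/name ?name]]
     db)
```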

onetom15:10:13

ah, i see! so, in the cloud version's doc it's important to highlight this, since in such a setup, the query is not running in the app's process?

favila15:10:21

yes; you have no control over requires or ns aliases in the cloud whereas you do on on-prem. Although even in cloud I think it will auto-require fully qualified ns vars, so you can add custom functions to the classpath? I know this happens for transactions, not sure for queries

Oleh K.16:10:23

Guys, can I connect to datomic cloud from multiple services via .<system>.<aws_zone>. ? Currently when my one service is connected to datomic another one cannot

onetom16:10:02

what is the error message u get?

Oleh K.16:10:27

[org.eclipse.jetty.client.HttpClientTransport:149] - Could not connect to HttpDestination[.<system>.]6e48b9ed,queue=1,pool=DuplexConnectionPool[c=1/64,a=0,i=0]

Oleh K.16:10:04

<system> is a real name

Oleh K.16:10:15

the service is running in the same instance as the main one (in datomic vpc)

Oleh K.17:10:02

it's also a Solo topology, if it makes a difference (don't see anything about that in the documentation)

onetom17:10:12

doesn't sound like a datomic-related issue to me. can you try to directly access that endpoint with netcat from the same machine where that "other service" cannot access it? nc entry.<system>. 8182

jherrlin18:10:57

Hey. How can I limit the number of nested results using pull? I have 3 entities, each with a :db.type/ref / :db.cardinality/many attribute. When pulling from Datomic I never get results, because the entities have relations to each other and I assume it's trapped in an infinite loop. I am only interested in the first level of relations.

jherrlin19:10:54

Hmm, it didn't solve my problem. Don't really grasp what it did though

cjmurphy23:10:30

You can have pull syntax that recurs only as much as you need. So you might have my-entity-pull-1 that refers to my-entity-pull-2, that refers to my-entity-pull-3. Here my-entity-pull-3 would only have non-references in it. That's how I've limited the recursion, for 'my-entity' in this case.
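Datomic's pull grammar also supports a recursion limit directly: in a map spec, a positive number (or ...) in place of a nested pattern bounds how deep a recursive ref is followed. A minimal sketch, assuming a hypothetical :node/children ref attribute (cardinality many):

```clojure
;; {attr limit} caps recursive traversal of a ref attribute.
;; With a limit of 1, :node/children is followed one level and then
;; stops, so mutually referencing entities can't recurse forever.
(d/pull db '[:db/id :node/name {:node/children 1}] node-eid)
```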

bartuka20:10:39

hi, I'm having some issues using datomic with core.async. I have the following code:

(let [in (async/chan 200)
      out (async/chan 200)]
  (async/pipeline 4 out (map compute-metrics) in)
  (async/go (doseq [item items] (async/>! in item)))
  (async/go-loop []
    (println (async/<! out))
    (recur)))
And the compute-metrics function basically saves an item into datomic (after performing a simple computation on one field). I am using client.api.async to save the item. It seems to work just fine if the parallelism parameter is lower than 5 [for 120 items on my input list], but higher than that it gets stuck after computing the first 8 items.

alexmiller20:10:43

can you reproduce if you use pipeline-blocking instead?
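The suggested swap is one name in the snippet above (compute-metrics and items remain the user's own definitions):

```clojure
;; pipeline-blocking runs the transducer on dedicated threads, so a
;; step that does blocking I/O (like a Datomic write) cannot tie up
;; the small fixed thread pool that go blocks share.
(let [in (async/chan 200)
      out (async/chan 200)]
  (async/pipeline-blocking 4 out (map compute-metrics) in)
  (async/go (doseq [item items] (async/>! in item)))
  (async/go-loop []
    (println (async/<! out))
    (recur)))
```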

bartuka20:10:09

I had the same issue using pipeline-async but haven't tried the blocking version

bartuka20:10:25

I might be able to run it very quickly here, brb

alexmiller20:10:30

I'm certain that the issue is that the go block threads are all blocked

alexmiller20:10:52

so a thread dump would reveal what blocking op they are blocked on

bartuka20:10:33

yes, just worked (Y)

alexmiller20:10:43

there is actually a problem with pipeline, in that it uses a blocking op inside a go block, which I just fixed this week (not yet released); the fix basically makes it work like pipeline-blocking

bartuka20:10:44

can you help me understand a little better this process?

alexmiller20:10:39

so yeah, this is a bug in core.async that I'll release soon

bartuka20:10:30

ahn, ok! I was fighting with this problem the whole day rsrrs at least I learned a lot about async processes

alexmiller20:10:10

I am also working on a way to detect this sort of thing in core.async (which is how I found the bug in the first place)

bartuka20:10:30

If I used the datomic sync api I would have succeeded too?

alexmiller20:10:06

no, I don't think that would have helped here. really, if you're using the async api, you should be able to use pipeline-async I think

bartuka20:10:16

I see, but when I take a connection from the channel returned by (d-async/connect), does it have different properties than the sync version? I could not find much info about the distinction between these two libraries, to be honest

alexmiller20:10:41

sorry, I'm not much of an expert on this particular area

bartuka20:10:25

np, thanks for the help.. saved the day o/