This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
- # announcements (5)
- # babashka (105)
- # beginners (92)
- # calva (77)
- # cider (17)
- # cljdoc (8)
- # cljs-dev (8)
- # cljsrn (8)
- # clojure (274)
- # clojure-dev (25)
- # clojure-europe (5)
- # clojure-italy (6)
- # clojure-nl (7)
- # clojure-norway (3)
- # clojure-uk (108)
- # clojurescript (330)
- # code-reviews (4)
- # cursive (6)
- # datomic (37)
- # duct (5)
- # emacs (14)
- # fulcro (23)
- # graphql (1)
- # juxt (1)
- # kaocha (2)
- # leiningen (10)
- # malli (9)
- # music (1)
- # nrepl (12)
- # pathom (21)
- # pedestal (2)
- # planck (4)
- # quil (3)
- # reitit (29)
- # rewrite-clj (10)
- # shadow-cljs (82)
- # spacemacs (29)
- # sql (6)
- # tools-deps (19)
do i see it correctly that the on-prem datomic doesn't provide nested queries via a built-in `q` query function?
the cloud version's documentation mentions this feature at:
```clojure
(d/q '[:find ?track ?name ?duration
       :where
       [(q '[:find (min ?duration)
             :where [_ :track/duration ?duration]]
           $)
        [[?duration]]]
       [?track :track/duration ?duration]
       [?track :track/name ?name]]
     db)
```
ah, nvm, i hadn't realized that i have to quote the inner `q`'s query parameter too.
here is the most minimal example i could come up with (which works on an empty db too):
```clojure
(d/q '[:find (pull ?e [*])
       :where
       [(q '[:find ?x .
             :where [(ground :db/doc) ?x]])
        ?x]
       [?e :db/ident ?x]]
     (d/db conn))
```
so this built-in `q` function is not in the on-prem docs.
it should come after https://docs.datomic.com/on-prem/query.html#missing to be consistent with the cloud docs.
where can i report such documentation issues?
You can use `datomic.api/q` within a query. It’s not a “built-in” function, but you can use it like you can use any function on your classpath.
Query forms are evaluated as if in an environment with `(require '[datomic.api :as d :refer [db q]])`
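As a minimal sketch of what that enables (assuming the on-prem peer library is on the classpath; the peer `q` can also run against plain collections, so no connection is needed here), any fully qualified classpath function is callable in a `:where` clause:

```clojure
(require '[datomic.api :as d])

;; sketch: calling an arbitrary fully qualified classpath fn inside a query.
;; the datasource is a plain collection, so no Datomic connection is required.
(d/q '[:find ?upper
       :in [?s ...]
       :where [(clojure.string/upper-case ?s) ?upper]]
     ["foo" "bar"])
;; => #{["FOO"] ["BAR"]}
```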
ah, i see! so in the cloud version's docs it's important to highlight this, since in such a setup the query is not running in the app's process?
yes; you have no control over requires or ns aliases in cloud, whereas you do on on-prem. Although even in cloud I think it will auto-require fully qualified ns vars, so you can add custom functions to the classpath? I know this happens for transactions, not sure about queries
Guys, can I connect to datomic cloud from multiple services via ? Currently when one of my services is connected to datomic, another one cannot:
[org.eclipse.jetty.client.HttpClientTransport:149] - Could not connect to HttpDestination[
it's also a Solo topology, if it makes a difference (i don't see anything about that in the documentation)
doesn't sound like a datomic-related issue to me.
can you try to access that endpoint directly with netcat from the same machine that the "other service" cannot access it from?
Hey. How can I limit the number of nested results using pull? I have 3 entities, each one with a :db.cardinality/many attribute. When pulling from Datomic I never get results, because the entities have relations to each other and i assume it's trapped in an infinite loop. I am only interested in the first level of relations.
Found the solution to my question here: https://docs.datomic.com/cloud/query/query-pull.html#orga9eca04
You can have pull syntax that recurs only as much as you need. So you might have my-entity-pull-1 that refers to my-entity-pull-2, that refers to my-entity-pull-3. Here my-entity-pull-3 would only have non-references in it. That's how I've limited the recursion, for 'my-entity' in this case.
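That manual composition can be sketched like this (the `:entity/*` attribute names are hypothetical, not the asker's actual schema):

```clojure
;; sketch: capping ref traversal at two hops by composing pull patterns by hand
;; (attribute names are made up for illustration)
(def my-entity-pull-3            ; deepest level: scalar attrs only, no refs
  [:entity/name])

(def my-entity-pull-2
  [:entity/name {:entity/relations my-entity-pull-3}])

(def my-entity-pull-1            ; top level: refs are followed exactly two hops
  [:entity/name {:entity/relations my-entity-pull-2}])

;; used as: (d/pull db my-entity-pull-1 entity-id)
```

For a self-referencing attribute, pull also accepts a numeric recursion limit in place of a subpattern, e.g. `{:entity/relations 1}`.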
hi, I'm having some issues using datomic with core.async. I have the following code:
```clojure
(let [in  (async/chan 200)
      out (async/chan 200)]
  (async/pipeline 4 out (map compute-metrics) in)
  (async/go
    (doseq [item items]
      (async/>! in item)))
  (async/go-loop []
    (println (async/<! out))
    (recur)))
```
The `compute-metrics` function basically saves an item into datomic (after performing a simple computation on one field). I am using `client.api.async` to save the item. It seems to work just fine if the `parallelism` parameter is lower than 5 [for 120 items on my input list], but higher than that it gets stuck after computing the first 8 items.
there is actually a problem with `pipeline`, that it uses a blocking op inside a go block, which I just fixed this week (not yet released), but the fix basically makes it work like
ahh, ok! I was fighting with this problem the whole day, haha. at least I learned a lot about async processes
I am also working on a way to detect this sort of thing in core.async (which is how I found the bug in the first place)
no, I don't think that would have helped here. really, if you're using the async api, you should be able to use pipeline-async I think
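A minimal sketch of that shape (`save-item!` is a made-up stand-in for the real async save, e.g. a client-api transact; `pipeline-async`'s contract is that the supplied fn closes its result channel when the async work finishes):

```clojure
(require '[clojure.core.async :as async])

;; sketch: pipeline-async with an async API. pipeline-async hands the fn each
;; item plus a fresh result channel, which the fn must close when done.
(defn save-item! [item result-ch]
  (async/go
    (async/>! result-ch {:saved item})  ; pretend the async save succeeded
    (async/close! result-ch)))

(def results
  (let [in  (async/chan 200)
        out (async/chan 200)]
    (async/pipeline-async 4 out save-item! in)
    (async/onto-chan in (range 10))     ; feed the items, then close `in`
    (async/<!! (async/into [] out))))   ; drain `out`; it closes when `in` is exhausted
```

`pipeline-async` preserves input order in `out`, so `results` comes back in the order the items went in.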
I see, but when I take a connection from the channel returned by `(d-async/connect)`, does it have different properties than the sync version? I could not find much info about the distinction between these two libraries, to be honest