
There is an example in the docs under Functional Expressions which says:

;; this query will not work!!!
[:find ?celsius .
 :in ?fahrenheit
 :where [(/ (- ?fahrenheit 32) 1.8) ?celsius]]

;; use multistep instead
[:find ?celsius .
 :in ?fahrenheit
 :where [(- ?fahrenheit 32) ?f-32]
        [(/ ?f-32 1.8) ?celsius]]
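For context, the multi-step version runs fine with only scalar inputs and no database, since it uses nothing but function expressions. A sketch, assuming the on-prem peer API (`datomic.api`):

```clojure
(require '[datomic.api :as d])

;; pass 212.0 °F as the :in binding; no db input is needed because the
;; :where clauses are pure function expressions
(d/q '[:find ?celsius .
       :in ?fahrenheit
       :where [(- ?fahrenheit 32) ?f-32]
              [(/ ?f-32 1.8) ?celsius]]
     212.0)
;; => 100.0
```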
On the other hand, the example below works for me. I'm just wondering why a single lambda isn't considered a cleaner approach compared to the multi-step calculation.
[:find ?celsius
 :in ?fahrenheit
 :where [(#(/ (- % 32) 1.8) ?fahrenheit) ?celsius]]


I'm kinda surprised the lambda works in Datomic Cloud with the Client API


Conceptually, the query is data, not code, and isn't meant to have eval run on it. There may also be inefficiencies from creating multiple function objects, but I don't know where the function is created or whether it's created more than once.


IOW this feels like a bug and you shouldn’t rely on this behavior; it may even be a security hole waiting to happen.


(I am not a Cognitect though)


I've also used the lambda approach in the past, for the same reason: to get around the limitation that query variables are recognized only at the top level of an expression, not at nested levels.


Are you supposed to transact the schema every time you open the db connection?


No, transact only if something needs to change.

👍 2
Linus Ericsson 07:05:29

If you are using on-prem, the conformity library is good.


I usually have some kind of an ensure-schema function, which speculatively transacts the schema with d/with; if the :tx-data of the speculative transaction result would have more than one datom, then I actually transact it. It's not the cleanest if you have a cluster of compute instances, but we don't have such complicated needs. I've also generalized this to an arbitrary sequence of transactions, because we also need to ensure some seed data, not just schema attributes.
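A minimal sketch of that idea, assuming the on-prem peer API (`datomic.api`); the function names `ensure-tx!` and `ensure-txs!` are made up for illustration:

```clojure
(require '[datomic.api :as d])

(defn ensure-tx!
  "Speculatively applies tx-data with d/with; transacts for real only
  if it would actually change something."
  [conn tx-data]
  (let [report (d/with (d/db conn) tx-data)]
    ;; every transaction adds at least one datom (the :db/txInstant on
    ;; the tx entity itself), so more than one datom means the tx would
    ;; make a real change
    (when (> (count (:tx-data report)) 1)
      @(d/transact conn tx-data))))

(defn ensure-txs!
  "Generalization to an ordered sequence of transactions, e.g. schema
  followed by seed data."
  [conn txs]
  (doseq [tx txs]
    (ensure-tx! conn tx)))
```

Note this is not race-free across multiple compute instances, as the message above says; a single-writer setup or an idempotent schema makes that mostly harmless.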

👍 1