#datomic
2019-10-28
onetom13:10:11

is it possible to use enums in tuple lookup refs? eg, this works:

(d/entity db [:rule/algo+expr [17592186045417 "XXX"]])
but if i use an ident ("enum") in place of that eid, then i just get nil:
(d/entity db [:rule/algo+expr [:rule.algo/regex "XXX"]])
where
(d/pull db '[*] :rule.algo/regex)
=> #:db{:id 17592186045417, :ident :rule.algo/regex}
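(A possible workaround, sketched here for illustration only: resolve the ident to its eid first and use the eid inside the composite lookup ref. This just reuses the db and attribute names from the messages above; whether the ident form should work directly is exactly the open question here.)

;; sketch: turn the ident into an eid before building the composite lookup ref
(let [algo-eid (:db/id (d/entity db :rule.algo/regex))]
  (d/entity db [:rule/algo+expr [algo-eid "XXX"]]))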

marshall14:10:21

Erm. That seems like it should work. Let me look into it

👍 4
onetom15:10:05

in my specific use-case, i think i can use a keyword instead of a ref, but it still looks like a bug and i suspect there are legitimate use-cases which might want to do this

onetom16:10:35

hmmm... i'm still not sure what the lookup-ref should look like 😕 i'm getting this error when i try to transact:

{:txn/id txn-id
 :txn/matches [[:rule/algo+expr [:regex expr]]]}

Execution error (Exceptions$IllegalArgumentExceptionInfo) at datomic.error/arg (error.clj:79).
:db.error/not-an-entity Unable to resolve entity: [:rule/algo+expr [:regex "XXXX"]] in datom [-9223301668109555930 :txn/matches [:rule/algo+expr [:regex "XXXX"]]]

onetom16:10:26

where :txn/matches is

{:db/ident       :txn/matches
 :db/valueType   :db.type/ref
 :db/cardinality :db.cardinality/many}

onetom16:10:12

lookup ref works when using the datom style:

(tx [[:db/add "x" :txn/id 1]
     [:db/add "x" :txn/matches [:rule/algo+expr [:regex "XXX"]]]])
but fails with :db.error/not-an-entity Unable to resolve entity: :regex when using the entity-map style
(tx [{:txn/id      1
      :txn/matches [:rule/algo+expr [:regex "XXX"]]}])

onetom16:10:26

(at least when the tuple attr's 1st element is a keyword, not a ref)

marshall16:10:07

for the card-many you need an extra [] around it

marshall16:10:01

hm. or maybe not

marshall16:10:25

what's the schema definition of :rule/algo+expr?

onetom16:10:37

{:db/ident       :rule/algo+expr
 :db/valueType   :db.type/tuple
 :db/tupleAttrs  [:rule/algo :rule/expr]
 :db/unique      :db.unique/identity
 :db/cardinality :db.cardinality/one}

onetom16:10:42

and yes, i've tried with and without an extra bracket and it works both ways when using entity-map style and only works without when using datom-style, which is quite logical
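(A possible workaround for the entity-map failure above, sketched for illustration: resolve the composite lookup ref to an eid up front and transact the plain eid, reusing the tx helper and db from earlier in the thread.)

;; sketch: resolve the composite lookup ref before building the entity map
(let [match-eid (:db/id (d/entity db [:rule/algo+expr [:regex "XXX"]]))]
  (tx [{:txn/id      1
        :txn/matches [match-eid]}]))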

onetom13:10:57

it looks like :rule.algo/regex is treated just as a scalar (keyword) type

onetom13:10:20

my schema looks like this:

{:db/ident       :rule/algo
 :db/valueType   :db.type/ref
 :db/cardinality :db.cardinality/one}

{:db/ident :rule.algo/regex}
{:db/ident :rule.algo/substr}

{:db/ident       :rule/expr
 :db/valueType   :db.type/string
 :db/cardinality :db.cardinality/one}

{:db/ident       :rule/algo+expr
 :db/valueType   :db.type/tuple
 :db/tupleAttrs  [:rule/algo :rule/expr]
 :db/unique      :db.unique/identity
 :db/cardinality :db.cardinality/one}

onetom13:10:50

in the example from the docs (https://docs.datomic.com/on-prem/schema.html#composite-tuples) there is this txn:

[{:reg/course [:course/id "BIO-101"]
  :reg/semester [:semester/year+season [2018 :fall]]
  :reg/student [:student/email ""]}] 
where :fall is the value of one of the tupleAttrs (:semester/season), whose type is just :db.type/keyword
{:db/ident :semester/season
 :db/valueType :db.type/keyword
 :db/cardinality :db.cardinality/one}
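(For context, the composite attribute being looked up in that docs example is defined along these lines on the same page.)

{:db/ident       :semester/year+season
 :db/valueType   :db.type/tuple
 :db/tupleAttrs  [:semester/year :semester/season]
 :db/cardinality :db.cardinality/one
 :db/unique      :db.unique/identity}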

onetom14:10:11

the same doc page further down says:

### External keys
All entities in a database have an internal key, the entity id. You can use :db/unique to define an attribute to represent an external key.
An entity may have any number of external keys.
External keys must be single attributes, *multi-attribute keys are not supported*.

marshall14:10:17

Well, that's not exactly true anymore bc of tuples

onetom14:10:00

ok, then i understood it correctly

onetom14:10:05

since we are talking about tuples, i've also noticed that datomic-free doesn't support tuple value types. is it going to be updated, or are tuples a pro-only feature?

akiel14:10:08

I have asked Cognitect regarding this issue. The answer was that they don’t plan to add features to the free edition at the moment.

👍 8
akiel14:10:20

You can use the Starter Edition, which is also free.

onetom14:10:05

sure, it's just a bit more troublesome to download for a team, which is just about to learn datomic aaand clojure at the same time... from me...

akiel14:10:50

I know and I also don’t like it. It would be good to write a mail to [email protected], explaining your situation. Doing so may help to change things.

onetom15:10:10

What would you propose as an alternative? I'm not sure how the situation could be improved. It's an awesome technology, so I understand why Cognitect is keeping it on a short leash... The client lib is downloadable without hassle at least.

onetom15:10:10

I would be happy with the free version too, btw, but since I've diligently read thru the last 3 years of changelogs and learnt about the tuple support, now I want it badly :) But I guess I might just step back a bit and use txn functions to implement composite keys, like 3 years ago...

souenzzo15:10:29

Without the free edition it's harder to have awesome tools like https://github.com/vvvvalvalval/datomock and https://github.com/ComputeSoftware/datomic-client-memdb. It's also harder to convince people to use/learn it. It goes from "way easier to configure than SQL: just add the dependency and use it" to "oh, you will need to create an account, add a custom repo and its credentials. You cannot commit your credentials. Then you will have access to one year of updates"... 😞

onetom15:10:46

Which process, I guess, acts as a filter or throttle, so only seriously interested people bother with using Datomic

onetom15:10:18

I agree, it's a pity, but I'm still very grateful that Datomic exists at all :)

onetom15:10:04

I was also pleased to see that tools.deps takes the ~/.m2/settings.xml file into account, and it's even explained how to separate your login credentials from the per-project maven repo settings in your deps.edn
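(A rough sketch of that split, assuming the usual my.datomic.com repo for the pro/starter peer library; the exact version string comes from your account page.)

;; deps.edn (committed with the project) only names the repo and the dependency:
{:mvn/repos {"my.datomic.com" {:url "https://my.datomic.com/repo"}}
 :deps      {com.datomic/datomic-pro {:mvn/version "0.9.xxxx"}}} ; version from your account page

;; ~/.m2/settings.xml (kept out of version control) holds the credentials,
;; with a <server> whose <id> matches the repo name:
;;   <settings>
;;     <servers>
;;       <server>
;;         <id>my.datomic.com</id>
;;         <username>YOUR-EMAIL</username>
;;         <password>YOUR-DOWNLOAD-KEY</password>
;;       </server>
;;     </servers>
;;   </settings>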

onetom15:10:20

All this info is a little too scattered and requires a lot of background knowledge, and I feel bad about it, because I have to explain all these quirks to my colleagues too. I'm sure they will ask "how am I supposed to discover all this on my own?" and they will feel insecure if I have to tell them that they would indeed have a hard time doing this alone...

onetom15:10:36

I'm planning to have DataScript around too, so they can quickly experiment, but I'm not sure how different it is from Datomic, coz I've never used it...

souenzzo15:10:05

DataScript used datomic-free to check whether it implements some features/behavior close to Datomic; unfortunately that can't be done with the new Datomic features... Non-free Datomic is about to kill its small community 😞 (yes, as a previous Peer user and now a Cloud consumer, I'm REALLY sad about it)

onetom15:10:44

why so sad about the cloud version?

souenzzo16:10:06

Despite working in a prime region of a Brazilian capital, I have many issues with my ISP (all that are available in my region). I already lost many days of work due to having no internet connection. At the beginning of my current project, I used datomic-client-memdb to work offline and datomock to create/reuse test scenarios. After the last Datomic update, everything was broken: I needed to re-write my test scenarios and I'm unable to work offline. Also, moving from datomic-client-memdb to the "client proxy", my deftest goes from 0.1ms to 10s, and "run all tests" from 1m to 10m (and it FAILS when my internet goes down).

cjsauer18:10:15

@onetom have you researched datahike? This might be a good middle-ground for your students. https://github.com/replikativ/datahike I’ve been considering switching to it myself for all the same pains that @U2J4FRT2T is feeling, and the fact that it’s open source. Datomic appears entirely uninterested in fostering a community, and so my long-term bet is on something like datahike.

👍 4
onetom18:10:03

No, I have not encountered datahike yet. Thx for putting it on my radar! I also have issues with my internet connectivity (I live on Lamma Island in Hong Kong and usually only get 0.5-3Mbit/s)...

👍 4
akiel14:10:58

This issue is also known. The last update to the free edition is about one year old.

onetom14:10:20

@zach.charlop.powers @alexmiller u were talking about data modeling the other day. what's your take on Hodur? https://www.youtube.com/watch?v=EDojA_fahvM&t=1120s

Alex Miller (Clojure team)14:10:27

sorry, don't know anything about it

onetom14:10:28

regardless, thank you for strangeloop! i've learnt immense amounts from it.

zachcp14:10:25

I haven’t used it but I’ll take a look. thanks @onetom

cjsauer15:10:31

Is there a way to bind a whole datom to a logic variable in a query? Something similar to :as, e.g. :where [[?e ?a ?v :as ?datom]]. I’m looking for an alternative to d/filter given its unavailability in Cloud, and am thinking that I could use rules in order to simulate its effect.

souenzzo15:10:41

[(tuple ?a ?b ?c) ?datom]
[(valid? $ ?a ?b ?c)]
Not sure about performance

cjsauer15:10:25

Ah tuple, I kept trying to destructure with []. Still running into this tho:

"[?a ?v ?e] not bound in expression clause: [(tuple ?e ?a ?v) ?datom]"

souenzzo15:10:33

tuple is a new feature in Datomic, from the last release

souenzzo15:10:50

[(valid? $ ?datom)] *

cjsauer16:10:24

Thanks @U2J4FRT2T, I got a few queries working. You can actually bind the tuple components first before aggregating them, which allows for unification to work in both directions, e.g.

:in $ % ?user
:where
[?e ?a ?v ?tx ?op]
[(tuple ?e ?a ?v ?tx ?op) ?datom]
(authorized? ?datom ?user)
Performance is probably less than ideal tho, like you mentioned.
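(To make that example self-contained, here is a minimal sketch of what such a rule could look like; the :doc/owner attribute is purely hypothetical, and it leans on the untuple query function that shipped alongside tuple.)

;; hypothetical rule set passed in as %: a datom is "authorized" for ?user
;; when the entity it touches is owned by that user
(def rules
  '[[(authorized? ?datom ?user)
     [(untuple ?datom) [?e _ _ _ _]]
     [?e :doc/owner ?user]]])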

souenzzo17:10:58

you can use [?a :db/ident] to avoid the "full db scan" error

👍 4
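(Sketch of that suggestion slotted into the query above: [?a :db/ident] binds ?a to entities that have an ident before the five-element clause runs, so it is no longer an unconstrained scan.)

:in $ % ?user
:where
[?a :db/ident]                        ; constrain ?a first
[?e ?a ?v ?tx ?op]
[(tuple ?e ?a ?v ?tx ?op) ?datom]
(authorized? ?datom ?user)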
cjsauer16:10:34

Related to the above, how big of a bribe is required to get d/filter support in Cloud? 😜 Would be such an amazing way to handle authorization in Ion applications. I can imagine filtering the database per user request based on some authorization rules, which would prevent one from needing to enforce those rules ad-hoc all over the system.

👍 8
onetom18:10:25

if i have a card-many attribute, how can i constrain my results based on its cardinality (something like the HAVING clause in SQL)? the one stackoverflow article i found on this topic recommends nested queries

(d/q '[:find [(pull ?e [* {:txn/matches [*]}]) ...]
       :with ?e
       :where
       [?e :txn/matches ?m]
       [(count ?m) ?matches]
       [(< 1 ?matches)]])

favila19:10:50

another option is d/datoms with a bounded-count for this simple case; anything harder needs a subquery, because you cannot perform aggregation before the :find stage
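(Two sketches of what favila describes, reusing the :txn/matches attribute from earlier in the day; the peer-API d/datoms signature is assumed.)

;; 1. the simple case: does a given entity have more than one match?
(defn multi-match? [db txn-eid]
  (<= 2 (bounded-count 2 (d/datoms db :eavt txn-eid :txn/matches))))

;; 2. the general case: aggregate in an inner query's :find, then filter outside
(->> (d/q '[:find ?e (count ?m)
            :where [?e :txn/matches ?m]]
          db)
     (filter (fn [[_e n]] (< 1 n)))
     (map first))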