This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2022-11-08
Channels
- # aleph (9)
- # announcements (42)
- # babashka (13)
- # babashka-sci-dev (9)
- # beginners (38)
- # biff (1)
- # calva (4)
- # cider (5)
- # clj-kondo (39)
- # cljdoc (4)
- # cljsrn (3)
- # clojure (93)
- # clojure-bay-area (1)
- # clojure-czech (1)
- # clojure-dev (4)
- # clojure-europe (65)
- # clojure-finland (3)
- # clojure-nl (2)
- # clojure-norway (7)
- # clojure-portugal (1)
- # clojure-uk (2)
- # clojurescript (73)
- # cloverage (1)
- # cursive (5)
- # data-science (1)
- # datahike (22)
- # emacs (51)
- # graalvm (6)
- # introduce-yourself (8)
- # jobs-discuss (14)
- # kaocha (6)
- # mount (5)
- # nbb (19)
- # off-topic (19)
- # reagent (5)
- # releases (1)
- # sci (19)
- # scittle (4)
- # shadow-cljs (6)
- # tools-deps (9)
- # xtdb (2)
I've run into a perplexing issue with datahike trying to transact some new attributes, specifically a composite tuple for uniqueness. I was getting a schema error telling me I'm missing certain required keys on the attribute (`:db/ident`, `:db/valueType`, or `:db/cardinality`) even though those keys were indeed present. This was on a test db, so I deleted the whole db and re-created it, and now I can't get any attributes to add. I even copied the basic example from the repo and it's failing with the same error (missing required keys). Help?
Using the latest datahike, `0.5.1517`. I recently upgraded from `0.5.1516`, so I tried downgrading again, but I'm still getting the same issue.
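(For anyone else trying to roll back: one way is to pin the version in `deps.edn` — a sketch, assuming the usual `io.replikativ/datahike` Clojars coordinates; double-check the exact version string you want:

```clojure
;; deps.edn — pin datahike to a known-good release
;; (the version shown here is the previous one mentioned above)
{:deps {io.replikativ/datahike {:mvn/version "0.5.1516"}}}
```

then `clj -Sforce` so the cached classpath is rebuilt and the pinned version actually takes effect.)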
(d/transact (db-conn) [{:db/ident :test/name
:db/valueType :db.type/string
:db/cardinality :db.cardinality/one}
{:db/ident :test/age
:db/valueType :db.type/long
:db/cardinality :db.cardinality/one}])
(this was copied from the datahike GitHub README). This fails with:
1. Unhandled clojure.lang.ExceptionInfo
Incomplete schema transaction attributes, expected :db/ident, :db/valueType,
:db/cardinality
{:error :transact/schema,
:entity
#:db{:ident :test/name,
:valueType :db.type/string,
:cardinality :db.cardinality/one,
:id nil}}
transaction.cljc: 519 datahike.db.transaction$check_schema_update/invokeStatic
transaction.cljc: 505 datahike.db.transaction$check_schema_update/invoke
transaction.cljc: 545 datahike.db.transaction$entity_map__GT_op_vec/invokeStatic
transaction.cljc: 522 datahike.db.transaction$entity_map__GT_op_vec/invoke
transaction.cljc: 774 datahike.db.transaction$transact_tx_data/invokeStatic
transaction.cljc: 738 datahike.db.transaction$transact_tx_data/invoke
core.cljc: 133 datahike.core$with/invokeStatic
core.cljc: 126 datahike.core$with/invoke
core.cljc: 216 datahike.core$_transact_BANG_$fn__39462/invoke
Atom.java: 37 clojure.lang.Atom/swap
core.clj: 2369 clojure.core/swap!
core.clj: 2362 clojure.core/swap!
core.cljc: 215 datahike.core$_transact_BANG_/invokeStatic
core.cljc: 212 datahike.core$_transact_BANG_/invoke
core.cljc: 233 datahike.core$transact_BANG_/invokeStatic
core.cljc: 229 datahike.core$transact_BANG_/invoke
core.cljc: 331 datahike.core$transact/invokeStatic
core.cljc: 324 datahike.core$transact/invoke
Var.java: 393 clojure.lang.Var/invoke
connector.cljc: 25 datahike.connector$update_and_flush_db/invokeStatic
connector.cljc: 24 datahike.connector$update_and_flush_db/invoke
transactor.cljc: 34 datahike.transactor$create_rx_thread$fn__39598$fn__39600/invoke
Feels like I'm taking crazy pills. I was adding attributes to my schema yesterday without issue
I've dropped the db and re-created it several times
Just for sanity's sake, when I eval the connection with `(db-conn)` I get `#<Atom@4cd53a12: #datahike/DB {:max-tx 536870912 :max-eid 0}>`
This has to be some kind of environment issue. I just copied the whole README and I'm getting the same error (using the `:file` backend).
If I clear the `io/replikativ` dir from `~/.m2/repository` and open a repl separate from emacs with `clj -Sforce`, I can use the README example without issue. But pulling it into emacs and exec'ing from my cider repl there, I get this issue. Cannot for the life of me understand why.
Narrowed things down a bit, but still pretty stumped. If I open a repl outside emacs (i.e. no nrepl or cider) and I don't require my repl ns, I can transact all the current attributes. But if I require my repl ns (which fires everything up with integrant), all of a sudden I can't transact some of the attributes. Utterly baffled
What started all this was trying to transact an entity with a unique composite tuple:
[{:db/ident :myapp/a
:db/valueType :db.type/ref
:db/cardinality :db.cardinality/one
:db/isComponent true}
{:db/ident :myapp/b
:db/valueType :db.type/string
:db/cardinality :db.cardinality/one}
{:db/ident :myapp/a+b
:db/valueType :db.type/tuple
:db/tupleAttrs [:myapp/a :myapp/b]
:db/cardinality :db.cardinality/one
:db/unique :db.unique/identity}]
As I write that out, I wonder if you can't do composite tuples where one of the referenced attrs is a ref type?
Except I was able to transact that into the schema; it was just a major pain to do it outside my regular repl flow (e.g. spit/slurp the db config and schema attrs to the filesystem to read into the other repl)
I've just tested it and the uniqueness constraint works as expected
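(To illustrate what "works as expected" means here — a sketch, assuming Datahike follows Datomic-style composite-tuple semantics; `conn` and the entity values are hypothetical:

```clojure
;; Datahike maintains :myapp/a+b automatically from :myapp/a and :myapp/b,
;; so you never transact the tuple value directly.
(require '[datahike.api :as d])

(d/transact conn [{:myapp/a 17          ; ref to some existing entity
                   :myapp/b "blue"}])

;; With :db.unique/identity on the tuple, transacting a second entity with
;; the same a+b pair should upsert onto the first rather than create a
;; duplicate, and the pair is addressable as a lookup ref:
(d/pull @conn '[*] [:myapp/a+b [17 "blue"]])
```

i.e. the [a b] pair behaves as a composite identity.)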
Sorry for getting back to you late @U5RFD1733. That sounds very strange, have you made further progress in pinning the issue down?
No progress. I wrote a migrate.clj file that I invoke with `clj -M migrate.clj`. But even in that file I have to filter out the composite attribute, or else it crashes. It’s enough of a workaround for now, since somehow I was able to get that attribute transacted into both the dev and test dbs earlier
@U0DP57ZT9 Do you happen to have an idea of what might be going on? I would need to look into how composite tuples exactly work first.
My best guess is some issue serializing the attributes somewhere along the emacs -> cider -> nrepl path. If I open a plain repl with `clj`, type out all the requires, and transact the attributes from an edn file, I don’t have any issues. Possibly something in my user ns is mucking things up, but bisecting through that makes me wanna claw my eyes out currently.
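(The edn-file workaround can be as simple as spitting the schema out of one repl and slurping it into the other — a sketch; the file path, `schema-attrs`, and `conn` are assumptions:

```clojure
;; In the cider repl: write the schema attributes out as edn.
(spit "schema.edn" (pr-str schema-attrs))

;; In the plain `clj` repl: read them back and transact.
(require '[clojure.edn :as edn]
         '[datahike.api :as d])

(d/transact conn (edn/read-string (slurp "schema.edn")))
```

This side-steps whatever the nrepl path might be doing to the data on its way to `d/transact`.)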
Once I got the composite attr transacted, the composite index works exactly as expected, so that’s super good news
Haven’t pushed that stuff to prod yet, so not sure if I’ll run into a migration issue there