This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2022-06-10
Channels
- # announcements (2)
- # aws (11)
- # babashka (11)
- # beginners (22)
- # calva (28)
- # cider (1)
- # clj-kondo (3)
- # clojars (14)
- # clojure (41)
- # clojure-europe (45)
- # clojure-norway (1)
- # clojure-uk (2)
- # clojured (31)
- # clojureindia (4)
- # cursive (5)
- # datahike (15)
- # datomic (11)
- # deps-new (11)
- # events (1)
- # holy-lambda (19)
- # introduce-yourself (1)
- # minecraft (17)
- # music (1)
- # nbb (3)
- # off-topic (37)
- # reagent (6)
- # reveal (3)
- # shadow-cljs (46)
- # tools-deps (8)
- # xtdb (22)
My datahike db got corrupted. Reads work. But any sort of write fails. Does it make sense to upload it somewhere?
oh, I am sorry to hear that. I guess it can happen if a shutdown was not properly done. what backend are you using?
@UCSJVFV35 I've had that happen a few times.
Then a while later a subsequent write triggers a flush, and the non-supported type breaks the flush.
My fix: Export your datoms. Remove the non-supported types. Vectors, maps and ratios were the offenders in my case. Then import the db again.
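A minimal sketch of the filtering step described above, assuming the exported datoms have the usual `[e a v tx added?]` shape; `supported-value?` and `clean-datoms` are hypothetical helper names, and the set of offending types (vectors, maps, ratios) is taken from this report:

```clojure
;; Assumed datom shape: [entity attribute value tx added?]
(defn supported-value?
  "Reject the value types reported to break the flush:
   vectors, maps, and ratios."
  [v]
  (not (or (vector? v) (map? v) (ratio? v))))

(defn clean-datoms
  "Keep only datoms whose value survives re-import."
  [datoms]
  (filter (fn [[_e _a v _tx _added]] (supported-value? v)) datoms))

(clean-datoms [[1 :name "Alice" 100 true]
               [1 :scores [1 2 3] 100 true]   ; vector value, dropped
               [2 :rate 1/3 100 true]])       ; ratio value, dropped
;; => ([1 :name "Alice" 100 true])
```

The cleaned sequence would then be re-imported into a fresh db; the actual export/import calls depend on the datahike version in use.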
I use the file store backend. The schema defines datatypes for all my "tables". It is about 6,000 invoices, 20,000 line items across all invoices, and 5 other tables with at most 2,000 items in each.
I did export the data, and then imported it (had to remove all schema definitions first)
I tried to import into the latest version, but there are some schema types that seem to be no longer supported.
Hey @UCSJVFV35. Currently there are some problems when upgrading datahike. @UB95JRKM3 is working on it. Can you continue running on the version you were on before?