#datahike
2022-06-10
awb99 00:06:54

My datahike db got corrupted. Reads work. But any sort of write fails. Does it make sense to upload it somewhere?

awb99 04:06:42

This happened after 2 months running in production. No change in code.

awb99 04:06:06

App didn't crash, but it might be that some shutdown was not clean.

awb99 04:06:13

Any ideas why this could happen?

timo 07:06:41

oh, I am sorry to hear that. I guess it can happen if a shutdown was not properly done. what backend are you using?

kkuehne 07:06:08

What kind of schema are you using and how much data is involved?

alekcz 10:06:04

@UCSJVFV35 I've had that happen a few times.

alekcz 10:06:39

In my cases I had dropped a non-supported type into the db.

alekcz 10:06:42

Then a while later a subsequent write triggers a flush, and the non-supported type breaks the flush.

alekcz 10:06:00

My fix: Export your datoms. Remove the non-supported types. Vectors, maps and ratios were the offenders in my case. Then import the db again.

👍 1
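
A minimal sketch of the export → clean → import cycle described above, assuming Datahike's `datahike.migrate` namespace (`export-db` / `import-db`) and a file backend; the paths are illustrative and exact arities vary between Datahike versions:

```clojure
(require '[datahike.api :as d]
         '[datahike.migrate :refer [export-db import-db]])

;; Illustrative file-backend configs; point :path at your actual stores.
(def old-cfg {:store {:backend :file :path "/var/data/datahike-old"}})
(def new-cfg {:store {:backend :file :path "/var/data/datahike-new"}})

;; 1. Export all datoms of the corrupted (but still readable) db to a dump file.
;;    Some Datahike versions expect the db value (@old-conn) instead of the conn.
(def old-conn (d/connect old-cfg))
(export-db old-conn "/tmp/datoms-dump.edn")

;; 2. Clean the dump: remove datoms whose values are vectors, maps or ratios
;;    (the offending types mentioned above), producing datoms-dump-cleaned.edn.

;; 3. Import the cleaned dump into a fresh database.
(d/create-database new-cfg)
(def new-conn (d/connect new-cfg))
(import-db new-conn "/tmp/datoms-dump-cleaned.edn")
```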
awb99 04:06:08

I use the filestore backend. The schema defines datatypes for all my "tables". It is about 6000 invoices, 20,000 line items across all invoices, and 5 other tables with at most 2000 items each.
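
For readers unfamiliar with Datahike's schema-on-write, here is a hypothetical fragment of what such a schema for invoices and line items might look like; all attribute names are invented for illustration:

```clojure
(def schema
  [{:db/ident       :invoice/number
    :db/valueType   :db.type/string
    :db/cardinality :db.cardinality/one
    :db/unique      :db.unique/identity}
   {:db/ident       :invoice/line-items
    :db/valueType   :db.type/ref
    :db/cardinality :db.cardinality/many}
   {:db/ident       :line-item/description
    :db/valueType   :db.type/string
    :db/cardinality :db.cardinality/one}
   {:db/ident       :line-item/amount
    ;; keeping values to plainly supported types (e.g. double) avoids the
    ;; unsupported-type flush problem described above
    :db/valueType   :db.type/double
    :db/cardinality :db.cardinality/one}])

;; transacted like any other data, e.g. (d/transact conn {:tx-data schema})
```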

awb99 04:06:50

I am doing inserts once an hour in a scheduled import.
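
A sketch of what such an hourly scheduled import could look like on the JVM; `fetch-new-invoices` is a hypothetical stand-in for whatever produces the new entities, and older Datahike versions take the tx vector directly rather than an arg map:

```clojure
(ns example.scheduled-import
  (:require [datahike.api :as d])
  (:import [java.util.concurrent Executors TimeUnit]))

(defn fetch-new-invoices []
  ;; hypothetical: return a vector of entity maps for new invoices/line items
  [])

(defn start-hourly-import! [conn]
  (let [pool (Executors/newSingleThreadScheduledExecutor)]
    ;; run the import immediately and then once per hour
    (.scheduleAtFixedRate pool
                          ^Runnable (fn []
                                      (when-let [tx-data (seq (fetch-new-invoices))]
                                        (d/transact conn {:tx-data (vec tx-data)})))
                          0 1 TimeUnit/HOURS)
    pool))
```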

awb99 04:06:18

I did export the data, and then imported it (had to remove all schema definitions first)
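
One crude but version-agnostic way to strip the schema definitions from such an export, assuming the dump file holds one printed datom per line (as `datahike.migrate/export-db` writes it); the marker list and paths are illustrative:

```clojure
(require '[clojure.java.io :as io]
         '[clojure.string :as str])

;; Datoms mentioning these attributes define the schema itself; dropping them
;; lets the schema be transacted separately before re-importing the data.
(def schema-markers
  [":db/ident" ":db/valueType" ":db/cardinality" ":db/unique" ":db/index"])

(defn strip-schema-datoms [in-path out-path]
  (with-open [r (io/reader in-path)
              w (io/writer out-path)]
    (doseq [line (line-seq r)
            ;; crude textual filter: skip any line containing a schema marker
            :when (not-any? #(str/includes? line %) schema-markers)]
      (.write w line)
      (.write w "\n"))))

;; (strip-schema-datoms "/tmp/datoms-dump.edn" "/tmp/datoms-dump-no-schema.edn")
```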

awb99 04:06:42

I tried to change the import to the latest version, but there are some schema types that seem to no longer be supported.

timo 06:06:40

Hey @UCSJVFV35. Currently there are some problems when updating datahike. @UB95JRKM3 is working on it. Can you continue running on the version you were on before?