#datomic
2019-06-17
marshall 12:06:28

@lboliveira https://docs.datomic.com/on-prem/excision.html#performance you should not excise more than a few thousand datoms at the same time (max)
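For reference, a minimal sketch of what an excision transaction looks like with the on-prem peer API; the URI and entity ids below are placeholders:

```clojure
(require '[datomic.api :as d])

;; Hypothetical connection and entity ids to remove.
(def conn (d/connect "datomic:dev://localhost:4334/my-db"))
(def eids-to-excise [17592186045418 17592186045419])

;; Each map creates a new excision entity whose :db/excise points at the
;; entity to remove; the actual deletion happens during the transactor's
;; next indexing job, which is where the real cost shows up.
@(d/transact conn
             (for [eid eids-to-excise]
               {:db/id     (d/tempid :db.part/user)
                :db/excise eid}))
```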

marshall 12:06:10

it is possible that your excision job will complete, but it is also possible that it is too large. 5000 * 200 is a very, very large excision

marshall 12:06:50

generally, your options are: provide the transactor with a large amount of memory and CPU and wait; try running the excision locally (i.e. on a restored backup) and then back up and restore from there; or restore back to a state prior to issuing the excision and try to run it in MUCH smaller pieces
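If the job is retried in smaller pieces, one hedged sketch (continuing the placeholder names above) is to chunk the entity ids and issue one small excision transaction per chunk:

```clojure
;; Hypothetical helper: excise a large collection of entity ids a few
;; hundred at a time so no single transaction exceeds a few thousand datoms.
;; Each deref only waits for the transaction itself; the removal work still
;; happens asynchronously at indexing time.
(defn excise-in-batches [conn eids batch-size]
  (doseq [batch (partition-all batch-size eids)]
    @(d/transact conn
                 (for [eid batch]
                   {:db/id     (d/tempid :db.part/user)
                    :db/excise eid}))))

;; e.g. (excise-in-batches conn eids-to-excise 500)
```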

lboliveira 13:06:08

@marshall Thanks for your answer. I agree that too many datoms were sent. The backup and restore operations are already taking too long to complete. What do you think about a d/log -> d/tx-range -> filter -> d/transact strategy?
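For reference, a minimal sketch of that pipeline, assuming hypothetical database URIs and an `excised-eid?` predicate over the entities being removed; the `replay-tx` helper it relies on is sketched after the next message:

```clojure
(require '[datomic.api :as d])

;; Hypothetical source and destination connections.
(def src-conn  (d/connect "datomic:dev://localhost:4334/old-db"))
(def dest-conn (d/connect "datomic:dev://localhost:4334/new-db"))

;; Assumed predicate: drop every datom that belongs to an excised entity.
(defn keep-datom? [datom]
  (not (excised-eid? (:e datom))))

;; Walk the whole transaction log of the source database and replay each
;; transaction, minus the filtered datoms, into the destination, threading
;; an old->new entity id mapping through the reduce. A real version would
;; likely also need to skip Datomic's own bootstrap/schema transactions at
;; the start of the log and handle transaction metadata such as
;; :db/txInstant.
(reduce (fn [eid-map {:keys [data]}]
          (if-let [datoms (seq (filter keep-datom? data))]
            (replay-tx dest-conn eid-map datoms)
            eid-map))
        {}
        (d/tx-range (d/log src-conn) nil nil))
```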

marshall 13:06:48

it is a reasonable approach for rebuilding a database. it can take a significant amount of time as well. also, there are a few details that need to be managed carefully, particularly mapping entity IDs when doing that process
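There is no single canonical recipe, but one possible shape for the entity id bookkeeping (an assumption, not the documented process) is to give each previously unseen source entity id a fresh tempid and grow an old->new map from the transaction result, which later transactions then reuse:

```clojure
;; Hypothetical `replay-tx` used by the sketch above. Ref-typed values and
;; the transaction entity's own datoms (e.g. :db/txInstant) need the same
;; kind of translation and are not handled here.
(defn replay-tx [dest-conn eid-map datoms]
  (let [src-db  (d/db src-conn)
        new-ids (zipmap (remove eid-map (distinct (map :e datoms)))
                        (repeatedly #(d/tempid :db.part/user)))
        ->dest  (fn [old-eid] (or (eid-map old-eid) (new-ids old-eid)))
        tx-data (for [datom datoms]
                  [(if (:added datom) :db/add :db/retract)
                   (->dest (:e datom))
                   (d/ident src-db (:a datom)) ;; attribute eid -> ident keyword
                   (:v datom)])
        result  @(d/transact dest-conn tx-data)]
    ;; Resolve the fresh tempids to their new entity ids and extend the map.
    (into eid-map
          (for [[old-eid tmp] new-ids]
            [old-eid (d/resolve-tempid (:db-after result)
                                       (:tempids result)
                                       tmp)]))))
```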

lboliveira 13:06:25

> particularly mapping entity IDs when doing that process
Is there any reference about this?

marshall 13:06:50

i believe there are some community implementations of the process, but no, it is not specifically covered in the official docs

lboliveira 13:06:21

Thank you. I think I will rebuild the database. It is not likely that the index will be rebuilt first. We also need to excise 48M more entities. Do you believe that we could do it locally in a timely fashion using fewer datoms per transaction?

marshall 13:06:54

probably not; with that many datoms you’re almost certainly better off rebuilding with a filter

marshall 13:06:57

how big is the total db?

lboliveira 13:06:07

one moment…

marshall 13:06:10

and you want to excise 48M of them?

marshall 13:06:10

you’ll definitely be better off transacting the ones you want to keep into a new empty DB