
What do you mean? You can alter the schema in Datomic


omg. I have to change the way I think after relational DBs


So if I'm periodically scanning a collection of processes, and each process has environment variables that change over time, I have two options then:

1. Store them as [<process-id> :process/LOCALE "UTF-8"]. In Datomic, though, I would have to install new attribute definitions on the fly on every update, and remove old ones. In DataScript it just works. I guess I need a db function to retract old keys and put in new keys on each update. Also, this type of attribute storage doesn't seem to affect performance in any way — it seems I can efficiently query all the keys of all processes, or of the current process, with a range index scan.

2. Store them as [.. :process.env/key ..] and [.. :process.env/value "UTF-8"]. The DB schema for this sort of storage is completely static: key is a string, value is a string. In this case there is a question: do I need to add composite unique keys to the schema? Quick googling seems to show that this concept is a problem even in full Datomic. The official documentation suggests — if I understand it correctly — installing a custom transaction function that implements such a 'composite unique key' constraint, retracting old values on upsert. This will not, however, auto-retract older keys/values that were removed since the last update.
   - It might still be a solution to find the latest transaction number that asserted an entire fresh key/value set, and use that transaction number as a filter to get the latest collection.
   - But I guess I need to think of Datomic entities during an update as a diff between the new and old states, in terms of which datoms I should assert and which retract, right? This is different than doing an upsert into a JSON field of a Postgres column. Then I have to use/write a database function that retracts all associated :process.env/key and :process.env/value datoms first. (Maybe it's one short query inside a transaction, must try to do it.)
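For reference, a minimal sketch of what option 2 could look like as a DataScript schema — all attribute names besides :process.env/key and :process.env/value are assumptions, including the :process/env component ref and :process/id:

```clojure
(require '[datascript.core :as d])

;; Hypothetical sketch: env vars as separate entities with static
;; string key/value attributes, hung off the process as components.
(def schema
  {:process/env {:db/valueType   :db.type/ref
                 :db/cardinality :db.cardinality/many
                 :db/isComponent true}   ; component: retracting the process retracts its env entries
   ;; :process.env/key and :process.env/value need no special schema
   ;; in DataScript (plain strings by default)
   })

(def conn (d/create-conn schema))

;; One process with two env entries, transacted as nested maps:
(d/transact! conn
  [{:db/id       -1
    :process/id  42
    :process/env [{:process.env/key "LOCALE" :process.env/value "UTF-8"}
                  {:process.env/key "PATH"   :process.env/value "/usr/bin"}]}])
```

In Datomic the ref attributes would additionally need full attribute installation (:db/ident, :db/valueType, etc.), but the shape of the data is the same.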


In the original Postgres version I was using a JSON field in a process table, and it supports both merging and replacing, with replacing being the default operation on row upsert or column update. Once you update a process's configuration, you retract the old configuration keys/values and assert the new ones. That's what I need: something that behaves like the internal JSON in dato*. Need to test components in full Datomic.


(meta) Also, looking at how the iPad Slack client receives my chat messages, with all the edits applied live in place with animations as it seemingly loads the edits from the server continuously: I see it gets stuck after some 8-10 edits and doesn't show me the latest version until I restart Slack. OK, now it's obviously a bug: no amount of restarting Slack on iOS can correctly load a message that had >10 edits)


I'd go with #2 and add a composite key that you specify as unique; it keeps things simple and is effective


can you point me at a solution on how to add composite key?


Well, that part would just be :process.env/uni-key (str proc-id "/" env-key)
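Sketched out, with the schema piece that makes it an upsert key (attribute names beyond the ones mentioned above are assumptions):

```clojure
(require '[datascript.core :as d])

;; Hypothetical: the composite key as a single unique-identity string attr.
(def schema
  {:process.env/uni-key {:db/unique :db.unique/identity}})

(def conn (d/create-conn schema))

;; First transact creates the entity; a later transact with the same
;; uni-key upserts onto it, replacing :process.env/value in place.
(d/transact! conn
  [{:process.env/uni-key (str 42 "/" "LOCALE")   ; e.g. "42/LOCALE"
    :process.env/key     "LOCALE"
    :process.env/value   "UTF-8"}])
```

The same :db/unique :db.unique/identity setting works in full Datomic.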


it duplicates data though


Yeah, it only serves the purpose of keeping it unique


If you are on Datomic and want to do it properly, you can just create a db function.
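A rough sketch of such a function — it retracts all of a process's existing env entities and asserts the fresh map in one transaction, which also handles keys that were removed since the last update. Shown DataScript-style via :db.fn/call; in Datomic you'd install it as a transaction function. All attribute names here are assumptions:

```clojure
(require '[datascript.core :as d])

;; Hypothetical db function: replace a process's whole env map atomically.
(defn set-env [db proc-eid env-map]
  (let [old-envs (d/q '[:find [?e ...]
                        :in $ ?p
                        :where [?p :process/env ?e]]
                      db proc-eid)]
    (concat
      ;; retract every existing env entity (covers deleted keys too)
      (for [e old-envs] [:db/retractEntity e])
      ;; assert the fresh key/value set
      [{:db/id       proc-eid
        :process/env (for [[k v] env-map]
                       {:process.env/key k :process.env/value v})}])))

;; Usage: the function runs inside the transaction, against its db value.
(d/transact! conn [[:db.fn/call set-env proc-eid {"LOCALE" "UTF-8"}]])
```

This assumes :process/env is a :db.cardinality/many ref attribute, so the nested maps expand into env entities.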


But if you want to do a quick and dirty (and working) solution... then the above is just fine IMO.


yeah, that might work)


@leov composite keys are pain in the ass for us too


just spent last week working with them


hard to query, hard to update


one option is to completely remove the old entity and add a new one with the new set of ENV props. That’ll save you the diffing
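That replace-wholesale option might look like this — assuming a :db.unique/identity :process/id attribute and component env entities, both of which are assumptions here. Done as two transactions so the retraction and the fresh assertion of the same unique id don't collide:

```clojure
(require '[datascript.core :as d])

;; Hypothetical: drop the old process entity entirely, then re-assert it.
;; With :db/isComponent on :process/env, retractEntity also removes
;; all the old env entities, so no diffing is needed.
(d/transact! conn [[:db/retractEntity [:process/id 42]]])

(d/transact! conn
  [{:process/id  42
    :process/env [{:process.env/key "LOCALE" :process.env/value "en_US"}
                  {:process.env/key "PATH"   :process.env/value "/usr/bin"}]}])
```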


another option is to serialize the entire env map and store it in a string attr. But that’ll only work if you don’t plan to query based on values in that map


again, that’ll make updates simple
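The serialized-blob option, sketched with EDN as one possible encoding (the :process/env-edn attribute name and the unique :process/id used for the lookup ref are assumptions):

```clojure
(require '[datascript.core :as d]
         '[clojure.edn :as edn])

;; Hypothetical: the whole env map as one opaque string attribute.
;; Updates are a single assertion that replaces the previous string.
(d/transact! conn
  [{:process/id      42
    :process/env-edn (pr-str {"LOCALE" "UTF-8" "PATH" "/usr/bin"})}])

;; Reading it back round-trips through EDN:
(-> (d/pull @conn [:process/env-edn] [:process/id 42])
    :process/env-edn
    edn/read-string)
;; => the original map
```

The trade-off is as stated above: the datalog engine can't see inside the string, so no queries on individual keys or values.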


I wouldn’t recommend dynamically altering the Datomic schema


esp. when the number of attributes is potentially unbounded


In fact, smth like :process.env/uni-key (str proc-id "/" env-key) makes finding values easier
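e.g., with that attr marked :db.unique/identity, fetching one value becomes a direct lookup-ref access instead of a join (attribute names as assumed above):

```clojure
(require '[datascript.core :as d])

;; Hypothetical: single index lookup by the composite key, no query needed.
(d/pull @conn [:process.env/value] [:process.env/uni-key "42/LOCALE"])
```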


we’ll probably adopt this approach :)


about the Slack update behavior: from what I’ve heard, they say they try the best they can to bring in server updates live, but if it’s impossible they don’t do anything about it


“best effort” approach :)