
@kenbier: not sure if that's what you're implying, but you cannot set the t in datoms


you cannot set anything in datoms. it is possible, however, to set the t when you assert a datom, provided the t is higher than the previous t asserted to the database.


that feature can be used to construct a database with a "fake" history, i.e. inserting events in the past.
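a sketch of what that looks like, assuming a Datomic connection `conn` and an illustrative entity id `event-id` (the attribute name is made up here). the backdating works by asserting `:db/txInstant` on the transaction entity itself, and only succeeds if the supplied instant is later than every txInstant already in the database:

```clojure
(require '[datomic.api :as d])

;; Construct a "fake" history entry: supply :db/txInstant explicitly
;; on the transaction's own tempid. This must be done in increasing
;; time order when loading past events.
@(d/transact conn
   [{:db/id (d/tempid :db.part/tx)
     :db/txInstant #inst "2015-01-01T00:00:00"}  ; the backdated time
    [:db/add event-id :event/name "imported legacy event"]])
```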


@val_waeselynck: @hans to clarify, i transact something to the database. i then get the t from the database value after that transaction (-> db-after d/basis-t d/t->tx). then i store that value in another transaction, i.e. [:db/add :foo/as-of-point some-tx-id]
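spelled out, that flow looks roughly like this (a sketch, assuming a connection `conn`, an entity `foo-id`, and illustrative attribute names; `:foo/as-of-point` would need to be a ref attribute in the schema):

```clojure
(require '[datomic.api :as d])

;; Run a transaction, recover the tx entity id from the resulting
;; database value, then store a reference to that tx on another entity.
(let [{:keys [db-after]} @(d/transact conn [[:db/add foo-id :foo/name "bar"]])
      tx-id (d/t->tx (d/basis-t db-after))]
  @(d/transact conn [[:db/add foo-id :foo/as-of-point tx-id]]))
```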


i was just curious if i should store t or tx-id. i ended up doing neither, as the requirements for the feature changed so i didn't care about the old value. though there doesn't seem to be anything wrong with it, unless you are planning to join on many foo's.


perhaps it's bad practice?


there is nothing wrong with referring to a transaction entity from another entity per se. they're just entities.


@kenbier you'd store the tx-id as a :db.type/ref. that's part of their intended design
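a schema sketch for that (the attribute name is illustrative, carried over from the example above; transactions are ordinary entities, so a plain ref works):

```clojure
;; An attribute that points at a transaction entity.
[{:db/id (d/tempid :db.part/db)
  :db/ident :foo/as-of-point
  :db/valueType :db.type/ref
  :db/cardinality :db.cardinality/one
  :db.install/_attribute :db.part/db}]
```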


the t value is outside the reified transaction entity data; it lives separately from the datom indexes, in the transaction log


Agreed, I do this, and I refer to the tx-id


Is it possible to obtain the last time an entity was referenced from another transaction? I have a component representing an API key, and I'm storing references to that API key on all transactions that represent API calls. I'm trying to efficiently query for the last time an API key was used, without using an explicit "last-used" field on the API key component.

Ben Kamphaus 16:07:53

@michaeldrogalis: apply max on (map :tx (datoms db :vaet ent-id)) maybe? (brainstormed, not attempted)
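expanded a bit, that brainstorm is a linear scan over the VAET index: one datom per reference to the key entity (a sketch; `api-key-id` is a placeholder, and the `reduce` guards against a key with no references):

```clojure
(require '[datomic.api :as d])

;; Most recent tx that references the API-key entity, via VAET.
(defn last-use-tx [db api-key-id]
  (reduce max 0 (map :tx (d/datoms db :vaet api-key-id))))

;; Wall-clock time of that transaction.
(defn last-use-inst [db api-key-id]
  (:db/txInstant (d/entity db (last-use-tx db api-key-id))))
```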


@bkamphaus: Unfortunately that pulls all transactions. Doing an aggregation across all API calls will be linear time. Can't see a reason not to get this one done in constant time.

Ben Kamphaus 16:07:17

ah, didn't read the constraint carefully enough — yes, it's a linear scan over all transactions referring to that entity. But I think the alternative is a scan back through the log until the first instance. At least, I think for constant/log time in all components you'd need a subindex of time on references, or references on time, which Datomic doesn't provide. Maybe I'm not thinking about it correctly.


Came to the same conclusion for the case where I don't maintain a timestamp myself. The alternative is keeping a timestamp on the API key component itself, and bumping that ts on each call.


Then I can just get the latest entity for the API key and get the ts.
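the timestamp-bumping alternative looks roughly like this (a sketch; `:api-key/last-used` is an illustrative attribute name, assumed to be declared as `:db.type/instant`, cardinality one):

```clojure
;; On each API call, bump the last-used instant on the key entity.
@(d/transact conn
   [[:db/add api-key-id :api-key/last-used (java.util.Date.)]])

;; Constant-time read of the last-used time.
(:api-key/last-used (d/entity (d/db conn) api-key-id))
```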

Ben Kamphaus 16:07:42

Yeah, I think it’s reasonable to explicitly annotate the timestamp for last use of the API key as an attribute on the key if you’re expecting the query on ref index and/or log to touch too many datoms on the linear portion and be a drag on the system.


Okie dokie. Thanks for the confirmation @bkamphaus 🙂


@michaeldrogalis: I'm certain to be misunderstanding something here, so if you don't mind indulging me - if you're tagging your transactions with a ref to your API key entity, would something like [:find (max ?used-t) :in $ ?key :where [?tx :tx/key ?key] [?tx :db/txInstant ?used-t]] work?
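that inline query, formatted (assuming `:tx/key` is the ref attribute tagging each call's transaction with its API key):

```clojure
(d/q '[:find (max ?used-t)
       :in $ ?key
       :where
       [?tx :tx/key ?key]
       [?tx :db/txInstant ?used-t]]
     db api-key-id)
```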


I'm doing something similar to this to generate last-modified times for a sitemap


@bhagany: That works, yes. But if you have 500 billion API calls, do you really want to query for the max date each time the user accesses an API page? 🙂


Bit of an exaggeration, but you get the point.


ah, okay. I read you as trying to avoid a full log scan — my error. thanks 🙂


No worries, thanks for pitching in ^^