2016-09-07
the most recently modified file in the roots folder will do, @kenny
Hi all 🙂 quick question: when using lookup-ref in parametrised queries, what is the ref resolved against? The most recent available index, or the datasource provided in the query?
@achesnais I experimented a bit, AFAICT it's definitely from a data source
but if there are several data sources, I'm not sure what's going on
it seems to me it gets resolved once per datasource used in a Datalog clause involving the entity
In your ‘ambiguous data source’ example, isn’t the error stemming from the fact that only $ can be an implicit datasource, meaning that if you don’t specify it in the :in clause, datomic won’t know where to find it?
example 2 is super super interesting. I would have expected it not to work if [:a/id …] were to resolve to the entity within the data source, but it seems what’s binding is indeed the lookup-ref itself
or rather, it seems that the resolution is limited to the clause scope, meaning this works because you’re not passing the raw id directly
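for reference, a minimal sketch of the kind of parametrised query being discussed, with the lookup ref passed in as an input and resolved against the data source bound to $ (the attribute names here are made up):
;; hypothetical attributes; the lookup ref is resolved against the db bound to $
(d/q '[:find ?name
       :in $ ?user
       :where [?user :user/name ?name]]
     db
     [:user/email "jane@example.com"])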
And thanks for taking time to experiment @val_waeselynck
@robert-stuttaford greetings! Do you store datetime/user-id on a "data-entity" or on the transaction data? How does it work for you in datascript (in terms of convenience)? Currently I am storing those on the data, but it feels a bit dirty. Can you share any insight on whether migrating datetime/user-id to tx-data is worth it?
Another question: is there a recommended approach to describing content sharing? e.g. I am the author of a blog post, and I grant permission to read/modify it to these 3 users
hi @misha - can you describe what you mean by datetime/user-id? do you mean linking the user who caused a transaction to occur, to that transaction - so, an audit trail?
@robert-stuttaford classic created-by
created-datetime
updated-datetime
in my case, the entity being created/edited is private/individual in the sense of "ownership", where only 2 use cases of transactions are possible:
1. the same user updates his own data.
2. some other user might update the data, if he is a collaborator (this is where my 2nd question originates).
but for the sake of the 1st use case only: if I need to mark an entity as belonging to user1, where do I put :entity/user-id: in the entity data, or in the tx data?
(from the "needs to be synchronized with datascript a lot" point of view, if that matters)
ok, gotcha. basically, you only need to do something for 'created-by'. the rest is discoverable already
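as a sketch of what "discoverable" means here: created/updated times can be read off the transactions that touched the entity, via the built-in :db/txInstant (assuming eid holds the entity id in question):
;; earliest and latest transactions that asserted or retracted datoms about eid
(d/q '[:find (min ?when) (max ?when)
       :in $ ?e
       :where [?e _ _ ?tx]
              [?tx :db/txInstant ?when]]
     (d/history db) eid)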
when transacting something you want to track, you can link directly to the in-flight transaction's reified entity:
(d/transact conn [{<your entity here>} [:db/add #db/id[:db.part/tx] :transaction/responsible-user your-logged-in-user-id-here]])
what's nice about this is you can use it for lots of stuff. we do this for all txes performed by a logged in user.
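a more concrete sketch of that tx-annotation pattern, assuming a hypothetical :task/title attribute and a :transaction/responsible-user ref attribute in the schema, plus a logged-in-user-id var:
;; (d/tempid :db.part/tx) is the code equivalent of the #db/id[:db.part/tx] literal;
;; the :db/add clause annotates the in-flight transaction's own entity
(d/transact conn
  [{:db/id (d/tempid :db.part/user)
    :task/title "Write report"}
   [:db/add (d/tempid :db.part/tx)
    :transaction/responsible-user logged-in-user-id]])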
however, you could also just make an attr that links the creating user directly to your entity
(d/transact conn [{<your entity here> :your-entity/created-by your-logged-in-user-id-here}])
@robert-stuttaford so you chose to go with user-id in tx-data? how often do you flush UI (datascript) data to the server? Do you accumulate txs over any period at all (e.g. offline usage for minutes/hours)?
yes, on tx
datascript syncs as early and as often as it can, but of course if you're offline for a long time, it'll only sync when it's back on
Nikita wrote a rad bi-directional event-sourced sync mechanism that can handle just about any amount of data, batches events, etc.
@kenny You can also use list-backups http://docs.datomic.com/backup.html#listing-backups
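for reference, list-backups is run from the Datomic distribution against a backup URI, roughly:
bin/datomic list-backups file:/full/path/to/backup-directory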
@robert-stuttaford but do you keep the individual tx-data intact, and send it to datomic?
or does attaching tx-data with user-id/date happen in the back-end only, with datascript just sending something like data + cookies, so the back-end can infer the tx-data for datomic when the sync payload arrives?
I'd like to know if constructing tx-data on the datascript side is viable, since I need:
- the client to be able to work offline for days
- still have correct timestamps on things in datomic (time of update, not time of sync)
E.g. update a thing on Monday, send it to Datomic on Friday, and on Saturday, as a result, be able to see that the thing was updated on Monday by going to Datomic only (no help from datascript at this point)
@misha, we track client-side timestamps per-event (which is the source of truth for timing on those), and batch transact them
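one hedged way to make that concrete is to keep the client-observed timestamp as an attribute on the event itself, so it survives a late sync (the attribute names and misha-user-id below are made up):
;; transacted on Friday, but the client-observed time from Monday is preserved
(d/transact conn
  [{:db/id (d/tempid :db.part/user)
    :event/type :thing/updated
    :event/client-timestamp #inst "2016-09-05T10:15:00.000-00:00"
    :event/user misha-user-id}])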