This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2017-01-30
@ezmiller77 yes. define two attributes 🙂
could someone help me with altering the :db/fulltext attribute, please?
i have this tx:
[{:db/id :content-item/description
:db/fulltext true
:db.alter/_attribute :db.part/db}]
and datomic gives me an error saying:
:db.error/invalid-alter-attribute Error: {:db/error
:db.error/unsupported-alter-schema, :attribute :db/fulltext, :from
:disabled, :to true}
wait nvm
you cannot alter fulltext
@nooga That is essentially correct. With a starter license you get 1 year of maintenance and upgrades, after which you will still be able to use your license/Datomic but will not be able to upgrade to any new releases.
@biscuitpants :db/fulltext cannot be altered, so you cannot add full-text search to an attribute after its creation. You will need to create a new attribute and migrate or import the values from the old attribute. http://docs.datomic.com/schema.html
yes, thank you @jaret! maybe the docs should be updated to give an example of how to do it?
(it's not a super straightforward process)
or, it is, but it's not completely evident 🙂
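The migration jaret describes could be sketched like this with the 2017-era peer API (a sketch only: `conn` and the `:content-item/description-v2` attribute name are hypothetical, and large migrations should be batched):

```clojure
;; 1. Install a replacement attribute with fulltext enabled at creation.
@(d/transact conn
   [{:db/id                 #db/id[:db.part/db]
     :db/ident              :content-item/description-v2
     :db/valueType          :db.type/string
     :db/cardinality        :db.cardinality/one
     :db/fulltext           true
     :db.install/_attribute :db.part/db}])

;; 2. Copy existing values over from the old attribute.
(let [db (d/db conn)]
  @(d/transact conn
     (for [[e v] (d/q '[:find ?e ?v
                        :where [?e :content-item/description ?v]]
                      db)]
       [:db/add e :content-item/description-v2 v])))
```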
you do, however, have a disconcertingly robert-stuttaford-like avatar photo, @biscuitpants
ha, we have a discussion about this often at work
even the position of the head relative to the frame
possibility that we planned it
but @robert-stuttaford would have to confirm
i ain’t dun nuffin
i’m the handsome one 😁
Does anyone know where I would find info on how to query for transaction annotations? My Google searching has yielded nothing (also, I’m a newbie at datomic). I’ve added an annotation to a transaction and want to write a test to ensure it added the annotation.
here's an example: https://github.com/Datomic/day-of-datomic/blob/59186b4b39c124e2d9d0e79243f3e373b0a0b9d9/tutorial/provenance.clj
@bmaddy Transactions are normal entities. You need to get either a t or a tx somehow, then just access it as you would any entity
or this one: https://github.com/Datomic/day-of-datomic/blob/59186b4b39c124e2d9d0e79243f3e373b0a0b9d9/tutorial/log.clj#L31
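For reference, a minimal sketch of annotating and then querying a transaction (the `:audit/user` annotation attribute and `item-eid` are hypothetical; the attribute must be installed as schema first):

```clojure
;; Annotate the transaction entity itself via a :db.part/tx tempid.
@(d/transact conn
   [[:db/add item-eid :item/name "widget"]
    {:db/id      #db/id[:db.part/tx]   ;; resolves to the transaction entity
     :audit/user "bmaddy"}])

;; Transactions are normal entities: join through the tx position of the datom.
(d/q '[:find ?name ?user
       :where
       [?e :item/name ?name ?tx]
       [?tx :audit/user ?user]]
     (d/db conn))
```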
@shaun-mahood I was wondering similarly recently, how to handle geo data with/from Datomic. Besides PostGIS, I wonder if ElasticSearch could do spatial searches on properly structured Datomic data. Found anything so far?
Hmm, I’ll try that out. Thanks @pesterhazy & @favila.
@limist: I've done a very brief amount of research and it looks like it would be a lot of work to implement anything performantly - essentially, implementing r-trees and adding some sort of indexing that PostGIS and other GIS systems already have built in. I can't see it being anything easy or quick, so in all likelihood it makes more sense to keep the location info outside of datomic and figure out a good way to link things together. I've only just started looking into GIS stuff in general so I could be wildly off base on anything at this point though. 🙂
I’m gathering data from a bunch of sensors; they send me timestamps with their readings. I can’t trust their clocks and I can’t trust that they will send their readings in order. On the other hand, I want these timestamps in my DB since reading times can differ significantly from acquisition times. How should I approach this?
@nooga As a normal data-modelling problem. Datomic's time/history features concern "meta" time (i.e. time of record) not data-model time. Think git commit or file-modified times.
@nooga Sometimes you can get these to line up and you can use txInstant to store all time info, but it sounds like you can't in your application
I'd like my entities to reflect the sensor state at any given time. I can store the timestamp as an attribute there, but this doesn’t guard me against out-of-order scenarios
hi, i need to model a ‘composite key’. Any best practices for that? I was thinking about using a transaction function to create the ‘real’ key from the associated fields
@nooga but that's exactly what you need, right? you need to store sensor data as a domain time so you don't confuse it with the time you recorded the sensor data
thanks, @favila! also, you helped me with a tree-walking approach a while back over in Google Groups. I’m trying to generalize it to search ‘up and down’, but I’m having some issues. I realize what I did wrong in what I posted over there, but still not sure how to make it work lol
@nooga if you want to use datomic time-travel abilities to explore domain time, consider using two dbs: one for raw data, and a projected one where you transact sensor data in domain-time order (asserting txInstant on each tx)
@eoliphant the "ancestors" query is inefficient when working backwards, think about what var is bound when evaluating the recursive rule with your change and what the query eval would have to do to fix it
@eoliphant you can use the [?var] syntax to force a var to be bound, so you won't let the query engine try to eval rules in the "wrong" direction
(in fact ancestors is badly named without it, since rules are inherently bidirectional)
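What the [?var] syntax looks like in practice, as a sketch (hypothetical :person/parent attribute, pointing from child to parent):

```clojure
;; [?child] in the rule head marks ?child as a required binding, so the
;; query engine will only evaluate the rule "downhill" from a known child,
;; never try to run it in the reverse direction.
[[(ancestor [?child] ?anc)
  [?child :person/parent ?anc]]
 [(ancestor [?child] ?anc)
  [?child :person/parent ?p]
  (ancestor ?p ?anc)]]
```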
yeah that’s where I messed up initially, I just tried to be cute and switch the internal call, but the [?var] wasn’t switched
I guess I can ask datomic about values changed between T1 and T2 and then sort results by domain time?
@eoliphant try to write a blood-relations rule with three impls:
one that handles parent<->child (in any direction), one that follows ancestors, and one that follows progeny
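One possible shape for that exercise, as a sketch (hypothetical :person/parent attribute pointing from child to parent):

```clojure
[[(blood-relation ?a ?b)              ;; parent<->child, either direction
  (or [?a :person/parent ?b]
      [?b :person/parent ?a])]
 [(blood-relation ?a ?b)              ;; walk up through ancestors
  [?a :person/parent ?p]
  (blood-relation ?p ?b)]
 [(blood-relation ?a ?b)              ;; walk down through progeny
  [?c :person/parent ?a]
  (blood-relation ?c ?b)]]
```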
@nooga depends on what you are after. If you just want to sort all sensor data by the time it reported, a simple indexed date field is enough
in what way does it matter that the sensor time is not trustworthy? you want to rely on transactor's time? you want to correct sensor time in a second pass?
Do you want to use datomic as-of/since to explore a reconstructed view of the world as seen by the sensors? or you don't care about that and have some application on top which does it? or you just want to store it, not process/query at all
The sensors sometimes don’t have real-time clocks on board, sometimes they don’t have any NTP configured, etc. I might want to prefer transactor time over sensor time in some cases.
My idea was to simply update facts about sensors rather than treat each observation as a new entity related to the sensor.
My transactions would look like this: [sensorid :reading/time 123456] [sensorid :reading/temp 19.5] [sensorid :reading/pressure 1012]
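In full transaction form those assertions would need the :db/add op, or the equivalent map form (sensorid is a placeholder for the sensor's entity id; attribute types are assumed):

```clojure
;; List form:
[[:db/add sensorid :reading/time #inst "2017-01-30T12:00:00"]
 [:db/add sensorid :reading/temp 19.5]
 [:db/add sensorid :reading/pressure 1012.0]]

;; Equivalent map form:
[{:db/id            sensorid
  :reading/time     #inst "2017-01-30T12:00:00"
  :reading/temp     19.5
  :reading/pressure 1012.0}]
```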
You could do that, but keep in mind the history you see is always the history sensor data was recorded, not time it happened in the world
This is why I suggested generating a second db as a projection, where :reading/time is corrected and used as the txInstant
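Setting txInstant explicitly in the projection db might look like this sketch (proj-conn and the reading attributes are hypothetical; note that an explicit :db/txInstant must be later than every existing txInstant and not in the future, so readings have to be transacted in domain-time order):

```clojure
@(d/transact proj-conn
   [{:db/id        #db/id[:db.part/tx]
     :db/txInstant corrected-reading-time}   ;; the sensor's corrected :reading/time
    {:db/id        sensorid
     :reading/temp 19.5}])
```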
keep in mind if you go this route, as-of/since is really just useful to debug your applications (e.g. whatever collects sensor data), not so much to scrub through old sensor readings
by confusing transaction time and reading-time you may make your life harder later (depending on what you plan to do)
but it is still possible to use the history database and pull out the individual observations if you need
also keep in mind that you may need to batch writes for performance, which will further blur the sensor-time vs. record-time distinction
@nooga Likely there are better sensor data db packages out there. It may still be useful to create a projection in datomic, as I said
I was looking at datomic because its fact orientation and time travel abilities seem really nice for this case
we’re collecting facts about the world, from multiple sources, without making any hard assumptions about their validity and reliability
seems like you need a write silo (kafka queue, time-series db, etc) to store observations, and an aggregation silo for reads and exploration
That would still work for near-real-time but I would use datomic to make throwaway projections
I have more experience with heterogeneous data (records, documents, human artifacts, etc.), not homogeneous data, so I can't give much more advice