This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2016-02-10
Channels
- # aatree (4)
- # admin-announcements (1)
- # beginners (62)
- # boot (279)
- # business (14)
- # cider (1)
- # cljsrn (3)
- # clojure (88)
- # clojure-czech (3)
- # clojure-madison (2)
- # clojure-poland (117)
- # clojure-russia (74)
- # clojurescript (168)
- # core-async (8)
- # css (6)
- # datavis (39)
- # datomic (67)
- # devcards (2)
- # dirac (1)
- # editors (9)
- # emacs (13)
- # events (2)
- # hoplon (2)
- # jobs (9)
- # ldnclj (38)
- # lein-figwheel (9)
- # leiningen (7)
- # luminus (4)
- # off-topic (77)
- # om (114)
- # omnext (1)
- # onyx (221)
- # parinfer (10)
- # portland-or (5)
- # proton (3)
- # re-frame (24)
- # reagent (14)
- # ring-swagger (13)
Are there any command line tools for importing TSV files into Datomic? (Assuming an existing schema, just want to transact in new facts, ideally with a low startup time cost)
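There doesn't seem to be a stock CLI for this, but a minimal sketch of such an import is short enough to write by hand (attribute names `:user/email` / `:user/name` are hypothetical stand-ins for your existing schema; assumes the Datomic peer library is on the classpath):

```clojure
;; Sketch: stream a TSV file into Datomic in batches.
;; Column order and attribute names below are hypothetical examples.
(require '[clojure.string :as str]
         '[clojure.java.io :as io]
         '[datomic.api :as d])

(defn tsv-line->tx
  "Turn one TSV line into a transaction map."
  [line]
  (let [[email name] (str/split line #"\t")]
    {:user/email email
     :user/name  name}))

(defn import-tsv!
  "Transact the rows of a TSV file in batches of 1000."
  [conn path]
  (with-open [rdr (io/reader path)]
    (doseq [batch (partition-all 1000 (line-seq rdr))]
      @(d/transact conn (mapv tsv-line->tx batch)))))
```

Batching keeps individual transactions small; startup cost is dominated by the peer connecting, which a small `-main` wrapper plus an uberjar would amortize.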
@onetom happy to share more details about how we do testing by forking connections if the blog post is not enough
I may release a sample application or Leiningen template at some point
@val_waeselynck: that would be really great!
i was using this function to create an in-memory db with schema to serve as a starting point for forking in tests:
(defn new-conn
  ([] (new-conn db-uri schema))
  ([uri schema]
   ;; note: the original version ignored its `uri` argument and always
   ;; used the global `db-uri`; fixed to use the parameters
   (d/delete-database uri)
   (d/create-database uri)
   (let [conn (d/connect uri)]
     @(d/transact conn schema)
     conn)))
have you released this mock connection as a lib anywhere yet? if it served you well so far, it would make sense to create a lib, no?
actually i would expect cognitect to supply such a solution out of the box if it is a really sound approach as @robert-stuttaford hinted above
@onetom: yes I'll probably roll out a lib soon, just wanted to get some criticism first
My next blog post will be a guided tour of our architecture, so it'll probably cover this in more detail
And I wouldn't be surprised if this was actually the implementation of Datomic in-memory connections
performance is not the biggest win IMO, being able to fork from anything is
including your production database, I do it all the time
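The forking idea discussed here rests on `d/with`, which takes a database value and returns a speculative result that never touches the underlying connection. A minimal sketch (the `:user/email` attribute is a hypothetical example):

```clojure
;; Sketch: "fork" a database value with a speculative write.
;; The production connection `conn` is never written to.
(require '[datomic.api :as d])

(let [db  (d/db conn)                                    ; point-in-time value
      db' (:db-after (d/with db [{:user/email "test@example.com"}]))]
  ;; db' sees the speculative datom; d/db of conn does not
  (d/q '[:find ?e .
         :in $ ?email
         :where [?e :user/email ?email]]
       db' "test@example.com"))
```

A "mock connection" library would essentially wrap this, accumulating successive `d/with` results behind a `Connection`-like interface.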
that's what i was missing from your article. you haven't established a baseline which you are comparing your solution to, so i'm not sure what the alternative approach would be and how much faster it is to use the mock connection
that sounds a bit risky to work with the production fork, no? i always work on restored backups, but our db only takes a few seconds to restore, so that's why it's viable atm
@val_waeselynck: your article is inspirational, will def try that for us as well
why risky? once you fork, it's basically impossible to write to the production connection
(well, granted, the risk is that you forget to fork :p)
@pesterhazy: thank you this encourages me to roll out a lib then
that would be very useful I think
just the mock connection itself would be great as a lib
@onetom: anyway, I'm generally not too worried about accidental writes with Datomic, they're pretty easy to undo
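Undoing an accidental write usually means retracting the datoms that one transaction asserted, which the log API makes mechanical. A hedged sketch (the transaction id is hypothetical; retracting works for ordinary assertions, not schema installs):

```clojure
;; Sketch: undo a transaction by retracting the datoms it asserted.
(require '[datomic.api :as d])

(defn undo-tx!
  "Retract every datom asserted by the transaction `tx-id`."
  [conn tx-id]
  (let [tx (first (d/tx-range (d/log conn) tx-id (inc tx-id)))]
    @(d/transact conn
       (for [{:keys [e a v added]} (:data tx)
             :when added]
         [:db/retract e a v]))))
```

Note this produces a *new* transaction that reverses the old one; the original datoms remain in history, which is exactly how Datomic intends corrections to work.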
@val_waeselynck: your test example is the most heartwarming thing i've seen in a long time. that's how i always hoped to describe integration tests, and now you made it a reality by putting the dot on the i (where i = datomic)
now if someone could build a better deployment strategy for datomic on AWS with live logging, that'd be great too (I just had the prod transactor fail to come up twice, without a way to find out what the problem was; only to work the third time, for no apparent reason)
@val_waeselynck: are you using any datomic wrapper framework, like http://docs.caudate.me/adi/ or something similar?
@onetom: no, never heard of such a framework 😕
quite happy with datomic's api (except for schema definitions)
but then migrations become tricky if you have a source file representing your schema, since the DB itself is not the single place of truth anymore
@onetom @pesterhazy I gotta run but happy to discuss this further; actually it would be really nice if you could persist your main questions and criticisms as comments on the blog post, so others can benefit from it :)
@pesterhazy: that logs rotate from the transactor rather than stream is problematic for me too. it’s made logs totally useless in every instance where our transactors failed in some way
So is it just me or does the Datomic client just never return when submitting a malformed query?
Like this one:
(d/q '[:find (pull ?be [*])
       :where $ ?id
       :where
       [?be :building/building-id ?id]]
     (d/db @conn)
     2370256)
@casperc: this doesn't hang for me:
(defn idents [db]
  (q '[:find ?eid ?a
       :where $
       :where
       [?eid :db/ident ?a]] db))
(->> (new-conn) db idents pprint)
@robert-stuttaford: exactly. you have logs, but only the next day and only in case nothing goes wrong (which is precisely the case where you're not particularly interested in the logs)
it'd be already helpful to be able to specify a logback.xml so you can set up your own logging
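For reference, the kind of logback.xml being asked for here is a standard logback `SyslogAppender` pointed at a syslog endpoint; the host, port, and pattern below are hypothetical placeholders, not real Papertrail values:

```xml
<!-- Sketch of a logback.xml streaming transactor logs to syslog.
     Host/port are placeholders; substitute your provider's endpoint. -->
<configuration>
  <appender name="SYSLOG" class="ch.qos.logback.classic.net.SyslogAppender">
    <syslogHost>logs.example.com</syslogHost>
    <port>514</port>
    <facility>USER</facility>
    <suffixPattern>datomic-transactor: %logger %msg</suffixPattern>
  </appender>
  <root level="INFO">
    <appender-ref ref="SYSLOG" />
  </root>
</configuration>
```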
we use http://papertrailapp.com and it’d be great to use logback’s syslog appender with that
I know that this is possible in principle by hacking the startup scripts, but that's way harder and hit-and-miss than any admin would like
we use papertrail as well
it's great
the other thing the AMIs are missing is the ability to pull in your own libraries (which you require when you use them in transactor fns)
@onetom: Weird. Thanks for testing it though. What are you getting as a return value?
or i was getting this error:
java.lang.Exception: processing rule: (q__23355 ?name ?ip ?cluster-name ?cluster-subdomain), message: processing clause: [$rrs ?subdomain ?ips], message: Cannot resolve key: $rrs, compiling:(ui/dns.clj:74:1)
I am wondering a bit about lookup refs. It looks like they throw an exception when the external id being referenced is not present, which I think is fair. For my use case, I just want the ref to be nil (or not added). Any way to make the lookup ref optional?
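One way to get "optional" behaviour (a sketch, with hypothetical attribute names) is to resolve the lookup ref yourself with `d/entid`, which returns nil when the lookup ref doesn't resolve, and only include the ref attribute when it exists:

```clojure
;; Sketch: only add the ref attribute when the lookup ref resolves.
;; :building/building-id, :order/number, :order/building are hypothetical.
(require '[datomic.api :as d])

(defn maybe-building
  "Entity id for the building with this external id, or nil."
  [db building-id]
  (d/entid db [:building/building-id building-id]))

(defn order->tx
  "Build a tx map, attaching :order/building only when it resolves."
  [db {:keys [number building-id]}]
  (let [ref (maybe-building db building-id)]
    (cond-> {:order/number number}
      ref (assoc :order/building ref))))
```

Transacting the raw lookup ref would throw on a missing id, whereas this pre-resolution simply omits the attribute.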
@pesterhazy: and @robert-stuttaford definitely understand the point around log rotation vs. streaming. Re: launch problems, we did add a section to the docs on troubleshooting common “fails with no info” issues under “Transactor fails to start” here: http://docs.datomic.com/deployment.html#troubleshooting — adding to lib/ and configuring different logging, though, definitely fall under the use case (at least at present) for configuring your own transactor machine vs. using the pre-baked AMI.
We do hear and consider your feedback there, but nothing short term to promise on those topics.
@bkamphaus: I realize it's a larger undertaking, not blaming you
I'm probably going to build an amazon playbook to set up datomic on ec2+dynamo, that should make things a lot easier for folks
those docs are great, @bkamphaus ! thanks for taking note
@casperc: in what context? Query / transaction / entity ?
@pesterhazy: yes please