#datomic
2015-07-17
martinklepsch14:07:37

Do people using Datomic in Clojure projects usually run a separate Datomic instance for development or do you use an in-process version?

martinklepsch14:07:00

@bkamphaus: The bin/maven-install script (and maybe others) don’t have a #!/bin/sh — maybe a good idea to add it?

statonjr14:07:56

martinklepsch++

statonjr14:07:18

I have a bootstrap script that adds it

tcrayford14:07:44

@martinklepsch: local transactor for dev, in memory db for tests
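
A minimal sketch of the in-memory test side of that setup, assuming the datomic.api peer library; the db name and fixture shape are arbitrary:

(require '[datomic.api :as d])

;; a fresh in-memory database per test run; "mem" URIs need no transactor
(def test-uri "datomic:mem://test-db")

(defn with-test-conn
  "Call f with a connection to a throwaway in-memory db."
  [f]
  (d/create-database test-uri)
  (try
    (f (d/connect test-uri))
    (finally
      (d/delete-database test-uri))))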

martinklepsch14:07:33

local transactor = bin/transactor conf.properties, did I get the lingo right?

martinklepsch14:07:16

If you upgrade Datomic, is moving a db from the old version to the new one basically done by backup & restore?

tcrayford14:07:40

you can typically just run the new transactor against the old db

tcrayford14:07:58

several times they've changed the format of what's in storage and handled that via code in the transactor, which means no backup/restore is needed

tcrayford14:07:07

for local storage, that just means copying data out of the old transactor
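
For completeness, the backup & restore route uses the CLI shipped in the Datomic distribution; the URIs and backup path below are placeholders:

# back up the db from the old version's storage
bin/datomic backup-db datomic:dev://localhost:4334/my-db file:/tmp/my-db-backup

# restore it into storage managed by the new version
bin/datomic restore-db file:/tmp/my-db-backup datomic:dev://localhost:4334/my-db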

Ben Kamphaus15:07:39

one thing to note for the memory db is that it’s sufficient for testing ACI but not D aspects of Datomic's ACID semantics. I.e., it has no log, which underlies durability in Datomic.

Ben Kamphaus15:07:59

so you’ll need e.g. local dev storage if you have any testing around Log API, excise, etc.
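
For example, a Log API call only has a real log to read when the db lives in durable storage; a sketch against local dev storage, with placeholder URI and t value:

(require '[datomic.api :as d])

;; dev storage (local transactor) keeps the log that backs durability
(def conn (d/connect "datomic:dev://localhost:4334/my-db"))

;; lazily read every transaction from t 1000 (placeholder) to the end of the log
(def txs (d/tx-range (d/log conn) 1000 nil))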

Lambda/Sierra15:07:00

@martinklepsch @tcrayford The on-disk format has changed extremely rarely in the past, and is documented in release notes. Most transactor upgrades are just restarting the process with a new JAR.

tcrayford15:07:08

just the same as prod 🙂

val_waeselynck16:07:48

Regarding a question I asked a few days ago about schema declaration: What do you think of using transaction functions to make declaring schema attributes less tedious? Here's a POC: https://gist.github.com/vvvvalvalval/fe16f475b1656f28d4b8
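
For context, this is roughly the plain (circa-2015) form the gist abstracts over, one verbose map per attribute; the attribute itself is made up and conn is assumed to be an open connection:

(require '[datomic.api :as d])

;; one attribute, spelled out the long way
@(d/transact conn
   [{:db/id                 (d/tempid :db.part/db)
     :db/ident              :user/email
     :db/valueType          :db.valueType/string
     :db/cardinality        :db.cardinality/one
     :db/unique             :db.unique/identity
     :db/doc                "The user's email address"
     :db.install/_attribute :db.part/db}])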

Ben Kamphaus16:07:36

the use case is a matter of opinion I’ll stay out of 🙂 but re: making it a transaction function: in general I would avoid tx functions in cases where you don’t need transaction-time access to the database value.

val_waeselynck17:07:35

@bkamphaus: thanks, what do you imagine could go wrong?

Ben Kamphaus17:07:02

@val_waeselynck less “going wrong” and more about best fit. Data munging (producing a schema attr map from defaults/terse names) makes sense to me as library/API code. Transaction functions (the occasional dorky illustrative example aside) are really about enabling logic at transaction time that ensures valid transitions between states a la ACID isolation (using serialized writes as the mechanism), e.g. adding/subtracting from a balance.
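
A sketch of the kind of transaction function Ben means, one that genuinely needs the transaction-time db value to compute a valid next state; :account/balance, :account/credit and account-id are made up:

(require '[datomic.api :as d])

;; reads the current balance from the in-transaction db value, so
;; concurrent credits serialize correctly instead of clobbering each other
(def credit-fn
  (d/function
    '{:lang     "clojure"
      :requires [[datomic.api :as d]]
      :params   [db account amount]
      :code     (let [balance (or (:account/balance (d/entity db account)) 0)]
                  [[:db/add account :account/balance (+ balance amount)]])}))

;; install it under an ident, then call it like any other tx data
@(d/transact conn [{:db/id    (d/tempid :db.part/user)
                    :db/ident :account/credit
                    :db/fn    credit-fn}])

@(d/transact conn [[:account/credit account-id 100]])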

Ben Kamphaus17:07:34

there are use cases for validation or helper functions that make sense as tx functions, b/c they would require transaction-time access to the db value

Ben Kamphaus17:07:29

you can run into issues by over-relying on transaction functions, mainly perf issues: since tx function logic runs serially, a few commonly used tx functions with non-trivial perf characteristics can tank the throughput of a system

Ben Kamphaus17:07:48

though the perf impact is less likely to be a problem with schema install

val_waeselynck17:07:45

Yeah I wouldn't worry about performance in this case ^^

val_waeselynck17:07:40

The thing is, in this case, the code has 2 purposes: being processed by the database (data) and being a reference for the application developer (which makes it a big deal to reduce noise)

tcrayford17:07:33

@val_waeselynck @bkamphaus I'd utter some things about the suitability of abstracting away schema as well. Whilst the default definition is relatively verbose, it's easy to understand for the most part. I'd worry about losing that by moving to a more terse syntax.

val_waeselynck17:07:50

As for the fact that it does not use transaction-time access to the database, I don't really see it as a problem. The documentation itself says other uses for database functions may yet be found. http://docs.datomic.com/database-functions.html

tcrayford17:07:43

Also, depends on your app, but I think in most apps schema changes are relatively rare, and so making them easier isn't a good economic tradeoff (at least for the apps I've worked on)

val_waeselynck17:07:14

@tcrayford: maybe if you have a very good memory 🙂 but I find myself consulting my schema declarations quite often, e.g. to remember attribute names.

tcrayford17:07:56

also depends how big the schema is 😉

val_waeselynck17:07:25

@bkamphaus: I do note your point about not mixing library code with data though

robert-stuttaford17:07:34

val: a dev-time dev/user.clj fn that prints out all your schema should save you some time 🙂

tcrayford17:07:30

@robert-stuttaford: before cursive, I had a keybinding that "autocompleted" schema attributes (easy to wire up with ido-mode or similar)

robert-stuttaford17:07:30

definitely agree with keeping schema simple: data. dsls are nice for writing it, but that’s about it. you lose when you stop keeping it as data. we use .edns with plain datomic schema defs
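
A sketch of that plain-EDN setup, assuming schema.edn on the classpath contains a single vector of attribute maps like the one above (with #db/id[:db.part/db] in place of d/tempid); read-string resolves #db/id via the data_readers.clj bundled with the Datomic peer library:

(require '[clojure.java.io :as io]
         '[datomic.api :as d])

(defn install-schema!
  "Transact the attribute maps found in resources/schema.edn."
  [conn]
  @(d/transact conn (read-string (slurp (io/resource "schema.edn")))))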

val_waeselynck17:07:18

Mmmh still can't think of a situation where this goes wrong. I think I'll be dumbly stubborn and tell you what happens when it gets ugly.

val_waeselynck17:07:27

@robert-stuttaford: do you have an example of such helper? would you mind posting the code?

Ben Kamphaus17:07:51

Something like this should work for grabbing schema; we have a Java example as well: https://github.com/Datomic/datomic-java-examples/blob/master/src/PrintSchema.java

;; assumes (require '[datomic.api :as d]) and an open connection `conn`
(let [db (d/db conn)]
  (clojure.pprint/pprint
    ;; find every installed attribute's entity id, then touch each
    ;; entity so all of its schema datoms are realized for printing
    (map #(d/touch (d/entity db %))
         (d/q '[:find [?a ...]
                :where [_ :db.install/attribute ?a]]
              db))))

timothypratley19:07:58

is there a sensible way to sync datoms to datascript? (query a datomic server and put the datoms into a client-side datascript data structure)
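
(The question goes unanswered in the log, but purely as a sketch of one naive approach: it assumes DataScript's datascript.core namespace, an existing datomic-conn, and a single entity-id worth of datoms, and it ignores refs and schema; in practice the query result would be shipped from the peer to the browser, but DataScript also runs on the JVM, so the sketch stays in one process.)

(require '[datomic.api :as d]
         '[datascript.core :as ds])

;; copy every datom of one Datomic entity into a client-side DataScript conn
(let [db      (d/db datomic-conn)
      datoms  (d/q '[:find ?e ?ident ?v
                     :in $ ?e
                     :where [?e ?a ?v]
                            [?a :db/ident ?ident]]
                   db entity-id)
      ds-conn (ds/create-conn {})]
  (ds/transact! ds-conn
                (for [[e a v] datoms]
                  [:db/add e a v])))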