#datomic
2016-12-09
danielcompton00:12:44

@christianromney does the docker image comply with the license? I thought redistribution of binaries wasn't allowed

christianromney00:12:29

@danielcompton no redistribution. We use an ONBUILD approach to automate some download steps.

christianromney00:12:24

you’ll need to register for a DPS license and then add your key. we just automate curl, etc

shaunxcode00:12:21

is it safe to assume the datomic peer server/client are not using websockets?

shaunxcode00:12:09

hmm wait, nm, I just saw http://docs.datomic.com/architecture.html which does indicate "http + transit" (which means it could be websockets? or REST coupled with SSE?)

karol.adamiec12:12:30

how does one represent a 3D vector [x y z] in Datomic? is db.type/float with cardinality many good enough? i need positions to be fixed, so it works. Going all the way out and defining separate entities for x, y, z seems like a lot… ?

karol.adamiec12:12:37

boils down to: does cardinality many guarantee the order of items?

karol.adamiec12:12:53

i can live with not being able to limit it to 3

rauh12:12:36

@karol.adamiec If you don't query with it: Why not just use (float-array 3 [1. 2. 3.])? And store as bytes

karol.adamiec12:12:37

@rauh sounds good. What would be the literal syntax for that? i mean in an edn file with just data for db seeding?

karol.adamiec12:12:15

and to wrap up the prev q, cardinality many has set semantics.
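
A quick in-memory sketch of that set-semantics point (the attribute and database names are made up, and :db.type/double is used just to match the literal doubles): the duplicate component collapses and ordering is lost, so an [x y z] vector can't round-trip through a cardinality-many attribute.

```
(require '[datomic.api :as d])

(def uri "datomic:mem://card-many-demo")
(d/create-database uri)
(def conn (d/connect uri))

;; hypothetical cardinality-many attribute
@(d/transact conn
  [{:db/id (d/tempid :db.part/db)
    :db/ident :demo/position
    :db/valueType :db.type/double
    :db/cardinality :db.cardinality/many
    :db.install/_attribute :db.part/db}])

;; transact [1.0 2.0 1.0] ...
@(d/transact conn
  [{:db/id (d/tempid :db.part/user)
    :demo/position [1.0 2.0 1.0]}])

;; ... and the duplicate 1.0 collapses; order is not preserved
(d/q '[:find ?v :where [_ :demo/position ?v]] (d/db conn))
;; => #{[1.0] [2.0]}
```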

karol.adamiec12:12:45

@rauh (d/transact conn [{:db/id #db/id[:db.part/user] :part.spec/position (float-array 3 [1. 2. 3.])}]) throws "[F@5b35c7d is not a valid :bytes for attribute :part.spec/position"

karol.adamiec12:12:41

i need to convert somehow
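
A minimal sketch of that conversion (assuming :part.spec/position is declared as :db.type/bytes and conn is an open connection): pack the floats into a byte array with a ByteBuffer on the way in, and unpack on the way out.

```
(import '(java.nio ByteBuffer))

(defn floats->bytes ^bytes [^floats fs]
  (let [bb (ByteBuffer/allocate (* 4 (alength fs)))]
    (.put (.asFloatBuffer bb) fs)   ;; write through a float view of the same buffer
    (.array bb)))

(defn bytes->floats [^bytes bs]
  (let [fb  (.asFloatBuffer (ByteBuffer/wrap bs))
        out (float-array (.remaining fb))]
    (.get fb out)
    out))

@(d/transact conn
  [{:db/id (d/tempid :db.part/user)
    :part.spec/position (floats->bytes (float-array [1.0 2.0 3.0]))}])
```

Note that edn has no literal for byte arrays, so seed data like this has to be built in code rather than written directly in a plain edn file.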

karol.adamiec13:12:52

aww, i think byte arrays will not work over REST

karol.adamiec13:12:07

i get back [:db/id 17592186122653] [:part.spec/position #object["[B" 0x166ad03 "[B@166ad03"]]
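
Given that REST/edn round-trip problem, one alternative sketch (attribute names are hypothetical) is to give up on packing and store the three components as plain double attributes, which costs a little more schema but reads back cleanly:

```
(def position-schema
  (for [axis ["x" "y" "z"]]
    {:db/id (d/tempid :db.part/db)
     :db/ident (keyword "part.spec" (str "position-" axis))
     :db/valueType :db.type/double
     :db/cardinality :db.cardinality/one
     :db.install/_attribute :db.part/db}))

@(d/transact conn position-schema)

@(d/transact conn
  [{:db/id (d/tempid :db.part/user)
    :part.spec/position-x 1.0
    :part.spec/position-y 2.0
    :part.spec/position-z 3.0}])
```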

leov15:12:07

hihi. quick question - can I do range queries on attributes in datomic?

leov15:12:22

say I have [1 :process/name "ls"] [1 :process.env-var/LOCALE ".."] [1 :process.env-var/HOME "/home/.."]

leov15:12:32

can I query all the env-var keys?

leov15:12:46

or I should not model env-vars of a process this way?
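
That modeling can work; a hedged datalog sketch for recovering the env-var keys (conn and process-eid are placeholders) is to ask which asserted attributes on the entity live in the :process.env-var namespace:

```
(d/q '[:find ?k ?v
       :in $ ?p
       :where
       [?p ?a ?v]
       [?a :db/ident ?ident]
       [(namespace ?ident) ?ns]
       [(= ?ns "process.env-var")]
       [(name ?ident) ?k]]
     (d/db conn) process-eid)
```

As for the range-query part of the question: comparison predicates such as [(> ?v 5)] work in :where clauses, and d/index-range scans the AVET index for indexed attributes.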

tjtolton15:12:38

So, is there a good tutorial anywhere on how to sell datomic to your company? key questions to answer being 1) why do we need to change? 2) what happens to all of our existing data?

danielstockton15:12:08

If you need to track history, that helps 1). There isn't really a comparable alternative and if you've tried to do it with another solution, you probably understand the pain.

tjtolton15:12:22

everyone gets along fine when they don't have a choice. I need to convince people that they have pain that they don't even realize they have.

danielstockton15:12:29

don't you have a lot of nasty code and extra tables to track history that you can point to and say 'let's get rid of this?'

danielstockton15:12:08

maybe for 2), you can load the data into datomic and demo something working

tjtolton15:12:23

how does one load relational data into datomic?

danielstockton15:12:10

There was a talk on this subject at the recent conj but it's quite high level: https://www.youtube.com/watch?v=oOON--g1PyU
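
For a concrete flavor of 2), a hedged sketch (the table, columns, attribute names, and connection strings are all made up, and a matching Datomic schema would need to be installed first): read rows out of the relational database with clojure.java.jdbc and transact them in batches as entity maps.

```
(require '[clojure.java.jdbc :as jdbc]
         '[datomic.api :as d])

(def db-spec {:dbtype "mysql" :dbname "legacy" :user "user" :password "pass"})
(def conn (d/connect "datomic:dev://localhost:4334/imported"))

(doseq [batch (partition-all 1000
                (jdbc/query db-spec ["SELECT id, name, email FROM customers"]))]
  @(d/transact conn
     (for [{:keys [id name email]} batch]
       {:db/id (d/tempid :db.part/user)
        :customer/legacy-id id
        :customer/name      name
        :customer/email     email})))
```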

tjtolton15:12:41

awesome, I'll definitely take a look at that

tjtolton16:12:07

@nrako was that in response to me?

tjtolton16:12:30

oh, a mysql transfer, neat!

tjtolton16:12:14

the hell is onyx?

tjtolton16:12:30

I've seen that name popping up lately

nrako16:12:19

Conceptually had the same question, and that example really simplified the data import/export process for me

nrako16:12:57

Onyx is primarily for stream processing

tjtolton16:12:52

@nrako stream processing? like a kafka pipeline?

nrako16:12:23

You can read in data from Kafka, process, and send it along to where it needs to go (https://github.com/onyx-platform/onyx-kafka). Not sure if this answers your question...

nrako16:12:47

So it does not replace Kafka

tjtolton16:12:33

right, didn't mean replace. I more meant that it's well suited for a kafka data stream system

nrako16:12:04

I don't have much experience with it, but everything is data instead of a DSL
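
To give a flavor of that "everything is data" point, an Onyx job is described with plain Clojure data rather than a macro DSL; roughly like this sketch (the task names and function are invented, and a real job also needs input/output plugin entries, lifecycles, and an onyx.api submit call):

```
;; the workflow is just a vector of [from to] task edges
(def workflow
  [[:read-segments :transform]
   [:transform :write-segments]])

;; each task is described by a plain map in the catalog
(def catalog
  [{:onyx/name       :transform
    :onyx/fn         :my.app/transform-segment
    :onyx/type       :function
    :onyx/batch-size 100}])
```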

gdeer8117:12:24

introduced the intern to Datomic this morning and he kept saying he was doing some "Datalogging" when he was writing queries and that a "Datalogger" is someone who writes datalog queries. kids say the darndest things