#datomic
2016-03-29
bhagany02:03:46

you could put that pull expression in a query that takes the eids as a parameter

bhagany02:03:27

I didn't have time to elaborate before, but I mean something like:

(d/q `[:find [(pull ?e ~pull-exp) ...] :in $ [?e ...]] db eids)

bhagany02:03:36

I don't think that syntax quote/unquote will actually work, but hopefully you get the idea
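(For reference: syntax quote namespace-qualifies symbols like `?e`, which is why that form won't read as intended. One way to sidestep quoting entirely is to bind the pull pattern as a query input, which Datomic's query engine supports. A sketch, assuming a peer `db` value and an `eids` collection; `pull-many-via-query` is a made-up helper name:)

```clojure
(require '[datomic.api :as d])

;; sketch: pass the pull pattern in as a query input instead of
;; splicing it in with syntax quote/unquote
(defn pull-many-via-query
  [db pull-exp eids]
  (d/q '[:find [(pull ?e pattern) ...]
         :in $ [?e ...] pattern]
       db eids pull-exp))
```

(If the query engine isn't needed, `(d/pull-many db pull-exp eids)` covers the same ground without a query.)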

currentoor04:03:19

yeah that could work, thanks @bhagany

currentoor04:03:19

and I can just make the pull-exp an argument as well

tomjack04:03:36

why use a query for that?

currentoor05:03:03

@tomjack: I have a very long list of eids and I wanted to test if batching the query together would be more performant.

currentoor05:03:26

I believe the query engine parallelizes stuff under the hood.

currentoor05:03:13

I suppose I could use pmap myself but hopefully I can rely on people smarter than me to do the parallel stuff.

casperc15:03:12

Anyone know of a good way to count the number of entities in the database?

bostonaholic16:03:17

@casperc: (count (distinct (map :e (d/datoms db :eavt))))

bostonaholic16:03:48

^^ that will count ALL entities, including entities which describe the schema and datomic structure itself

casperc18:03:34

@bostonaholic: Thanks, I guess I was hoping to be able to count just the “user entities”, but this is close enough for now 🙂

gworley318:03:41

hi. i'm trying to run the datomic transactor on an m4 aws instance but getting errors like this: https://groups.google.com/forum/#!topic/datomic/IXsSUqMkgGo

gworley318:03:47

unfortunately that doesn't seem to quite suggest a solution for me. in this case i'm just trying to use the dev adapter but seem to be having host problems

gworley318:03:46

i got around this before by having a fixed dns entry that referenced the machine, but in this case i can't do that because this is on a testing instance that is self-contained, and many copies of it will spin up/down

matthavener18:03:50

gworley3: what are your host and alt-host params in your transactor.properties file?

bkamphaus18:03:14

@casperc: you can modify what @bostonaholic provided and use :aevt :user/id (replace with whatever attribute will limit results to those entities). You can also write a query and use the count aggregate.
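(Spelled out, that :aevt variant might look like this sketch, assuming :user/id is an attribute only your “user entities” carry:)

```clojure
;; count only the entities that have a :user/id datom, via the AEVT index
(count (distinct (map :e (d/datoms db :aevt :user/id))))
```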

bkamphaus18:03:23

[:find (count ?e)
 :where
 [?e :user/id]]

gworley319:03:27

@matthavener: host=localhost. alt-host not set

gworley319:03:10

i tried setting to 0.0.0.0 but didn't help.

matthavener19:03:56

are you using datomic free (h2 storage) with your tests?

gworley319:03:04

no, i have our license key put in

gworley319:03:15

and i'm setting to dev

matthavener19:03:35

from the exception, it looks like H2 storage org.h2.jdbc.JdbcSQLException

matthavener19:03:23

i’ve had issues with that when the host doesn’t know its ‘real’ IP (docker).. i had luck setting host=0.0.0.0 and alt-host=<real host ip>
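(In transactor.properties terms that workaround is roughly the following; the IP shown is a placeholder for whatever address peers can actually reach:)

```
# bind on all interfaces inside the container/instance
host=0.0.0.0
# advertise the externally reachable address to peers (placeholder IP)
alt-host=203.0.113.10
```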

matthavener19:03:35

but that was with H2 storage, which is why I asked

gworley320:03:16

ok, i'll try that

gworley320:03:10

@matthavener: hmm, none of that seems to work whether i use public or private ip addresses or if i use aws supplied dns entries for those two or not. same effect if i use host or alt-host

gworley320:03:13

the (unfortunate) solution seems to be to modify /etc/hosts (i did this and it worked)

bkamphaus20:03:13

@gworley3: in general, I don’t work with dev transactors on ec2 instances, but maybe worth noting that the Datomic-generated AWS CloudFormation template sets host and alt-host as follows:

"host=`curl http://169.254.169.254/latest/meta-data/local-ipv4`",
"alt-host=`curl http://169.254.169.254/latest/meta-data/public-ipv4`",

gworley320:03:38

i'm actually not too picky here. i was only using dev because it seemed easiest. i'm trying postgres now to see if that gets around the issue since i'm already running it on these boxes anyway

gworley320:03:04

but i might try that if this doesn't work. thanks!

gworley320:03:28

okay, looks like using the sql protocol with postgres works. thanks @matthavener for pointing out the h2 issue; avoiding it got around this
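(For anyone following along, the sql-protocol transactor settings are along these lines; every value below is a placeholder, and the jdbc url, user, and password depend on your postgres setup:)

```
protocol=sql
sql-url=jdbc:postgresql://localhost:5432/datomic
sql-user=datomic
sql-password=datomic
```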