#datomic
2018-08-27
Petrus Theron08:08:59

Are there any long-term future plans to allow extending Datomic with custom data types, such as sparse matrices to do the matrix math needed for collaborative filtering, or is this something that could be handled by database functions already + caching?

stuarthalloway14:08:27

@U051SPP9Z the original design of Datomic included consideration of custom data types, but this is not an area of active development. Always happy to learn about your use cases.

marshall13:08:44

@henrik fixed - thanks!

👍 1
manutter5114:08:19

Any conformity people here? I’m trying to use c/ensure-conforms to set up my db schema, and I’m getting “:db.error/not-a-data-function Not a data function: 10”. I don’t have “10” anywhere in my norms, and there’s no reference to any conformity code in the stack trace. This is on an in-memory db (datomic on-prem), so no previous schema in the db.

manutter5114:08:43

Here’s the norm:

{::fz.location
 {:txes [{:db/ident :fz.location/name
          :db/valueType :db.type/string
          :db/cardinality :db.cardinality/one
          :db/doc "Location name"}]}}

manutter5115:08:12

Co-worker helped me figure this out: :txes has to be a vector of vectors of maps, not just a vector of maps.
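For anyone hitting the same error, the corrected norm would look like this (applying the fix described above: each entry in :txes is itself a vector of tx data, hence the extra level of nesting):

```clojure
;; :txes is a vector of transactions, and each transaction is a
;; vector of tx-data maps, so the schema map needs double brackets:
{::fz.location
 {:txes [[{:db/ident       :fz.location/name
           :db/valueType   :db.type/string
           :db/cardinality :db.cardinality/one
           :db/doc         "Location name"}]]}}
```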

timgilbert17:08:38

Hey, datalog question here. I have entities with a cardinality-many keyword attribute :x/colors. I want to write a datalog query that accepts a set of color keywords, and returns every entity for which the entire set of :x/colors for the entity is the same as the set I passed in.

timgilbert17:08:55

I'm a little confused about how to do this - using collection binding I only match on a single :color at a time

Mark Addleman18:08:10

Yes, this is the typical approach. You can accomplish this a couple of ways. You could issue one query per element in the set, or you can pass the set into an iterable binding (I don’t think that’s the official term) like this: :in [?color ...]
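A minimal sketch of that binding form (attribute and value names taken from the thread; Datomic calls this a collection binding):

```clojure
;; [?color ...] binds ?color to each element of the input collection
;; in turn, so this matches entities having ANY of the given colors,
;; not entities whose full color set equals the input.
(d/q '[:find ?e
       :in $ [?color ...]
       :where [?e :x/colors ?color]]
     db [:red :blue])
```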

timgilbert19:08:04

The problem with that is that I only get, eg, :blue on any given match, and in that context I don't know whether :red was also passed in the input set

dustingetz19:08:49

You can pass in a set. This was asked on the Datomic forum; I’ll try to find the link.

dustingetz20:08:13

The secret sauce is calling clojure.core/= for set equality
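One way to sketch that set-equality approach, combining it with a helper like the get-colors fn shown later in the thread (foo/get-colors is an assumed namespace, not an official recipe):

```clojure
;; Build the entity's full color set, then require exact equality
;; with the input set via a clojure.core/= predicate clause.
(d/q '[:find ?e
       :in $ ?search
       :where
       [?e :x/colors]
       [(foo/get-colors $ ?e) ?colors]
       [(clojure.core/= ?colors ?search)]]
     db #{:red :blue})
```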

timgilbert21:08:44

Thanks, I'll take a look at that

timgilbert17:08:41

And then in the context of a datalog clause I'm not sure how to get the entire set of :x/colors for a single entity e.

bmaddy18:08:45

I'm not sure how the iterable binding approach would work, but I'd probably do something like this:

(defn get-colors
  "Return the set of :x/colors values on entity eid."
  [db eid]
  (->> (d/pull db '[:x/colors] eid)
       :x/colors
       set))

(d/q '[:find ?e
       :in $ ?search
       :where
       [?e :x/colors]
       [(foo/get-colors $ ?e) ?search]]
     (d/db conn) #{:blue :green})

timgilbert19:08:43

I can see how that would work, but I'm still hoping for a way to do it inside of the query itself

timgilbert19:08:34

I guess I'm hoping for a way to write an (every?) predicate or something

timgilbert19:08:07

I found this StackOverflow answer which seems like it might be heading in the right direction: https://stackoverflow.com/questions/23352421/for-all-in-datalog

cap10morgan19:08:21

Is it possible to do a Datomic restore across AWS accounts? I've given the other account (i.e. the one that owns the destination dynamo table) access to the backup bucket in S3, and I can do aws s3 ls from the CLI w/ those creds, but Datomic's bin/datomic restore-db still crashes with S3 access errors.

marshall19:08:08

you’ll need to make sure the role/credentials used by the restore-db call (peer application) have read permissions to the s3 bucket

marshall19:08:22

that should be all that’s required

marshall19:08:45

keep in mind, aws s3 ls isn’t the same as being able to read all the objects in that bucket

marshall19:08:58

IAM has separate permissions for getting objects and listing bucket contents
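To illustrate the distinction, a minimal read policy might look like this (bucket name is a placeholder): s3:ListBucket applies to the bucket ARN itself, while s3:GetObject applies to the objects under it.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-backup-bucket"
    },
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-backup-bucket/*"
    }
  ]
}
```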

cap10morgan20:08:37

@marshall OK I think you clued me into the right thing here. I need to set this up using IAM, not ACLs.

marshall20:08:12

yeah, it may be possible with ACLs, but I think you’ll have better luck / more control with IAM

👍 1
marshall20:08:09

also, I believe S3 ACLs are a legacy tool that pre-dates IAM or bucket policies

cap10morgan20:08:00

Hmm, the only stuff I'm seeing around this involves assuming roles. I'm not having any luck combining that with a Datomic restore-db command.

marshall20:08:32

run the instance with a role

marshall20:08:45

whatever instance you’re running the restore job on

marshall20:08:27

alternatively, give the role to a user

marshall20:08:32

and run with that user’s credentials

marshall20:08:40

i.e. if you’re running the restore on your local machine

cap10morgan20:08:26

Is there a way to give a role to a user other than aws sts assume-role? I tried that, then tried using the creds it gave me, but those aren't working.

marshall20:08:16

you can assign a user to a group and put the role on that group

marshall20:08:47

actually i may be thinking of policies

cap10morgan20:08:51

Oh, I think I may have figured it out. I needed to set the AWS_SESSION_TOKEN env var too. And now it looks like I need to give this role permission to write to the destination dynamo table b/c it drops existing privs when you assume the role.

cap10morgan20:08:51

Hmm, no. That doesn't work either. It tries to restore to the DynamoDB table in the account that the assumed role gave it access to. So I'm back at square one. I'm not sure how to grant one side of a restore access to another account's S3 bucket and the opposite side access to the primary account's DynamoDB table. 😕

cap10morgan23:08:15

Using a bucket policy that gave permissions to the other account in the Principal field fixed this ^^^. And with no more need to mess with assuming roles.
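For reference, a cross-account bucket policy along those lines might look like the following (account ID and bucket name are placeholders): the Principal grants the other account's identities read access to the backup bucket without any role assumption.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:root" },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::my-backup-bucket"
    },
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::123456789012:root" },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-backup-bucket/*"
    }
  ]
}
```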