#pathom
2019-11-28
Björn Ebbinghaus11:11:05

@wilkerlucio I am trying to update the env in a mutation, so that I get the new db in the mutation-join. When I return just {::p/env env} I get a NullPointerException from the reader. Is there anything I have to consider where to update the env?
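(For readers of the archive: returning ::p/env from a connect mutation is how you ask Pathom to use an updated environment when parsing the mutation join. A minimal sketch of the pattern being discussed — `transact!` and the `:db` key are hypothetical placeholders for the app's own storage layer:)

```clojure
;; Hypothetical sketch of a connect mutation that augments env so the
;; mutation join runs against the post-transaction database value.
(pc/defmutation new-proposal [env params]
  {::pc/sym    'decide.model.proposal/new-proposal
   ::pc/output [:proposal/id]}
  (let [db-after (transact! env params)] ; placeholder side effect
    ;; Returning ::p/env tells Pathom to use this env (carrying the
    ;; new db) when it processes the join on the mutation's result.
    {:proposal/id (:id params)
     ::p/env      (assoc env :db db-after)}))
```

As the thread below shows, this exact pattern triggered a NullPointerException in mutation context until it was fixed.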

Björn Ebbinghaus11:11:04

The Exception:

#:decide.model.proposal{new-proposal #:com.wsscode.pathom.parser{:error #error {
 :cause nil
 :via
 [{:type java.lang.NullPointerException
   :message nil
   :at [com.wsscode.pathom.core$join invokeStatic "core.cljc" 423]}]
 :trace
 [[com.wsscode.pathom.core$join invokeStatic "core.cljc" 423]
  [com.wsscode.pathom.core$join invoke "core.cljc" 361]
  [com.wsscode.pathom.core$join invokeStatic "core.cljc" 370]
  [com.wsscode.pathom.core$join invoke "core.cljc" 361]
  [com.wsscode.pathom.connect$mutate_async$fn__28020$fn__28104$state_machine__8372__auto____28135$fn__28138 invoke "connect.cljc" 1516]
  [com.wsscode.pathom.connect$mutate_async$fn__28020$fn__28104$state_machine__8372__auto____28135 invoke "connect.cljc" 1511]
  [clojure.core.async.impl.ioc_macros$run_state_machine invokeStatic "ioc_macros.clj" 973]
  [clojure.core.async.impl.ioc_macros$run_state_machine invoke "ioc_macros.clj" 972]
  [clojure.core.async.impl.ioc_macros$run_state_machine_wrapped invokeStatic "ioc_macros.clj" 977]
  [clojure.core.async.impl.ioc_macros$run_state_machine_wrapped invoke "ioc_macros.clj" 975]
  [com.wsscode.pathom.connect$mutate_async$fn__28020$fn__28104 invoke "connect.cljc" 1511]
  [clojure.lang.AFn run "AFn.java" 22]
  [java.util.concurrent.ThreadPoolExecutor runWorker "ThreadPoolExecutor.java" 1128]
  [java.util.concurrent.ThreadPoolExecutor$Worker run "ThreadPoolExecutor.java" 628]
  [clojure.core.async.impl.concurrent$counted_thread_factory$reify__3036$fn__3037 invoke "concurrent.clj" 29]
  [clojure.lang.AFn run "AFn.java" 22]
  [java.lang.Thread run "Thread.java" 834]]}}}

wilkerlucio12:11:05

@U4VT24ZM3 thanks for the report, I think it's a bug, I never used that in a mutation context

wilkerlucio12:11:22

checking on it

Björn Ebbinghaus12:11:00

Glad I could help. And good to know that I don't have to search any further for what I have done wrong.

wilkerlucio12:11:41

just found it 🙂

wilkerlucio12:11:14

just gonna write some tests to cover it and send the fix soon

wilkerlucio12:11:32

are you using deps? if you are, can you try pulling from master and see if it works for you?

wilkerlucio13:11:31

thanks, I'll cut a new release soon

magra13:11:47

Hi, is there an alias-resolver that switches the E and V in EAV? I want to declare that

person :person/notes note
is equivalent to or can be resolved via
note :note/person person
Is this even possible?

Björn Ebbinghaus13:11:37

@magra Is this a pathom question? In Datomic you can make reverse lookups. https://docs.datomic.com/cloud/query/query-pull.html#reverse-lookup

magra13:11:05

Oh, it works, but it hits the database twice.

magra13:11:19

I write lots of queries like [:person/id {:person/notes [:note/id :note/person]}] because they preload the normalized DB in the app with the 'backlinks'. At the moment my resolver queries the db for :note/_person, dissocs that, and then assocs it as :person/notes. But half the time it hits the DB again to get :note/person because it does not 'see' the relationship between :note/person and :note/_person.
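(A sketch of the resolver being described, using the attribute names from the messages above — :note/_person is Datomic's reverse-lookup syntax, and the renaming step is what hides the relationship from Pathom:)

```clojure
;; Sketch: resolve a person's notes via Datomic's reverse lookup,
;; renaming :note/_person to :person/notes so the client query
;; [:person/id {:person/notes [:note/id :note/person]}] can be served.
(pc/defresolver person-notes [{:keys [db]} {:person/keys [id]}]
  {::pc/input  #{:person/id}
   ::pc/output [{:person/notes [:note/id]}]}
  (let [pulled (d/pull db [{:note/_person [:note/id]}] [:person/id id])]
    {:person/notes (:note/_person pulled)}))

;; Pathom has no way to know that each note's :note/person is simply
;; the person we started from, so it issues a second DB hit to
;; resolve :note/person for every note.
```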

magra13:11:24

Maybe I am thinking wrong. But the E in EAV is always implicit.

wilkerlucio13:11:26

@magra I think to get efficient with that, the ideal would be to send the reverse lookup directly to Datomic as part of the pull request. Have you tried https://github.com/wilkerlucio/pathom-datomic ? still very very alpha, but I think it supports that, have to try and check

magra13:11:11

It has been some time since I checked that. At the moment I do send it directly to the db. If that is the most efficient I will keep it that way. Thanks!!

wilkerlucio13:11:42

that's more true for Datomic Cloud, on-prem it makes little difference

magra13:11:43

I will study pathom-datomic.

wilkerlucio13:11:27

yeah, I'm excited for the new query planner I'm working on, it will be a great improvement to the dynamic resolvers integration story (that includes Datomic, GraphQL, Pathom <> Pathom, SQL...)

❤️ 1
Mark Addleman13:11:30

What's the Pathom story around queries with aggregations? I'm working on an analytics product where the user can create arbitrary queries to slice and dice data. We're suffering with my badly designed DSL to describe aggregate queries and I'd love to switch to something much better thought through and battle tested

wilkerlucio13:11:35

Pathom com.wsscode/pathom "2.2.27" was just released, this fixes a bug when trying to augment env with connect mutations, and also adds support for docstrings (they become data on the resolver map) on pc/defresolver

wilkerlucio13:11:17

hello @mark340, I personally don't have much experience doing it, but I would use EQL parameters to do it, they are an open dimension of information you can use to describe anything you want, you can take Walkable's interface as inspiration: https://walkable.gitlab.io/aggregators.html
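(EQL parameters attach arbitrary data to a query element; a purely hypothetical aggregate query in roughly the Walkable style — all keywords here are invented for illustration:)

```clojure
;; Hypothetical: the parameter map in the list is passed through by
;; Pathom untouched; your resolver decides how to interpret it.
'[{(:order/stats {:group-by  [:order/day]
                  :aggregate {:order/total :sum}
                  :filter    [:= :order/status "paid"]})
   [:order/day :order/total]}]
```

Inside a Pathom 2 resolver the parameters are available on the parsed AST in env, e.g. `(get-in env [:ast :params])`.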

wilkerlucio13:11:46

one suggestion I have if you go in that direction: use namespaced keywords in your interface definition (the names of the parameters), this way you keep it open to integrate with other definitions you may want to use in the future

Mark Addleman13:11:31

Cool. I'll check this out. Thanks!

Mark Addleman13:11:12

One more thing: For the most part, our schema is arbitrary key-value pairs. In order to support this in Pathom, I guess I'd write my own resolver that knew how to convert an EDN key-value pair into the appropriate SQL addressing?

wilkerlucio13:11:08

there's some room for options in this space, it really depends how you want to go about modeling it

wilkerlucio13:11:55

so just to see if we are on the same page, your target database is some SQL database, is that correct?

Mark Addleman13:11:03

Yep. More specifically, our tables have arbitrarily named columns. It's not too weird from the database perspective. It's just that the UI queries are metadata driven

wilkerlucio13:11:18

given this, you want a solution that automatically makes all the table options available, or something more controlled on the API side?

Mark Addleman13:11:23

What does "table options" mean? Generally, we're trying to provide a high degree of flexibility to the API

wilkerlucio13:11:12

I mean, you can try to do things like "introspect my table schema, and generate the resolvers for all of it", or more like: "ok, I'm going to give the user this specific list with these specific fields, and via params I'll customize just filter/aggregation on top of this specific thing"

Mark Addleman13:11:38

Yes. Much more like introspection.

Mark Addleman13:11:44

But perhaps the implementation matters

Mark Addleman13:11:03

We have a metadata that describes all the columns available to the user

Mark Addleman13:11:10

*metadata table

wilkerlucio13:11:10

yup, the introspection version is much harder to get right

Mark Addleman13:11:22

Unlike naive introspection using the db's information schemas, we have a very rich understanding of the columns, their types, etc
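(The introspection route being discussed could be sketched like this — entirely hypothetical names; `query-table` and the shape of the metadata are placeholders, and Pathom 2 connect resolvers are plain maps with ::pc/sym, ::pc/input, ::pc/output, and ::pc/resolve:)

```clojure
;; Hypothetical: generate one resolver per table from a rich metadata
;; seq like [{:table "sales" :columns ["region" "total"]} ...].
(defn table->resolver [{:keys [table columns]}]
  {::pc/sym     (symbol (str table "-resolver"))
   ::pc/input   #{(keyword table "id")}
   ::pc/output  (mapv #(keyword table %) columns)
   ::pc/resolve (fn [env input]
                  ;; placeholder: run the actual SQL for this table
                  (query-table env table input))})

(def registry (mapv table->resolver metadata-table))
```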

wilkerlucio13:11:32

I guess it may be easier if we start talking more concretely, hehe, what kinds of queries would you like to support in the system?

wilkerlucio13:11:07

or putting more simply, what are the user inputs?

Mark Addleman13:11:17

The user inputs are group-by columns, aggregate columns and functions, and a where clause. the one complication is that the where clause can reference another table that must be joined in

Mark Addleman14:11:19

Oh and time range
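(Putting those inputs together — group-by columns, aggregate columns and functions, a where clause, and a time range — a hypothetical translation into one parameterized EQL join might look like this; every keyword is made up for illustration:)

```clojure
;; Hypothetical: turn the UI's query description into EQL, pushing the
;; whole shape into parameters for a single SQL-backed resolver.
(defn ui-input->eql
  [{:keys [group-by aggregates where time-range]}]
  [{(list :report/rows {:group-by   group-by
                        :aggregates aggregates
                        :where      where
                        :time-range time-range})
    (into group-by (map :column aggregates))}])

;; Example call:
;; (ui-input->eql {:group-by   [:sale/day-of-week]
;;                 :aggregates [{:column :sale/total :fn :sum}]
;;                 :where      [:= :customer/region "EU"]
;;                 :time-range {:from #inst "2019-11-01"}})
```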

wilkerlucio14:11:35

ok, so it's some sort of graphical SQL interface

Mark Addleman14:11:40

The generated SQL can be complicated due to UI niceties. For example, the user may ask for some aggregation grouped by day of week. If the data happens not to have Sunday, for example, we still want the

Mark Addleman14:11:01

result set to include Sunday with some default value (null or zero)
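(The "include Sunday even when the data lacks it" requirement doesn't have to live in the generated SQL; it can also be a post-processing step. A small sketch in plain Clojure, with invented row shapes:)

```clojure
;; Merge aggregated rows with a default row for every expected group
;; value, so days missing from the data still appear with a default.
(def days ["Mon" "Tue" "Wed" "Thu" "Fri" "Sat" "Sun"])

(defn fill-missing-groups [rows group-key default]
  (let [by-day (into {} (map (juxt group-key identity)) rows)]
    (mapv #(get by-day % (assoc default group-key %)) days)))

;; (fill-missing-groups [{:day "Mon" :total 10}] :day {:total 0})
;; returns seven rows: Mon keeps :total 10, the other days get :total 0
```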

wilkerlucio14:11:34

seems like Walkable is a good option for you, the bad part is that it currently doesn't integrate with Connect, so you don't get the graph traversal and auto-complete things

wilkerlucio14:11:07

but for a lot of the things you are describing, it seems to have that already: aggregations, joins, etc...

wilkerlucio14:11:21

so you can convert your user input directly into some EQL representation supported by Walkable, and let it run it

Mark Addleman14:11:24

Ok. Auto complete isn't a requirement - at least, that is being handled by a separate system

Mark Addleman14:11:40

That sounds promising. I'll investigate.

Mark Addleman14:11:43

Thanks a ton

🙏 1