This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2020-02-06
Channels
- # announcements (2)
- # architecture (2)
- # aws (18)
- # babashka (7)
- # beginners (149)
- # bristol-clojurians (4)
- # calva (11)
- # chlorine-clover (1)
- # cider (8)
- # clj-kondo (2)
- # cljdoc (2)
- # cljsrn (2)
- # clojure (186)
- # clojure-canada (3)
- # clojure-europe (3)
- # clojure-gamedev (5)
- # clojure-italy (1)
- # clojure-nl (13)
- # clojure-norway (4)
- # clojure-spec (25)
- # clojure-uk (32)
- # clojurescript (75)
- # core-async (2)
- # cursive (16)
- # data-science (3)
- # datomic (20)
- # docker (1)
- # emacs (26)
- # fulcro (7)
- # graphql (1)
- # incanter (1)
- # leiningen (1)
- # luminus (7)
- # malli (7)
- # mount (11)
- # off-topic (19)
- # pathom (15)
- # re-frame (9)
- # reagent (9)
- # remote-jobs (4)
- # ring-swagger (4)
- # shadow-cljs (63)
- # spacemacs (11)
- # sql (2)
- # vscode (7)
is there a way to limit concurrency for the parallel parser? A large query I have immediately calls a single resolver 900 times. Processing in the resolver is limited to a number of threads, so most of the calls end up timing out.
it depends on which query I run, though, so I'm not sure I want to make per-query resolver adjustments
another issue is that my resolver is already a batch resolver; it just gets called for 900 separate batches
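(A common JVM-level workaround for this kind of problem is to bound concurrency inside the resolver itself with a semaphore. The sketch below is not part of Pathom's API; `process-batch` and `max-concurrent` are hypothetical placeholder names.)

```clojure
(import 'java.util.concurrent.Semaphore)

;; Allow at most `max-concurrent` batches to run at the same time.
(def max-concurrent 8)
(def gate (Semaphore. max-concurrent))

(defn bounded-process-batch
  "Wraps the existing (hypothetical) `process-batch` so calls beyond
  the limit block until a permit frees up, instead of piling onto an
  exhausted thread pool and timing out."
  [inputs]
  (.acquire gate)
  (try
    (process-batch inputs)
    (finally
      (.release gate))))
```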
I've tried increasing ::pp/key-process-timeout, but individual resolver calls seem to time out regardless.
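(For context, a minimal sketch of where `::pp/key-process-timeout` is typically supplied when building a Pathom 2 parallel parser; the empty register and the 60000 ms value are illustrative, not from this conversation.)

```clojure
(ns example.parser
  (:require [com.wsscode.pathom.core :as p]
            [com.wsscode.pathom.connect :as pc]
            [com.wsscode.pathom.parser :as pp]))

(def parser
  (p/parallel-parser
    {::p/env     {::p/reader [p/map-reader
                              pc/parallel-reader
                              pc/open-ident-reader]
                  ;; per-key processing timeout, in milliseconds
                  ::pp/key-process-timeout 60000}
     ::p/plugins [(pc/connect-plugin {::pc/register []})]}))
```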
@ak407 is this related to the changes you made? Can you send a smaller example that demonstrates the problem you are facing?
@wilkerlucio I'm fairly sure it isn't related to my PR, since it's also happening with non-batch resolvers. I'll try to create a small example, but I'll probably only get to it on the weekend.
I think I have lots of batches because of my joins. A future parser could probably optimize this too and prepare a single batch across multiple levels of joins (though that probably isn't always the right thing to do for performance).
@kenny you would have a problem if you tried to send it across a boundary, but if you get it and use it in the same process, it can work
but that would just be a value that happens to be an input stream; Pathom will not do anything special with it