This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2017-09-22
Channels
- # beginners (104)
- # bitcoin (1)
- # boot (5)
- # clara (3)
- # cljs-dev (14)
- # cljsjs (5)
- # cljsrn (1)
- # clojure (242)
- # clojure-italy (17)
- # clojure-news (13)
- # clojure-norway (3)
- # clojure-russia (101)
- # clojure-spec (41)
- # clojure-uk (87)
- # clojurescript (38)
- # core-async (38)
- # cursive (6)
- # datomic (11)
- # defnpodcast (3)
- # docs (14)
- # editors (8)
- # events (1)
- # fulcro (7)
- # hoplon (25)
- # leiningen (4)
- # luminus (7)
- # off-topic (25)
- # onyx (1)
- # portkey (14)
- # random (1)
- # re-frame (7)
- # reagent (4)
- # rum (4)
- # schema (8)
- # shadow-cljs (257)
- # spacemacs (10)
- # specter (4)
- # unrepl (3)
- # yada (1)
Thanks @alexmiller, this is really helpful.
https://pastebin.com/zsJf4N9L how can I start the next sync AFTER the first one? How do I know the first one has finished?
@kwladyka the first dosync doesn't close until its body completes, but put! is async
oh, wait, you are using >!!
- so it doesn't return until all the messages are put
(archai/fetch-epoch #(>!! out-elastic %) input)
- this line isn't a single >!!; it does this many times inside that function
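A minimal standalone sketch of the distinction being discussed (not the original code): put! returns immediately, while >!! blocks the calling thread until the value is taken or buffered:

```clojure
(require '[clojure.core.async :as async :refer [chan put! >!! <!!]])

(def c (chan)) ; unbuffered channel

;; put! is asynchronous: it returns right away, even though
;; nobody has taken the value yet (it is queued as a pending put).
(put! c :a)

;; >!! is blocking: on an unbuffered channel it parks the calling
;; thread until a consumer takes the value, so we run it on another
;; thread here to avoid blocking the REPL.
(async/thread (>!! c :b))

(<!! c) ; takes :a (the pending put! above)
(<!! c) ; takes :b, which unblocks the >!! on the other thread
```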
so at the right moment I have to run:
(doseq [epoch epochs]
  (>!! in-archai {:url (archai/make-url {:stream "BAR"
                                         :epoch epoch})}))
if I run it immediately after the first one, there can be a situation where the first one is still processing and doing >!! on out-elastic, and the second one starts doing the same thing at the same time.
@kwladyka what about making two pipeline-async calls, closing the input channel on the first after the first doseq, then waiting on its output (which means all buffered results have completed), then starting up another one to use in the second doseq
by waiting on its return value, right?
it will close the to channel - so I guess you also need a second to-channel
and iirc it returns its to-channel (for convenience)
i tried something like that (<!! (pipeline-async workers-archai out-elastic archai->elastic in-archai))
but it hangs here
did you close its input channel when you finished the puts?
it doesn't close its output until the input is closed and all results from it are delivered
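A small self-contained sketch of that closing behavior (illustrative names, not the original code): the output channel of pipeline-async only closes after the input channel is closed and every in-flight result has been delivered, so draining the output until it closes tells you all work is done:

```clojure
(require '[clojure.core.async :as async :refer [chan >!! <!! close! pipeline-async]])

(def in  (chan))
(def out (chan))

;; The async transform gets [value result-channel]; it must put its
;; results on the result channel and then close it.
(defn af [v result]
  (async/put! result (* v 10))
  (close! result))

(pipeline-async 4 out af in)

(doseq [v [1 2 3]]
  (>!! in v))
(close! in) ; without this, out never closes

;; Drain out until it closes; only then has all buffered work finished.
(loop []
  (when-some [v (<!! out)]
    (println "got" v)
    (recur)))
```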
oh… so maybe after
(doseq [epoch epochs]
  (>!! in-archai {:url (archai/make-url {:stream "FOO"
                                         :epoch epoch})}))
I should close the channel. The inputs inside will keep processing, and after that (pipeline-async) will unblock the <!!, i.e. it finishes after processing the whole buffered queue, not immediately after the channel is closed?
right - you probably need to reorder things somewhat
and it's slightly more complicated because if you close in-archai that means it will close out-elastic, and the function that consumes from out-elastic is the one that will detect that
the more things we need to duplicate here, the more it looks like you need to make a function that gets called twice instead of doing things twice in one function
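A hedged sketch of that refactor, one function per stage that can simply be called twice. All names here (run-stage, start-elastic-workers!, workers-archai, archai->elastic) are illustrative assumptions, not from the original code; in particular start-elastic-workers! is assumed to return a channel that closes once out is drained:

```clojure
(defn run-stage [stream epochs]
  (let [in   (async/chan)
        out  (async/chan)
        done (start-elastic-workers! out)] ; hypothetical: chan that closes when consumers finish
    (async/pipeline-async workers-archai out archai->elastic in)
    (doseq [epoch epochs]
      (async/>!! in {:url (archai/make-url {:stream stream :epoch epoch})}))
    (async/close! in)  ; out will close once all in-flight work drains
    (async/<!! done))) ; block until the consumers of out have finished

;; one call per stream, strictly one after the other
(run-stage "FOO" epochs)
(run-stage "BAR" epochs)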
BTW (dotimes [_ workers-elastic]
do you think this method of creating N workers is OK? Any better way?
it's how I'd do it
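For reference, a sketch of that dotimes pattern, N identical workers consuming from out-elastic until it closes (process! is a hypothetical stand-in for whatever each worker does with a value):

```clojure
(dotimes [_ workers-elastic]
  (async/go-loop []
    ;; <! returns nil once out-elastic is closed and empty,
    ;; which ends the loop and lets the worker exit.
    (when-some [v (async/<! out-elastic)]
      (process! v)
      (recur))))
```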
(<!! (pipeline-async workers-archai out-elastic archai->elastic in-archai))
hmm, this one returns immediately when I (close! in-archai), not waiting for the end of out-elastic
oh, hmm...
in the right situation I should also know when the workers will finish processing out-elastic, not only when out-elastic is empty
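One way to know that (a sketch with hypothetical process!): keep the channel each go-loop returns. A go-loop's channel closes when the loop's body finishes, i.e. when out-elastic is closed and fully drained, so waiting on all of them means every worker is done:

```clojure
(let [worker-chans
      (doall
       (for [_ (range workers-elastic)]
         (async/go-loop []
           (when-some [v (async/<! out-elastic)]
             (process! v)
             (recur)))))]
  ;; each <!! returns only after that worker's loop has exited
  (doseq [ch worker-chans]
    (async/<!! ch)))
```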
honestly I'm sick and not totally with it today and should stop trying to help people with code, but yeah, async is intrinsically hard
I'm just realizing I'm not thinking clearly