2024-04-28
Hi there. What is the recommended approach to dispatching multiple effect handlers asynchronously? In https://github.com/migalmoreno/tubo I oftentimes need to dispatch the same effect handler multiple times for a given collection, such as in https://github.com/migalmoreno/tubo/blob/master/src/frontend/tubo/events.cljs#L481, where I need to re-fetch the audio stream for each of these items. More recently, I've been working on an import/export feature for playlists and I've run into the same issue. When exporting, I save a list of stream URLs for each playlist, and when I import them I need to reconstruct the actual videos that belong to each playlist, which I'm currently doing this way:
(rf/reg-event-fx
 ::load-bookmark-list-stream
 (fn [{:keys [db]} [_ bookmark idx res]]
   (let [stream-res (js->clj res :keywordize-keys true)]
     {:fx [[:dispatch (if (= idx 0)
                        [::add-to-likes stream-res]
                        [::add-to-bookmark-list bookmark stream-res])]]})))

(rf/reg-event-fx
 ::fetch-bookmark-list-stream
 (fn [_ [_ uri bookmark idx]]
   (api/get-request (str "/streams/" (js/encodeURIComponent uri))
                    [::load-bookmark-list-stream bookmark idx]
                    [::bad-response])))

(rf/reg-event-fx
 ::add-imported-bookmark-list
 (fn [{:keys [db]} [_ index bookmark]]
   (let [new-bookmark {:name (:name bookmark)
                       :id (nano-id)}]
     {:fx (into []
                (apply merge
                       (if (not= index 0)
                         [[:dispatch [::add-bookmark-list new-bookmark]]]
                         [])
                       (map #(identity [:dispatch [::fetch-bookmark-list-stream % new-bookmark index]])
                            (:videos bookmark))))})))

(rf/reg-event-fx
 ::add-imported-bookmark-lists
 (fn [{:keys [db]} [_ bookmarks]]
   {:fx (map-indexed #(identity [:dispatch [::add-imported-bookmark-list %1 %2]]) bookmarks)}))
The problem is that I don't have any control over the order in which the videos are fetched in fetch-bookmark-list-stream, so the playlists end up with their items in the wrong order. So in short, is it recommended to do this sort of dispatch of multiple effect handlers, one for each element in a collection? And how can I coordinate their execution and make sure they are done asynchronously?
You don't need the ::fetch-bookmark-list-stream event at all if you control the api/get-request function and/or what it uses. You can just use it, or a somewhat modified version of it, to populate the vector you feed to :fx, since :fx accepts any effects, not just :dispatch.
:fx guarantees the order in which effects are executed. But it can't possibly guarantee the order of anything those effects themselves do asynchronously (such as network requests).
You already feed idx to the event that deals with the results of those stream requests - just use it to insert data at the right index instead of adding it to the end of the list.
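For illustration, a minimal sketch of that idea (not code from the thread; the db layout and names are assumed): each fetch is dispatched with the stream's own position, and the result handler writes into that slot instead of appending.
(rf/reg-event-db
 ::load-bookmark-list-stream
 (fn [db [_ bookmark stream-idx res]]
   (let [stream-res (js->clj res :keywordize-keys true)]
     ;; Writing at stream-idx preserves the original playlist order,
     ;; no matter which request finishes first. Assumes the :streams
     ;; vector was pre-filled with placeholders (e.g. nils).
     (assoc-in db [:bookmarks (:id bookmark) :streams stream-idx] stream-res))))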
A couple of unrelated notes:
• merge on vectors is confusing - conj can be used instead. In your case, apply merge can be replaced with into (see the sketch after these notes).
• #(identity ...) is also confusing - (fn [x] ...) is both shorter and more legible.
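Applied to the ::add-imported-bookmark-list handler above, the :fx construction could look roughly like this (a sketch of the suggested rewrite, not a tested drop-in):
{:fx (into (if (not= index 0)
             [[:dispatch [::add-bookmark-list new-bookmark]]]
             [])
           ;; (fn [video] ...) instead of #(identity ...), into instead of apply merge
           (map (fn [video]
                  [:dispatch [::fetch-bookmark-list-stream video new-bookmark index]])
                (:videos bookmark)))}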
Thanks. api/get-request is just a wrapper around :http-xhrio. I usually keep these fetching functions separate from the rest of the logic in case I want to reuse them later, but you're right that in this case it's unnecessary.
Yeah, the idx is used for the playlists, not the streams in them, so that items in the first playlist (Liked Streams) always get merged into the first playlist, and subsequent streams get added to new playlists. But I see your point about adding an index for each stream, and I'm going to give it a shot.
I guess more generally I wanted to better tell the difference between this enqueueing of fx and using promises like this:
async addImportedPlaylist(playlist) {
  let newPlaylist = await createPlaylist(playlist.id);
  let videoIds = playlist.videos.map(video => video.url);
  await addVideosToPlaylist(newPlaylist.playlistId, videoIds);
}
Which, as far as I know, will do everything asynchronously, whereas the re-frame approach doesn't. I also wanted to add a notification once all playlists have been imported and the items in them have been re-fetched; how would I go about signaling this?
The only thing that comes to mind for this kind of task is https://github.com/day8/re-frame-async-flow-fx, but in my case I'm using the same effect handler many times instead of different ones.
> Which as far as I know will do everything asynchronously, whereas the re-frame approach doesn't.
It will also do everything asynchronously.
Re-frame's queue is async in the sense that it gets run only when it's allowed to.
And it can't get around async JS APIs.
> I also wanted to add a notification once all playlists have been imported and the items in them have been refetched, how would I go about signaling this?
Two reasonable options:
• Have an explicit or an implicit counter. Explicit - something like :tasks-remaining 7. Implicit - something like :results [...] where the actual counter is (count results).
• If you don't care about whether re-frame can see the internals of the stream-fetching process (e.g. for re-frame-10x or for some interceptors), you can combine all the logic into a single function that's independent from re-frame. From re-frame's perspective, it will be like calling js/fetch, maybe with some progress reporting like :on-step (fn ...).
Don't use re-frame-async-flow-fx, it's not for these kinds of workflows.
Gotcha, thanks for the clarification. With the first approach I understand I would have a counter for the number of streams left to fetch, and that I would update it in the on-success callback of each, but where would I check for tasks-remaining? The second approach is simpler, if I understand correctly: I would basically delegate all the fetching logic to an async function (e.g. getImportedPlaylists) that computes all the streams for each playlist using standard JS async utils (like the example I showed above) and returns the playlists when the Promise resolves. I would then call this function in a re-frame effect handler and simply dispatch as many ::add-bookmark-list effect handlers as there are playlists in the Promise response.
> but where would I check for tasks-remaining?
Depends on why you need that value.
Probably in one of the events that also changes that value. If the value after the change is 0, some other event is triggered.
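A rough sketch of that explicit-counter pattern (every name here is invented for illustration): the event that records a fetched stream also decrements the counter, and the decrement that reaches zero triggers the notification.
(rf/reg-event-fx
 ::stream-imported
 (fn [{:keys [db]} [_ bookmark stream-idx stream-res]]
   (let [remaining (dec (:streams-remaining db))]
     (cond-> {:db (-> db
                      (assoc :streams-remaining remaining)
                      (assoc-in [:bookmarks (:id bookmark) :streams stream-idx] stream-res))}
       ;; the last result to arrive signals completion
       (zero? remaining) (assoc :fx [[:dispatch [::notify-import-finished]]])))))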
Wrt the second approach, wouldn't calling an async function that retrieves the playlist info (the one analogous to js/fetch) in an effect handler violate its purity?
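For reference, one common way to keep the handlers pure is to register the impure call as its own effect with rf/reg-fx, so the event handler only returns a description of the work. A minimal sketch (getImportedPlaylists and all the keywords here are hypothetical):
(rf/reg-fx
 :import-playlists
 (fn [{:keys [playlists on-success on-failure]}]
   ;; The side effect (network calls, promise handling) lives here, outside the event handler.
   (-> (getImportedPlaylists (clj->js playlists))
       (.then #(rf/dispatch (conj on-success (js->clj % :keywordize-keys true))))
       (.catch #(rf/dispatch (conj on-failure %))))))

(rf/reg-event-fx
 ::import-bookmark-lists
 (fn [_ [_ playlists]]
   ;; Pure: only describes the effect, which the :import-playlists handler performs.
   {:import-playlists {:playlists playlists
                       :on-success [::add-imported-bookmark-lists]
                       :on-failure [::bad-response]}}))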
Just to let you know, I ended up going with the second approach and using https://github.com/smogg/re-promise