#re-frame
2022-02-17
tobias 09:02:22

In my re-frame app I'm querying a 3rd-party API that has quota limits of 10 queries per second and 10 concurrent requests. Any advice on a simple way to respect those limits when I have a batch of 50-100 requests to make? So far I've made each request its own event and spaced out the requests with re-frame's built-in dispatch-later effect (https://day8.github.io/re-frame/api-builtin-effects/#dispatch-later). That succeeds in respecting the queries-per-second quota, but sometimes I still exceed the concurrent-requests quota. I'm thinking that what I could do is that in the event handler for each request I can increment a concurrency counter in the app db, and then each time a request completes I can decrement that concurrency counter. If the concurrency counter is over quota then in the event handler instead of making the request I can re-dispatch the request event with a random delay. Is that a sensible approach or is there a better/simpler way that I'm overlooking?
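The counter approach described above can be sketched as a pair of re-frame handlers. This is a minimal, hedged sketch: the event names, the `::in-flight` key, and the use of the `:http-xhrio` effect (from the re-frame-http-fx library) are assumptions, not part of the original question.

```clojure
(ns example.events
  (:require [re-frame.core :as rf]))

(def max-concurrent 10)

(rf/reg-event-fx
 :api/request
 (fn [{:keys [db]} [_ req]]
   (if (< (get db ::in-flight 0) max-concurrent)
     ;; Under quota: count the request and fire it.
     {:db (update db ::in-flight (fnil inc 0))
      :http-xhrio (assoc req
                         :on-success [:api/response :success]
                         :on-failure [:api/response :failure])}
     ;; Over quota: re-dispatch with a random 50-250 ms delay.
     {:dispatch-later [{:ms (+ 50 (rand-int 200))
                        :dispatch [:api/request req]}]})))

(rf/reg-event-db
 :api/response
 (fn [db [_ _outcome _result]]
   ;; A request finished (success or failure); free a concurrency slot.
   (update db ::in-flight dec)))
```

One caveat with this design: the counter lives in app-db, so every retry dispatch churns the event loop, which is part of what motivates the wrapper-effect suggestion below.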

p-himik 09:02:30

I think that implementing a wrapper around the effect that you use for the requests would be a better solution. The wrapper would have its own state and could be flexible to let you adapt any rate limiting strategies without having to incorporate them into your app's state. It also sounds quite reusable, so perhaps there is such an effect out there somewhere. Or maybe you can open-source your own if you decide to go with that approach. ;)
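A wrapper effect along these lines might look like the following sketch. Everything here is hypothetical (the `:throttled-request` effect name, the shape of the request map, and the `:start!` callback convention); the point is that the limiter's state lives in its own atom, not in app-db.

```clojure
(ns example.throttled-fx
  (:require [re-frame.core :as rf]))

;; Private limiter state; nothing rate-limit-related touches app-db.
(defonce limiter (atom {:in-flight 0 :pending #queue []}))

(def max-concurrent 10)

(defn- pump!
  "Start queued requests while we are under the concurrency quota."
  []
  (let [{:keys [in-flight pending]} @limiter]
    (when (and (seq pending) (< in-flight max-concurrent))
      (let [{:keys [start!]} (peek pending)]
        (swap! limiter #(-> %
                            (update :pending pop)
                            (update :in-flight inc)))
        ;; start! performs the actual HTTP call and must invoke the
        ;; supplied callback exactly once, on success or failure.
        (start! (fn on-done []
                  (swap! limiter update :in-flight dec)
                  (pump!)))
        ;; Keep draining in case more slots are free.
        (pump!)))))

(rf/reg-fx
 :throttled-request
 (fn [request]
   (swap! limiter update :pending conj request)
   (pump!)))
```

Swapping in a different strategy (e.g. a token bucket for the queries-per-second quota) only means changing `pump!`, which is the flexibility being described.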

isak 18:02:07

Store three core.async channels in state:
• One unbounded channel that is basically a queue of requests to make
• One bounded to the number of concurrent requests (10)
• One that you put results on (also unbounded)
Then have a loop that takes from the queue, first puts a nonsense value like :foo on the bounded channel, makes the request, then takes from the bounded channel, then puts the result on the results channel. This works as long as you do the parking puts correctly in a go block.
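The three-channel pattern above can be sketched like this. Note that core.async channels always need a fixed buffer size, so "unbounded" here means "large enough in practice"; `do-request!` is a hypothetical stand-in for the real HTTP call.

```clojure
(ns example.limiter
  (:require [cljs.core.async :as a :refer [chan <! >! go go-loop]]))

(def requests (chan 1000))  ; queue of requests to make
(def tokens   (chan 10))    ; holds one :token per in-flight request
(def results  (chan 1000))  ; completed responses

(defn do-request!
  "Hypothetical stand-in for the real HTTP call; invokes cb
  asynchronously with a response."
  [req cb]
  (js/setTimeout #(cb {:req req :status 200}) 100))

(go-loop []
  (let [req (<! requests)]
    (>! tokens :token)            ; parks once 10 requests are in flight
    (do-request! req
                 (fn [response]
                   (go
                     (<! tokens)  ; free a concurrency slot
                     (>! results response))))
    (recur)))
```

The bounded `tokens` channel is what enforces the concurrency quota: the eleventh put parks until some earlier request takes its token back off.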

DrLjótsson 18:02:34

Maybe modelling as a state machine would be helpful. https://github.com/ingesolvoll/re-statecharts

tobias 03:02:41

Awesome thanks for the advice everyone. I will try implementing the rate-limiting separately from re-frame's event loop. The core.async channels approach sounds like it could work although I'm not sure how I'd manage errors.

isak 15:02:32

In our case, we put something like {:success? false :errors (:errors response)} on the output channel if there is an error. Then the loop that is processing the responses can decide what to do as the results come in (or wait for all of them).
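A hedged sketch of such a consuming loop, assuming the `results` channel from the earlier three-channel sketch, a known request count, and hypothetical re-frame event names for reporting the outcome:

```clojure
(ns example.consumer
  (:require [cljs.core.async :as a :refer [<! go-loop]]
            [re-frame.core :as rf]
            [example.limiter :refer [results]]))

(def total-requests 50)  ; example batch size from the original question

(go-loop [remaining total-requests
          failures  []]
  (let [{:keys [success? errors]} (<! results)
        failures (if success? failures (conj failures errors))]
    (if (> remaining 1)
      (recur (dec remaining) failures)
      ;; All results are in; report the batch outcome.
      (if (seq failures)
        (rf/dispatch [:batch/failed failures])
        (rf/dispatch [:batch/done])))))
```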

tobias 06:02:35

Update: I found a js library that seems like it does exactly what I need in terms of both rate and concurrency limiting https://www.npmjs.com/package/bottleneck
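For reference, bottleneck's two relevant options map directly onto the quotas from the original question: `minTime` of 100 ms spaces calls to at most 10 per second, and `maxConcurrent` caps in-flight requests. A minimal ClojureScript interop sketch (assuming a shadow-cljs-style string require; `limited-fetch` is a hypothetical helper):

```clojure
(ns example.limit
  (:require ["bottleneck" :as Bottleneck]))

(defonce limiter
  (Bottleneck. #js {:maxConcurrent 10   ; concurrent-requests quota
                    :minTime       100})) ; 100 ms gap ≈ 10 queries/second

(defn limited-fetch
  "Run a fetch through the limiter. `.schedule` returns a js/Promise
  that settles when the wrapped call does."
  [url]
  (.schedule limiter #(js/fetch url)))
```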

Sam Adams 21:02:58

Hey all. I’m working on an audio playback UI (using re-frame), and trying to implement a cursor that indicates the time position during playback. I’m trying to see if I can make its motion super-smooth by using CSS animations; basically, if the audio is 10,000ms long, I render the cursor at the 0ms mark, and then render it at the 10,000ms mark with a CSS transition property of 10,000ms linear.

Now, my audio playback app allows the user to continuously loop the audio. When the loop event occurs, I need to rinse and repeat — render the cursor at 0ms (without CSS transition), and then at 10,000ms (with CSS transition) in immediate succession. I do this by basically quickly changing the :player-pos-ms key of my app state.

My problem: my state toggles very quickly from {:player-pos-ms 10000} to {:player-pos-ms 0} back to {:player-pos-ms 10000}, such that my subscription seems to sometimes elide the middle state — my view is (sometimes) never updated to render the cursor at 0.

My question: do subscriptions necessarily “see” and render every discrete state? If not — is there a way I can force a particular state to be rendered and not skipped?
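For context on why the middle state gets skipped: reagent batches renders on animation frames, so re-frame subscriptions do not guarantee one render per app-db value — two dispatches within the same frame can collapse into a single render. A hedged sketch of one workaround, using `dispatch-sync` plus reagent's `flush` to render the intermediate state before queueing the next one (event names are hypothetical, and guaranteeing the browser actually paints in between may additionally require a double `requestAnimationFrame`):

```clojure
(ns example.player
  (:require [re-frame.core :as rf]
            [reagent.core :as r]))

(defn restart-loop!
  "Snap the cursor back to 0 without a transition, force that state to
  render, then start the next 10,000 ms linear transition."
  [duration-ms]
  (rf/dispatch-sync [:player/set-pos {:ms 0 :transition? false}])
  (r/flush)  ; render dirty components now, before the next dispatch
  (rf/dispatch [:player/set-pos {:ms duration-ms :transition? true}]))
```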

Sam Adams 22:02:06

A-ha! Thanks!
