
Hello everyone, I’m sending some PDFs via AJAX, using :http-xhrio:

 (fn [{db :db} [_ type event]]
   (let [file (first (array-seq (.. event -target -files)))]
     {:db         (assoc-in db [:files type :loading?] true)
      :http-xhrio {:method          :post
                   :uri             "/api/document"
                   :timeout         8000
                   :response-format (ajax/json-response-format {:keywords? true})
                   :body            (form-body {:user-id 4
                                                :type    type
                                                :file    file})
                   :on-success      [::success type]
                   :on-failure      [::fail type]}})))
It works, but how can I synchronize this effect’s timeout with the back end? If I set the timeout to 2000 and I send a 13 MB file, the effect (as expected) dispatches ::fail, but if I check my DB the file is stored correctly. Is there a value defined in Ring that I can use as the timeout?


The question sounds like it belongs in #clojure or even #ring, not in #re-frame.


Uhh, okay. Sorry! I was wondering if cljs-ajax has some mechanism to calculate it.


Well, you can abort requests from the front-end side, but:
• Due to the async nature of things, it might very well be that the request has actually completed right at the time when you cancel it, so it's not a guarantee
• The :http-xhrio effect doesn't have any way to cancel requests, so you'd have to figure out whether the underlying cljs-ajax library supports that, and if so, implement a new effect that supports cancellation
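For what it's worth, here's a rough sketch of what such a custom effect could look like. It assumes cljs-ajax's request functions return an object that `ajax.protocols/-abort` can cancel (the library does expose that protocol); the effect names, the atom-based registry, and the option keys shown are all hypothetical, not part of re-frame or http-fx:

```clojure
;; Hypothetical sketch of a cancellable POST effect — NOT a real http-fx API.
;; Assumes cljs-ajax, whose request fns return an object that
;; ajax.protocols/-abort can cancel.
(ns app.effects
  (:require [re-frame.core :as rf]
            [ajax.core :as ajax]
            [ajax.protocols :as pr]))

;; Registry of in-flight requests: request-id -> request object
(defonce in-flight (atom {}))

(rf/reg-fx
 ::cancellable-post
 (fn [{:keys [id uri body on-success on-failure]}]
   (let [req (ajax/POST uri
               {:body            body
                :response-format (ajax/json-response-format {:keywords? true})
                :handler         #(do (swap! in-flight dissoc id)
                                      (rf/dispatch (conj on-success %)))
                :error-handler   #(do (swap! in-flight dissoc id)
                                      (rf/dispatch (conj on-failure %)))})]
     ;; Remember the request object so a later ::abort can find it
     (swap! in-flight assoc id req))))

(rf/reg-fx
 ::abort
 (fn [id]
   (when-let [req (get @in-flight id)]
     (pr/-abort req)
     (swap! in-flight dissoc id))))
```

As noted above, aborting is racy: the server may already have finished processing by the time the abort lands, so the client sees a failure while the upload actually succeeded.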


Oops, but I don’t want to abort a request. I just want to make the timeout generous enough that I don’t show my user an error message while their file is still being sent and then ends up uploading successfully.


Is there any convention in this regard (e.g. a timeout proportional to the size of the file)? :thinking_face:


Then I'm afraid I don't understand what the desired outcome is, at all. Do you want to have a timeout or not? If not, then why not just skip :timeout?


I was looking for a :timeout value that would be generous enough to not cause ::fail to be dispatched erroneously for larger files


I will skip :timeout to avoid this kind of error. Thanks a lot!


Yeah, you can't magically set the :timeout value to something that would let large files go through but still fail on requests that are genuinely hung. No problem!
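So the resolution is just the original handler with the :timeout key removed. As far as I know, the underlying goog.net.XhrIo treats a missing/zero timeout as "no client-side timeout", so the request is only bounded by the browser and server:

```clojure
;; The original effect, minus :timeout — the upload is never aborted
;; client-side, so ::fail only fires on a genuine request failure.
(fn [{db :db} [_ type event]]
  (let [file (first (array-seq (.. event -target -files)))]
    {:db         (assoc-in db [:files type :loading?] true)
     :http-xhrio {:method          :post
                  :uri             "/api/document"
                  :response-format (ajax/json-response-format {:keywords? true})
                  :body            (form-body {:user-id 4
                                               :type    type
                                               :file    file})
                  :on-success      [::success type]
                  :on-failure      [::fail type]}}))
```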