
What’s a good way to visualize interceptors in general? I’m looking for a common analogy that even a non-tech person may understand.


@decim, perhaps this document will clear things up?


Hi, is there any example of using an InputStream as the response body? I want to support streaming from it, and I'm using a PipedOutputStream connected to a PipedInputStream, but it sends everything only after all the processing is done. I'm using a future to handle the processing. Any ideas?


{:body (io/input-stream (.getBytes "foo")) :status 200} works.


Yes, but I want to write to the client while processing, not only when it ends, so the client doesn't have to wait for all the processing.


@ddeaguiar yeah, the guide is fine. I was just trying to think of other ways to describe it so even someone that's not a programmer can understand, for my notes.


@pfeodrippe can you elaborate on your use case?


@ddeaguiar I'll write a simplified code

;; assumes java.io.PipedInputStream and java.io.PipedOutputStream are imported
(def streamed-interceptor
  {:name ::interceptor
   :enter
   (fn [context]
     (let [pout (PipedOutputStream.)
           pin  (PipedInputStream. pout)]
       ;; write on a separate thread so we can return the response immediately
       (future
         (with-open [pout pout]
           (doseq [_ (range 10)]
             (Thread/sleep 1000)
             (.write pout
                     (.getBytes "Eita\n")
                     0 (count "Eita\n"))
             (.flush pout))))
       (assoc context
              :response {:status 200
                         :body pin})))})


@pfeodrippe to clarify, why do you need an input stream?


Sorry, just saw it now. I have a function where I'm making requests to other endpoints, and I want to notify the client right after the processing of 1 endpoint (of N).


So the client doesn't have to wait until the end of the request to get all the responses.


If it's not clear, I can try to explain again 😃


Based on this code, I'd like to see a new line with "Eita" each second


I've tested with a lot of data and it was streaming every 32768 bytes... I'm now looking for a way to reduce the buffer size for the streaming response. Any idea @ddeaguiar?


The docs say you can use a core.async channel for your response body to get the effect you want:


I have never used this mechanism though.


Yes, I have seen this, but we prefer more controlled threads instead of go blocks.


You don’t have to use go blocks to return a channel, of course; you can use the double-bang fns and your own threads.
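A minimal sketch of that approach, assuming Pedestal accepts a core.async channel as the response `:body` (as the docs describe) — the interceptor name and the use of `async/thread` here are illustrative, not from the original conversation:

```clojure
(require '[clojure.core.async :as async])

(def channel-interceptor
  {:name ::channel-interceptor
   :enter
   (fn [context]
     (let [ch (async/chan)]
       ;; a real thread (not a go block) feeds the channel
       (async/thread
         (doseq [_ (range 10)]
           (Thread/sleep 1000)
           (async/>!! ch "Eita\n"))   ; blocking put is fine on a real thread
         (async/close! ch))           ; closing the channel ends the response
       (assoc context
              :response {:status 200
                         :body ch})))})
```

Each value put on the channel should be flushed to the client as it arrives, which avoids the PipedInputStream buffering issue above.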

Braden Shepherdson 16:04:26

I've seen one metaphor: passing through a succession of gates or checkpoints, maybe for a flight.

Braden Shepherdson 16:04:36

some of the tasks are one-sided, others double-sided.

Braden Shepherdson 16:04:00

e.g. you check in (`:enter` only), drop your bag (`:enter` of a double-sided one), go through security (`:enter` only), fly (the real handler), clear customs (`:leave` only), get your bag (`:leave` of the bag-handling interceptor), and done.

Braden Shepherdson 16:04:43

actually, an even better metaphor there: the check-in process gives you a token (your boarding pass) that you show to several successive checkpoints (bag drop, security, at the gate)