#pedestal
2019-04-23
v3ga 02:04:10

What’s a good way to visualize interceptors in general? I’m looking for a common analogy that even a non-tech person may understand.

ddeaguiar 13:04:55

@decim, perhaps this document will clear things up? http://pedestal.io/reference/interceptors

pfeodrippe 13:04:48

Hi, is there any example of using an InputStream as the response body? I want to support streaming from it. I'm using a PipedOutputStream connected to a PipedInputStream, but it only sends everything after all the processing is done. I'm using a future to handle the processing. Any ideas?

souenzzo 14:04:39

{:body (io/input-stream (.getBytes "foo")) :status 200} works.

pfeodrippe 14:04:58

Yes, but I want to write to the client while processing, not only when it ends, so the client doesn't have to wait for all the processing to finish.

v3ga 14:04:33

@ddeaguiar yeah, the guide is fine. I was just trying to think of other ways to describe it so even someone that's not a programmer can understand, for my notes.

ddeaguiar 15:04:15

@pfeodrippe can you elaborate on your use case?

pfeodrippe 16:04:27

@ddeaguiar I'll write some simplified code

(ns example.piped
  (:require [io.pedestal.interceptor :refer [interceptor]])
  (:import (java.io PipedInputStream PipedOutputStream)))

(def streamed-interceptor
  (interceptor
   {:name ::interceptor
    :enter
    (fn [context]
      (let [pout (PipedOutputStream.)
            pin  (PipedInputStream. pout)]
        ;; Write on a separate thread; with-open closes the pipe when done,
        ;; which signals end-of-stream to the reader side.
        (future
          (with-open [pout pout]
            (doseq [_ (range 10)]
              (Thread/sleep 1000)
              (.write pout
                      (.getBytes "Eita\n")
                      0 (count "Eita\n"))
              (.flush pout))))
        (assoc context
               :response {:status 200
                          :body pin})))}))

ddeaguiar 17:04:00

@pfeodrippe to clarify, why do you need an input stream?

pfeodrippe 19:04:41

Sorry, just saw this now. I have a function that makes requests to other endpoints, and I want to notify the client right after the processing of 1 endpoint (of N)

pfeodrippe 19:04:16

So the client does not have to wait until the end of the request to get all the responses

pfeodrippe 19:04:36

If it's not clear, I can try to explain again 😃

pfeodrippe 16:04:56

Based on this code, I'd like to see a new line with "Eita" every second

pfeodrippe 16:04:44

I've tested with a lot of data and it was streaming every 32768 bytes... I'm now looking for a way to reduce this buffer size for the streaming response. Any ideas @ddeaguiar?

donaldball 17:04:52

The docs say you can use a core.async channel for your response body to get the effect you want: http://pedestal.io/reference/streaming#_using_a_core_async_channel_as_the_response

donaldball 17:04:14

I have never used this mechanism though.

pfeodrippe 20:04:35

Yes, I have seen this, but we prefer more controlled threads instead of go blocks

donaldball 02:04:16

You don’t have to use go blocks to return a channel, of course; you can use the double-bang fns (`>!!`, `<!!`) and your own threads.
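A minimal sketch of that approach, assuming Pedestal's support for a core.async channel as the response body (per the streaming reference above); the interceptor name and string are illustrative:

```clojure
;; Sketch only: a real thread plus blocking puts (>!!), no go blocks.
(ns example.stream
  (:require [clojure.core.async :as async]
            [io.pedestal.interceptor :refer [interceptor]]))

(def channel-streaming-interceptor
  (interceptor
   {:name ::channel-streaming
    :enter
    (fn [context]
      (let [ch (async/chan)]
        ;; async/thread runs the body on its own thread; >!! blocks
        ;; that thread (not a go-block park) until the server consumes.
        (async/thread
          (doseq [_ (range 10)]
            (Thread/sleep 1000)
            (async/>!! ch "Eita\n"))
          ;; Closing the channel ends the streamed response.
          (async/close! ch))
        (assoc context :response {:status 200 :body ch})))}))
```

Each put should be flushed to the client as it arrives, rather than buffered until the pipe's writer side finishes.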

Braden Shepherdson 16:04:26

I've seen one metaphor: passing through a succession of gates or checkpoints, maybe for a flight.

Braden Shepherdson 16:04:36

some of the tasks are one-sided, others double-sided.

Braden Shepherdson 16:04:00

eg. you check-in (`:enter` only), drop your bag (`:enter` of a double-sided one), go through security (`:enter` only), fly (the real handler), clear customs (`:leave` only), get your bag (`:leave` of the bag-handling interceptor) and done.

Braden Shepherdson 16:04:43

actually, an even better metaphor there: the check-in process gives you a token (your boarding pass) that you show at several successive checkpoints (bag drop, security, at the gate)
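The airport metaphor maps fairly directly onto the interceptor model. A toy sketch, with all names invented for illustration and a hand-rolled executor standing in for Pedestal's real chain machinery:

```clojure
;; Toy illustration of the airport metaphor -- not a runnable Pedestal app.
;; Each map is an interceptor: :enter fns run in order, :leave fns in reverse.
(def check-in    {:name :check-in    :enter (fn [ctx] (assoc ctx :boarding-pass "A1"))})
(def bag-handler {:name :bag-handler
                  :enter (fn [ctx] (assoc ctx :bag :dropped))     ; drop your bag
                  :leave (fn [ctx] (assoc ctx :bag :collected))}) ; get it back
(def security    {:name :security    :enter (fn [ctx] (assoc ctx :screened? true))})
(def flight      {:name :flight      :enter (fn [ctx] (assoc ctx :flew? true))}) ; the "handler"
(def customs     {:name :customs     :leave (fn [ctx] (assoc ctx :customs :cleared))})

;; Minimal executor: thread the context through :enter fns left-to-right,
;; then through :leave fns right-to-left.
(defn run-chain [interceptors ctx]
  (let [entered (reduce (fn [c {:keys [enter]}] (if enter (enter c) c))
                        ctx interceptors)]
    (reduce (fn [c {:keys [leave]}] (if leave (leave c) c))
            entered (reverse interceptors))))

(run-chain [check-in bag-handler security flight customs] {})
;; => {:boarding-pass "A1", :bag :collected, :screened? true,
;;     :flew? true, :customs :cleared}
```

The boarding-pass-as-token twist corresponds to an early interceptor assoc'ing something into the context that later interceptors read back out.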