This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2019-04-23
Channels
- # announcements (2)
- # beginners (82)
- # calva (13)
- # cider (12)
- # clara (4)
- # cljdoc (22)
- # clojure (89)
- # clojure-dev (23)
- # clojure-europe (16)
- # clojure-italy (39)
- # clojure-nl (8)
- # clojure-spec (28)
- # clojure-uk (36)
- # clojurescript (40)
- # cursive (10)
- # data-science (1)
- # datomic (27)
- # devcards (4)
- # emacs (1)
- # fulcro (25)
- # jobs (1)
- # jobs-discuss (3)
- # kaocha (5)
- # luminus (1)
- # nrepl (68)
- # off-topic (64)
- # pedestal (23)
- # planck (1)
- # quil (4)
- # re-frame (6)
- # reitit (5)
- # remote-jobs (4)
- # shadow-cljs (16)
- # spacemacs (11)
- # testing (1)
What’s a good way to visualize interceptors in general? I’m looking for a common analogy that even a non-tech person may understand.
@decim, perhaps this document will clear things up? http://pedestal.io/reference/interceptors
Hi, is there any example of using an InputStream as the response body? I want to support streaming from it, and I'm using a PipedOutputStream connected to a PipedInputStream, but it only sends everything after all the processing is done. I'm using a future to handle the processing. Any ideas?
Yes, but I want to write to the client while processing, not only when it ends, so the client doesn't have to wait for all the processing
@ddeaguiar yeah the guide is fine, I was just trying to think of other ways to describe it so even someone that's not a programmer can understand, for my notes
@pfeodrippe can you elaborate on your use case?
@ddeaguiar I'll write some simplified code
;; assumes (require '[io.pedestal.interceptor :refer [interceptor]])
;; and (import '[java.io PipedOutputStream PipedInputStream])
(def streamed-interceptor
  (interceptor
   {:name ::interceptor
    :enter
    (fn [context]
      (let [pout (PipedOutputStream.)
            pin  (PipedInputStream. pout)]
        ;; write "Eita\n" once a second on another thread,
        ;; closing the output stream when done
        (future
          (with-open [pout pout]
            (doseq [_ (range 10)]
              (Thread/sleep 1000)
              (.write pout (.getBytes "Eita\n") 0 (count "Eita\n"))
              (.flush pout))))
        (assoc context
               :response {:status 200
                          :body pin})))}))
@pfeodrippe to clarify, why do you need an input stream?
Sorry, just saw this now. I have a function where I'm making requests to other endpoints, and I want to notify the client right after the processing of one endpoint (of N)
So the client does not have to wait until the end of the request to get all the responses
If it's not clear, I can try to explain again 😃
Based on this code, I'd like to see a new line with "Eita" every second
I've tested with a lot of data and it was streaming every 32768 bytes... I'm now looking for a way to reduce the buffer size of the streaming response. Any ideas @ddeaguiar?
The docs say you can use a core.async channel for your response body to get the effect you want: http://pedestal.io/reference/streaming#_using_a_core_async_channel_as_the_response
I have never used this mechanism though.
Yes, I've seen this, but we prefer more controlled threads over go blocks
You don’t have to use go blocks to return a channel, of course; you can use the double-bang fns and your own threads.
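A minimal sketch of what that could look like, based on the channel-as-response-body approach in the linked Pedestal streaming docs: the blocking (double-bang) ops run on a dedicated thread via `async/thread`, so no go blocks are involved. The interceptor name and the ten-message loop are illustrative, not from the docs.

```clojure
;; assumes (require '[io.pedestal.interceptor :refer [interceptor]]
;;                  '[clojure.core.async :as async])
(def channel-interceptor
  (interceptor
   {:name ::channel-interceptor
    :enter
    (fn [context]
      (let [ch (async/chan 10)]
        ;; async/thread runs the body on its own thread, so the
        ;; blocking >!! and Thread/sleep are safe here
        (async/thread
          (doseq [_ (range 10)]
            (Thread/sleep 1000)
            (async/>!! ch "Eita\n"))
          ;; closing the channel signals the end of the response body
          (async/close! ch))
        (assoc context
               :response {:status 200
                          :body ch})))}))
```

Each string put on the channel should reach the client as it is written, rather than being buffered until the end.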
Great o/
I've seen one metaphor: passing through a succession of gates or checkpoints, maybe for a flight.
some of the tasks are one-sided, others double-sided.
e.g. you check in (`:enter` only), drop your bag (`:enter` of a double-sided one), go through security (`:enter` only), fly (the real handler), clear customs (`:leave` only), get your bag (`:leave` of the bag-handling interceptor) and done.
actually, an even better metaphor there: the check-in process gives you a token (your boarding pass) that you show to several successive checkpoints (bag drop, security, at the gate)