This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2020-08-14
I’d love to see any feedback on making this more Clojure-ish: I want to read data from a file that may or may not have a weird, useless compression algorithm applied to it. The algorithm just compresses runs of repeating characters, and it works like this: 0xFA signals that the next three bytes are special. The first two bytes get converted to a string, which is the number of times (in hex) to repeat the character, and the third byte is the character to be repeated. So if I’m reading a record that’s supposed to have 176 bytes, I want to read it in, expanding any compressed sequences, up to 176 bytes, and return a collection. Here’s what I came up with:
(require '[clojure.string :as st]) ;; st alias assumed from context

(declare read-bytes)

(defn inflate-compressed
  [is nbr bvec]
  (let [code (repeatedly 3 #(.read is))
        len  (Integer/parseInt (->> code
                                    butlast
                                    (map char)
                                    st/join)
                               16)
        chr  (last code)
        exp  (vec (concat bvec (repeat len chr)))
        remain (- nbr len)]
    (fn [] (read-bytes is (if (neg? remain) 0 remain) exp))))
(defn compressed?
  [b]
  (= 250 b))

(defn read-bytes
  ([is nbr bvec]
   (if (zero? nbr) bvec
     (let [b (.read is)]
       (cond
         (= -1 b)        (when-not (empty? bvec) bvec)
         (compressed? b) (fn [] (inflate-compressed is nbr bvec))
         :else           (fn [] (read-bytes is (dec nbr) (conj bvec b)))))))
  ([is nbr]
   (trampoline read-bytes is nbr [])))
I’m using io/input-stream because even though most of the data is text, the 0xFA character (250) screws things up if I use io/reader. I don’t know enough about Java to know why, but it changes the character to 65533 (0xFFFD, the Unicode replacement character) when I try to read it. I feel like there are more conditionals than I’d like, but I’m not sure where to start getting rid of them, or if I’ve overthought this whole thing and there’s a simpler way to do it.

In inflate-compressed, I'd destructure the two chars instead of using butlast and last. Instead of vec over the concat, you can use into. And (if (neg? remain) 0 remain) becomes (max remain 0), which can be done in the let itself.
And in read-bytes, I'd write the arities in increasing order of number of args, and replace the cond with case. is can be type-hinted to avoid reflection. Finally, in both inflate-compressed and read-bytes, you could use reader syntax (#(...)) for the returned anonymous fns.
Applying all of the above, we get this:
(declare read-bytes)

(defn inflate-compressed [^java.io.InputStream is nbr bvec]
  (let [[n1 n2 chr] (repeatedly 3 #(.read is))
        len (Integer/parseInt (str (char n1) (char n2)) 16)
        exp (into bvec (repeat len chr))
        remain (max 0 (- nbr len))]
    #(read-bytes is remain exp)))

(defn read-bytes
  ([is nbr] (trampoline read-bytes is nbr []))
  ([^java.io.InputStream is nbr bvec]
   (if (zero? nbr) bvec
     (let [b (.read is)]
       (case b
         -1  (not-empty bvec)
         250 #(inflate-compressed is nbr bvec)
         #(read-bytes is (dec nbr) (conj bvec b)))))))
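As a quick sanity check, the refactored version can be exercised on an in-memory stream (the defs above are repeated so the snippet stands alone; the byte values are made up for illustration — 0xFA 0x30 0x33 0x41 encodes "three copies of 0x41"):

```clojure
(import 'java.io.ByteArrayInputStream)

(declare read-bytes)

(defn inflate-compressed [^java.io.InputStream is nbr bvec]
  (let [[n1 n2 chr] (repeatedly 3 #(.read is))
        len (Integer/parseInt (str (char n1) (char n2)) 16)
        exp (into bvec (repeat len chr))
        remain (max 0 (- nbr len))]
    #(read-bytes is remain exp)))

(defn read-bytes
  ([is nbr] (trampoline read-bytes is nbr []))
  ([^java.io.InputStream is nbr bvec]
   (if (zero? nbr) bvec
     (let [b (.read is)]
       (case b
         -1  (not-empty bvec)
         250 #(inflate-compressed is nbr bvec)
         #(read-bytes is (dec nbr) (conj bvec b)))))))

;; 0x42, then 0xFA 0x30 0x33 0x41 ("03" hex = 3 copies of 0x41), then 0x43;
;; unchecked-byte is needed because 0xFA overflows a signed byte literal
(def sample (byte-array (map unchecked-byte [0x42 0xFA 0x30 0x33 0x41 0x43])))

(read-bytes (ByteArrayInputStream. sample) 5)
;; => [66 65 65 65 67]
```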
This is, of course, a surface-level refactoring of your code. read-bytes still does too much IMO: it's dealing with the InputStream, building a vector, making sure we only get nbr bytes, and doing the decompression. Of all of these things, we can separate out just the decompression algorithm and implement it as a https://clojure.org/reference/transducers:
(defn decompress [xf]
  (let [filled (volatile! -1)
        buf    (volatile! 0)]
    (fn
      ([] (xf))
      ([result] (xf result))
      ([result b]
       (let [n @filled]
         (case (int n)
           -1 (if (= b 250)
                (do (vreset! filled 0)
                    result)
                (xf result b))
           0  (do (vreset! filled 1)
                  (vreset! buf (* 16 (Character/digit (char b) 16)))
                  result)
           1  (do (vreset! filled 2)
                  (vreset! buf (+ @buf (Character/digit (char b) 16)))
                  result)
           2  (do (vreset! filled -1)
                  (reduce xf result (repeat @buf b)))))))))
Of course, this doesn't look very readable, and that's probably down to my incompetence, but it doesn't do all of that other stuff, which inherently makes it more understandable.
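To see just the transducer at work, it can be run over a plain vector of byte values with no I/O involved at all (a REPL sketch; the bytes are illustrative — 0xFA 0x30 0x33 0x41 means "three copies of 0x41"):

```clojure
;; decompress as defined above, repeated so this snippet stands alone
(defn decompress [xf]
  (let [filled (volatile! -1)
        buf    (volatile! 0)]
    (fn
      ([] (xf))
      ([result] (xf result))
      ([result b]
       (let [n @filled]
         (case (int n)
           -1 (if (= b 250)
                (do (vreset! filled 0) result)
                (xf result b))
           0  (do (vreset! filled 1)
                  (vreset! buf (* 16 (Character/digit (char b) 16)))
                  result)
           1  (do (vreset! filled 2)
                  (vreset! buf (+ @buf (Character/digit (char b) 16)))
                  result)
           2  (do (vreset! filled -1)
                  (reduce xf result (repeat @buf b)))))))))

(into [] decompress [0x42 0xFA 0x30 0x33 0x41 0x43])
;; => [66 65 65 65 67]
```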
This way, we can make take do the job of ensuring we stop once we've consumed nbr bytes. And my take on doing that is this:
(defn reducible ^clojure.lang.IReduceInit [^java.io.InputStream is]
  (reify
    clojure.lang.IReduceInit
    (reduce [_ f init]
      (loop [state init]
        (let [b (.read is)]
          (if (= b -1)
            state
            (let [result (f state b)]
              (if (reduced? result)
                @result
                (recur result)))))))))

(defn read-bytes [^java.io.InputStream is nb]
  (let [xf (comp decompress (take nb))]
    (into [] xf (reducible is))))
The reducible fn here only deals with the InputStream and returns a thing we can reduce over. We do that using into, and as that reduction happens, we transform the bytes using the composition of the decompress and (take nb) transducers.

I’m blown away by the transducer. I don’t think that would ever have occurred to me. In fact, the density of new concepts here is amazing and I’m going to have to save this to refer back to. Thanks so much for the feedback!
@jaihindhreddy After trying to implement this I have one issue: if I try to read n bytes and a compressed sequence happens to end at n, for some reason the input stream reads a byte and throws it away, but I’m not sure why. What should I be looking at to prevent that from happening? I’m confused about this, because take calls ensure-reduced when it’s done, and I thought that was supposed to stop all processing and return the result. When the decompression happens in the middle of a group of n bytes, everything works correctly, but I wonder if that’s coincidental.
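One thing worth checking here (an observation, not a definitive diagnosis): the plain reduce inside decompress's case 2 branch unwraps a reduced value coming back from the downstream (take nb) step, so the outer reduction never sees the stop signal and pulls one more byte from the stream. clojure.core/cat guards against exactly this with a preserving-reduced wrapper. A sketch of the whole pipeline with that guard added (byte values are illustrative):

```clojure
;; preserving-reduced re-wraps a reduced value so the outer reduce
;; propagates the stop signal instead of swallowing it (same trick
;; clojure.core/cat uses)
(defn- preserving-reduced [rf]
  (fn [a b]
    (let [ret (rf a b)]
      (if (reduced? ret) (reduced ret) ret))))

(defn decompress [xf]
  (let [filled (volatile! -1)
        buf    (volatile! 0)]
    (fn
      ([] (xf))
      ([result] (xf result))
      ([result b]
       (case (int @filled)
         -1 (if (= b 250)
              (do (vreset! filled 0) result)
              (xf result b))
         0  (do (vreset! filled 1)
                (vreset! buf (* 16 (Character/digit (char b) 16)))
                result)
         1  (do (vreset! filled 2)
                (vreset! buf (+ @buf (Character/digit (char b) 16)))
                result)
         2  (do (vreset! filled -1)
                (reduce (preserving-reduced xf) result (repeat @buf b))))))))

(defn reducible ^clojure.lang.IReduceInit [^java.io.InputStream is]
  (reify clojure.lang.IReduceInit
    (reduce [_ f init]
      (loop [state init]
        (let [b (.read is)]
          (if (= b -1)
            state
            (let [result (f state b)]
              (if (reduced? result)
                @result
                (recur result)))))))))

(defn read-bytes [^java.io.InputStream is nb]
  (into [] (comp decompress (take nb)) (reducible is)))

;; 0x42, then 0xFA 0x30 0x34 0x41 ("04" hex = 4 copies of 0x41): the
;; expansion ends exactly at nb = 5, and the trailing 0x43 stays unread
(def s (java.io.ByteArrayInputStream.
        (byte-array (map unchecked-byte [0x42 0xFA 0x30 0x34 0x41 0x43]))))

(read-bytes s 5)  ;; => [66 65 65 65 65]
(.read ^java.io.InputStream s)  ;; => 67, still in the stream
```

Without the preserving-reduced wrapper, the outer loop would take one more pass and the 0x43 would be consumed and discarded.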
You know what, I think this is because I’m trying to access the input stream between calls. So the “extra” byte is there in the reducer ready to go, but if I mess with the input stream, everything gets screwed up and I start losing bytes. Of course I realize this shortly after I get frustrated enough to come back and ask. 😐 I guess the takeaway here is that accessing something mutable in multiple places without synchronization is asking for trouble, which I should already know…
(when-not (empty? bvec) bvec) => (not-empty bvec)
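For reference, not-empty returns its argument unchanged unless the collection is empty, in which case it returns nil:

```clojure
(not-empty [])     ;; => nil
(not-empty [1 2])  ;; => [1 2]
(not-empty "")     ;; => nil (works on strings too)
```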
I found the if in read-bytes hard to read because bvec is on the same line as the condition -- don't do that: it looks like a single-arm if, and my first thought was "Use when!" Then I was puzzled about it returning nil... until I saw that bvec in there!
The use of trampoline here feels weird. I would try to rewrite it without that, and just use a loop instead, maybe?
Thank you! not-empty was exactly what I was looking for there. I see your point about the formatting on the if; that part began life as part of the cond, until I realized that I needed to check for zero and get out before reading a byte that I might potentially just lose. This whole thing actually started as a loop, and then when I added support for the “compression”, it looked like that needed its own function, and then it looked like two functions that call each other, so that’s why I went with the trampoline. But since there’s no way to exit inflate-compressed without returning a function, it’s true that it doesn’t quite fit. It seems to me that a trampoline properly belongs between two functions that can each return either a function or a value. Now that I’ve got it in a form that’s working, I’ll see about getting it back into a loop.
Curious about thoughts on improving something like this, or other alternatives to implementing a stateful iterator in Clojure. I know that philosophically it could be considered at odds with the language, but if you had to do it, what’s the best way?
@ndonolli As it stands, that isn't thread-safe. Also (iter!) is not valid (I think you mean (iter! 1)?)
@seancorfield Thanks, would you mind elaborating on why it isn’t thread-safe? I was playing with this code in a ClojureScript context, so I might not see that behavior
(def iter! (stateful-seq (range)))
(future (iter! 1)) (future (iter! 1)) ; could both return (take 1 @cursor) before either swap! occurs
That could give you 0 0, or 0 1, or even 1 0.
Since you want it to be atomic/thread-safe, I'd probably use split-at and keep a pair in the atom.
(defn stateful-seq [s]
  (let [s (atom [nil s])]
    (fn [n]
      (first (swap! s #(split-at (if (and (number? n) (pos? n)) n 1)
                                 (second %)))))))
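A quick REPL run of that version (repeated here so the snippet stands alone; note that an out-of-range n is silently coerced to 1):

```clojure
(defn stateful-seq [s]
  (let [s (atom [nil s])]
    (fn [n]
      (first (swap! s #(split-at (if (and (number? n) (pos? n)) n 1)
                                 (second %)))))))

(def iter! (stateful-seq (range)))
(iter! 2)   ;; => (0 1)
(iter! 3)   ;; => (2 3 4)
(iter! -1)  ;; coerced to 1 => (5)
```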
Seems to work.
thanks, that makes sense. Thank you @seancorfield!
you can also use the lesser-known swap-vals!:
(defn stateful-seq [s]
  (let [cursor (atom s)]
    (fn iter!
      ([] (iter! 1))
      ([n]
       (let [[old _] (swap-vals! cursor #(drop n %))]
         (take n old))))))
and on principle, I think it's better to crash on an invalid input than to silently coerce the invalid input into some valid one
Is this a good way to make a reducible thing from an InputStream?
(defn reducible ^clojure.lang.IReduceInit [^java.io.InputStream is]
  (reify
    clojure.lang.IReduceInit
    (reduce [_ f init]
      (loop [state init]
        (let [b (.read is)]
          (if (= b -1)
            state
            (let [result (f state b)]
              (if (reduced? result)
                @result
                (recur result)))))))))
When you have something like an InputStream that might be holding onto system resources, you want to make it easy to manage those resources. I'm not sure what making an input stream reducible buys you. Check out https://juxt.pro/blog/ontheflycollections-with-reducible#_it_gets_useful_when. In that example, the reducible wraps the resource and makes sure that the resources get released.
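Following the pattern from that post, here's a sketch of a reducible that owns the resource itself (the function name and file handling are hypothetical): with-open guarantees the stream is closed whether the reduction runs to the end or stops early via reduced.

```clojure
(require '[clojure.java.io :as io])

;; Hypothetical sketch: opens the stream per reduction and always closes it.
(defn file-bytes-reducible [file]
  (reify clojure.lang.IReduceInit
    (reduce [_ f init]
      (with-open [^java.io.InputStream is (io/input-stream file)]
        (loop [state init]
          (let [b (.read is)]
            (if (= b -1)
              state
              (let [result (f state b)]
                (if (reduced? result)
                  @result
                  (recur result))))))))))

;; usage with a throwaway temp file
(def tmp (doto (java.io.File/createTempFile "demo" ".bin") .deleteOnExit))
(with-open [os (io/output-stream tmp)]
  (.write os (byte-array [1 2 3])))

(into [] (file-bytes-reducible tmp))  ;; => [1 2 3]
```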
Thanks!
@smith.adriane Ah, yes, I think I like that better (and I agree about just letting it crash on a bad n value).
adding an assert or :pre condition wouldn't hurt
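A sketch of what that might look like on the swap-vals! version (using nat-int? so zero stays allowed, which is a design choice, not a requirement):

```clojure
(defn stateful-seq [s]
  (let [cursor (atom s)]
    (fn iter!
      ([] (iter! 1))
      ([n]
       {:pre [(nat-int? n)]}  ;; crash on negative or non-integer n
       (let [[old _] (swap-vals! cursor #(drop n %))]
         (take n old))))))

(def iter! (stateful-seq (range)))
(iter! 2)  ;; => (0 1)
;; (iter! -2) now throws AssertionError instead of returning ()
```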
and zero!
user=> (def iter! (stateful-seq (range)))
#'user/iter!
user=> (iter! -2)
()
user=> (iter! -2)
()
user=> (iter! 1)
(0)
user=> (iter! 10)
(1 2 3 4 5 6 7 8 9 10)
user=> (iter!)
(11)
user=>
(even for an infinite range it still works nicely)