
My use case was {:ref my-atom} in reagent. Today we have to create our own functions to reset! them, and it is annoying both for syntax and perf. I figured it would be convenient to use them as functions in such cases if there weren't any big downsides, but I see now why it may be a bad idea in general.


In reagent reset! is common, so yea, I may just make a new mutable container with that behavior for IFn to test it out
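A minimal sketch of such a container, in JVM Clojure (names like CallableAtom are hypothetical; in ClojureScript you'd implement cljs.core's IFn/IDeref protocols instead): a deftype that derefs like an atom and, when called as a function, reset!s it.

```clojure
;; Hypothetical sketch of an IFn-callable mutable container (JVM Clojure).
(deftype CallableAtom [state]
  clojure.lang.IDeref
  (deref [_] @state)          ; @container reads the wrapped atom
  clojure.lang.IFn
  (invoke [_ v] (reset! state v)))  ; (container v) writes it

(defn callable-atom [init]
  (CallableAtom. (atom init)))
```

Used as e.g. `(def !x (callable-atom nil))`, then `!x` can be passed directly as a `:ref` callback and read back with `@!x`.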


It sounds just like the already existing React/createRef that's intended to be used with :ref.


Oh yea, forgot about those


As far as making it stand out, @smith.adriane, I see your point, but there are other ways to do that, like naming convention. For example, I've seen some people name all their atoms like !my-atom, or similar

Ludger Solbach12:08:51

{:deps {org.clojure/clojure {:mvn/version "1.10.1"}
        ; FOP 2.4 and 2.5 depend on [ "1.1.3"] which is not OSS
        org.apache.xmlgraphics/fop {:mvn/version "2.5" :exclusions []}
        org.apache.xmlgraphics/fop-core {:mvn/version "2.5" :exclusions []}
        org.soulspace.clj/ {:mvn/version "0.8.0"}}}

Alex Miller (Clojure team)20:08:39

the trace you posted doesn't show those exclusions getting included in the path so I may be confused what you're asking about

Alex Miller (Clojure team)20:08:54

are you just saying that you need to specify the exclusions on both top level libs to avoid the exclusion? if so, then that is the expected behavior currently (there are no global excludes)
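In deps.edn terms, that means repeating the exclusion on each top-level lib that drags the transitive dep in. A sketch (the JAI coordinates here are for illustration only, since the actual excluded artifacts aren't shown in the thread):

```clojure
;; Sketch: :exclusions apply per-dependency, not globally, so each top-level
;; lib that pulls in the unwanted transitive dep needs its own entry.
;; (JAI coordinates shown for illustration only.)
{:deps {org.apache.xmlgraphics/fop      {:mvn/version "2.5"
                                         :exclusions  [com.sun.media/jai_codec
                                                       javax.media/jai_core]}
        org.apache.xmlgraphics/fop-core {:mvn/version "2.5"
                                         :exclusions  [com.sun.media/jai_codec
                                                       javax.media/jai_core]}}}
```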

Ludger Solbach12:08:41

Any idea why it's not enough to have the exclusions on fop? If I don't also put the exclusions on fop-core, the jai stuff gets included. 😞

Alex Miller (Clojure team)13:08:13

can you use -Strace and post the emitted trace.edn file here as a snippet?

Ludger Solbach14:08:03

Here is the trace.edn with the fop-core dep commented out


is there a library that can take a datomic/datascript-ish schema, and use it to flatten nested maps/"entities"?


#meander might fit the bill?


hmmm might be useful thanks


@jjttjj what about datascript? 😛


I'm starting to play with asami


(which is like DS but lacks schema)


I came across this fairly common implementation to "dechunk" a chunked seq so that it realizes only 1 item at a time. The impl is typically this:

(defn dechunk [xs]
  (lazy-seq
    (when-first [x xs]
      (cons x
            (dechunk (rest xs))))))
I found this to be interesting, so I tested it out first with:
(first (dechunk (map prn (range 10))))
which printed all 10 times - so dechunk didn't seem to help. I did notice, more obviously, that this did work:
(first (map prn (dechunk (range 10))))
So then I thought, you just cannot apply any fn that may realize the chunk before you have wrapped the seq in dechunk. However, then I noticed this:
(first (map prn (map inc (dechunk (range 10)))))
this ends up working again - only prints 1 item. Same goes with:
(first (map prn (filter even? (dechunk (range 10)))))
Any explanation?


dechunk can't change map's chunked behavior if it doesn't wrap map's input


IMHO if you need dechunk you are better off swapping in explicitly imperative logic instead of your lazy code somewhere, lazy-seqs are not good queues, but agents and channels are


pretending something imperative isn't imperative leads to bugs, and agents and channels are very nice imperative abstractions


more clarity about that first part: map detects a chunkable input and uses a performance optimization that calculates a chunk at a time, dechunk returns a new lazy-seq that isn't chunkable, and map doesn't return a chunkable either


in fact that makes me wonder why one doesn't replace dechunk with (partial map identity) (except for the design concerns already stated indicating you shouldn't be using laziness in the first place)


because map will build a chunked sequence if passed one in


so (partial map identity) is chunked seq in -> chunked seq out, it dechunks nothing
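This is easy to check directly with chunked-seq? (a sketch; dechunk is the same definition discussed above):

```clojure
;; dechunk as commonly written:
(defn dechunk [xs]
  (lazy-seq
    (when-first [x xs]
      (cons x (dechunk (rest xs))))))

(def v (vec (range 10)))   ; vector seqs are chunked

(chunked-seq? (seq v))                          ;=> true
(chunked-seq? (seq (map identity v)))           ;=> true, map preserves chunking
(chunked-seq? (seq (dechunk v)))                ;=> false
(chunked-seq? (seq (map identity (dechunk v)))) ;=> false, stays unchunked
```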


oh - I guess I misunderstood his last part then


every lazy-seq operation returns a new lazy seq that refers to the previous, so as you force it you force the previous


so dechunking a lazy seq returns a new lazy seq that is not chunked, but it can't go back and make the previous lazy seqs not chunked


so if you look at all of the examples above, when the prn is after the dechunk you get one at a time, when prn is before you get multiple


I still don’t understand why it works/doesn’t work in some cases. prn isn’t really special here I’d think.


Chunking is a property of how seqs are constructed, not how they are consumed


Dechunk consumes a sequence and returns a new sequence built from elements of that consumed sequence


Consuming a chunked sequence will realize in chunks, no matter how you consume it, because that is how it is constructed


So dechunk gives you a sequence that chunking aware operations (map, for, etc) will not treat as chunked


But if the input to dechunk is chunked, that sequence can only be realized in chunks, and the chunks pulled apart
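The contrast can be made concrete with a realization counter (a sketch; n and tap are illustrative names). With 100 elements, map on a chunked seq realizes a whole 32-element chunk for first, while dechunking before map realizes just one:

```clojure
(defn dechunk [xs]
  (lazy-seq
    (when-first [x xs]
      (cons x (dechunk (rest xs))))))

(def n (atom 0))
(defn tap [x] (swap! n inc) x)  ; counts each element actually realized

;; chunked input: taking the first element realizes a whole 32-element chunk
(reset! n 0)
(first (map tap (vec (range 100))))
@n   ;=> 32

;; dechunk applied before map: only one element is realized
(reset! n 0)
(first (map tap (dechunk (vec (range 100)))))
@n   ;=> 1
```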


@U0NCTKEV8 good explanation and sort of what I was thinking. So doesn’t this mean that dechunk isn’t really a solution?


It's posted all over the place online - Stack Overflow, blogs, etc. But really, if you are given a chunked seq, you can't make it suddenly not chunked anymore


Is that correct? It seems to align with what you’re saying. You are only a consumer at the point of calling dechunk. If the seq given is chunked, you will still realize it in chunks


It’s almost like you can cause chunk aware things later to no longer be chunked. But you can not prevent the original chunked seq from being realized in chunks. So it could have some uses depending on what was trying to be dechunked.


Yes, it can be useful, you just need to call it as soon as possible (e.g. immediately on range or a vector) to make sure your side effecting operations happen one at a time


Indeed. Interesting. I just came across this sort of thing being used somewhere and became curious. I try to avoid needing this type of thing if possible.


(->> (range 10)
     (map prn)
     (first))
;; prints all 10 - prn runs inside the chunked map

(->> (range 10)
     (dechunk)
     (map prn)
     (first))
;; prints only the first item - dechunk was applied before map
Michael J Dorian22:08:35

Is there a way to compare two maps, using == instead of = for any entries that happen to be numeric?


you could do it by mapping over two tree-seqs I bet...

Michael J Dorian22:08:19

Yeah. Just checking if there's a better way! It seems like a common use case, but then I guess this is the first time I've needed it 😁

Michael J Dorian22:08:13

tree seq looks nifty!


can't compare a map with seqs though


cause maps have no order 🎵


what I have so far uses (comp sort seq)


but I can't just use sort because keys are heterogeneous

Michael J Dorian22:08:53

Well, here's an idea... is there a way I can just cast all numbers in a map to double?


haha, that's easier: clojure.walk/postwalk


much easier

user=> (walk/postwalk #(if (number? %) (double %) %) {:a 0 :e {:b 1}})
{:a 0.0, :e {:b 1.0}}


walk should really have a (defn transform-type [pred f] (fn [x] (if (pred x) (f x) x))) type thing
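A sketch of that helper, and how it answers the ==-style map comparison above (numeric= is a hypothetical name; coercing to double is an approximation that can lose precision for very large integers):

```clojure
(require '[clojure.walk :as walk])

(defn transform-type
  "Returns a fn that applies f only to values matching pred."
  [pred f]
  (fn [x] (if (pred x) (f x) x)))

(defn numeric=
  "Compares two nested structures, treating numbers by value (0 matches 0.0)."
  [a b]
  (= (walk/postwalk (transform-type number? double) a)
     (walk/postwalk (transform-type number? double) b)))

(numeric= {:a 0 :e {:b 1}} {:a 0.0 :e {:b 1.0}})   ;=> true
```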

Ben Sless06:08:06

Arity question regarding a more generalized version of this function:

(defn maybe-apply
  [pred f x]
  (if (pred x)
    (f x)
    x))
Would it be better for x to be the first or last argument? Why first? Because of ->, which is coherent with how other value-transforming functions work (assoc, conj, etc.)


pred, x, value goes from most general to most specific, making it friendly to partial


if you want threading, this is nearly cond-> which can be used inside ->
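For instance (note cond-> does not thread the value into its tests, so the pred names it explicitly):

```clojure
;; maybe-apply expressed inline with cond->, inside a -> pipeline:
(let [x 5]
  (-> x
      (cond-> (odd? x) inc)   ; 5 is odd, so inc => 6
      (* 10)))                ;=> 60
```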


I feel like I've written so many variations of that, and always for clojure.walk

Michael J Dorian22:08:42

Hey! Passing test!

Michael J Dorian22:08:05

That's great, I'm definitely saving this code snippet to an accessible location