
it's sad because it puts the chained exception within the ex-data in addition to chaining it as the ex-cause. I end up doing everything Alex mentioned minus setStackTrace


preserve the chained cause & ex message, but dissoc stuff from the data


for that case it doesn't matter that the outermost exception's stacktrace differs, the real meat of the stacktrace is on the chained cause


aaah, I was trying to set! the data field (which obviously didn't work), but .setStackTrace might do it. Especially if it's as simple as the example @hiredman pasted.
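Putting the pieces of this thread together, the rewrap could look something like this (a sketch based on the discussion, not anyone's exact code; `strip-data` is a made-up name):

```clojure
(defn strip-data
  "Returns a copy of ex-info `e` with key `k` removed from its ex-data,
  preserving the original message, chained cause, and stack trace."
  [^clojure.lang.ExceptionInfo e k]
  (doto (ex-info (ex-message e)
                 (dissoc (ex-data e) k)
                 (ex-cause e))
    ;; copy the original trace so the rewrapped exception doesn't
    ;; point at strip-data itself
    (.setStackTrace (.getStackTrace e))))
```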


Thank you all very much!

Michael Lan01:04:34

Within a REPL session, is it possible to add a new dependency? Using Deps and CLI.

Alex Miller (Clojure team)01:04:16

there is an experimental add-libs3 branch in tools.deps.alpha that provides this functionality. we expect it to eventually be included but there are some key integration questions we still have. Sean Corfield's deps.edn has some setup info for this


That lets me add all of next.jdbc’s test dependencies into a running REPL that was started from another project (that may well depend on next.jdbc, but without those deps). This lets me run next.jdbc’s tests from my editor, even when working on another project.

Michael Lan01:04:58

Got it, so it’s not a permanent addition, just temporary


It just loads libs into the REPL. If you want them present the next time you start the REPL, you need to add them to deps.edn.
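With that branch on the classpath, the REPL interaction looks roughly like this (a sketch based on the experimental add-libs3 branch; the exact namespace and API have moved between versions, so treat the names as illustrative):

```clojure
;; requires the experimental add-libs3 branch of tools.deps.alpha
;; on the classpath, e.g. via an :add-libs alias
(require '[clojure.tools.deps.alpha.repl :refer [add-libs]])

;; same lib-map shape as a :deps entry in deps.edn
(add-libs '{medley/medley {:mvn/version "1.3.0"}})

;; the new lib is now loadable in this same REPL session
(require '[medley.core :as m])
```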


In one of my RDD talks I show how you can edit deps.edn to add dependencies and then call add-libs on that same hash map from within deps.edn (by having some code in deps.edn that is normally commented out).


(I am tempted to automate that so I can just edit deps.edn, put my cursor just inside the :deps map or :extra-deps map, and hit a hot key and have add-libs called on that…)

Michael Lan01:04:23

haha, that sounds really convenient


Just added it to — you need the :add-libs alias from my dot-clojure repo active when starting a REPL and then you can just edit the :deps or :extra-deps in your deps.edn file and with cursor inside the lib spec hash map, ctrl-; shift+a and it sends it to add-libs and loads those dependencies!

Yehonathan Sharvit06:04:02

When I have a namespace with a couple of defmethod declarations, what’s the best way to make sure the namespace is loaded (I mean required)? For instance, let’s say I have a library whose main ns is my-lib.core and the defmethod declarations are in Should I write something like?

(ns my-lib.core
  (:require [my-lib.impl])) ; hypothetical name for the ns holding the defmethods
The problem is that when someone looks at the ns form, the require seems superfluous. And in fact cider-refactor removes it, as it considers it an unused libspec vector


I'd like to write code that behaves differently depending on the user's Java version. More specifically, I'd like to write a function that uses java.lang.module.ModuleFinder if it's available, and just returns nil if not. Any general pointers on how to approach this? If you can point me to any prior art on this topic, I'd appreciate it.


Thanks! I'll look into that.


Thanks! That looks nice and straightforward.

Alex Miller (Clojure team)12:04:47

Note that that code doesn’t exist in core anymore

Alex Miller (Clojure team)12:04:15

But feel free to borrow it


Sure. I borrowed the gist of it, that got me there.

flowthing12:04:19

The mapcat fn still reflects, and I'm not sure how to hint it when java.lang.module.ModuleReference is not guaranteed to exist, but it's not really a big deal.

Alex Miller (Clojure team)12:04:31

you could use a fn and type hint the argument


Yeah, that's what I'd like to do.


I figured I could maybe use the same kind of type hint as with string arrays etc., but it didn't seem to pan out, or I just didn't get it right. The "syntax" is a bit fiddly.
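For the general case, the fn-argument hint Alex mentions looks like this (illustrated with String rather than ModuleReference, so it runs on any JDK):

```clojure
(set! *warn-on-reflection* true)

;; unhinted: the compiler can't resolve .toUpperCase, so this call reflects
(defn shout-slow [xs]
  (map (fn [s] (.toUpperCase s)) xs))

;; hinting the fn argument lets the compiler resolve the method directly
(defn shout [xs]
  (map (fn [^String s] (.toUpperCase s)) xs))
```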

Alex Miller (Clojure team)13:04:36

I think the compiler is going to lead you into trouble in that case instead


What is the cleanest/most idiomatic way of taking an existing map with plain keywords and turning it into a map with namespace qualified keywords?

{:a 1 :b 2} => {:foo/a 1, :foo/b 2}


I am using rename-keys which works, but I am positive there’s a better mechanism for this


There was a reduce-kv based solution on stackoverflow which is similar to my rename-keys approach…


@srijayanth creating a map-vals or map-keys is like an initiation ritual for a Clojure developer since they don’t come with the standard library… 😛 I like to use for to iterate through the key-value pairs and then into to create the new map


(into {} (for [[k v] {:a 1 :b 2}]
           [(keyword "foo" (name k)) v]))
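Since this exchange, Clojure 1.11 added update-keys (and update-vals) to the standard library, which covers exactly this:

```clojure
;; clojure.core/update-keys, added in Clojure 1.11
(update-keys {:a 1 :b 2} #(keyword "foo" (name %)))
;; => #:foo{:a 1, :b 2}
```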


np. I wrote that piece of code so damn many times


fyi, as noob as my question sounds, I’ve been using Clojure since 1.2 😄


I still don’t get why rename-keys is in clojure.set. I remember somebody providing an explanation that seemed adequate at that time

Yehonathan Sharvit08:04:21

It’s because rename-keys is part of relational algebra, and the natural way to represent a relation in Clojure is as a set of maps.


don’t worry, I’ve never even used a transducer… 😛


I love transducers. I wrote this simple version of 2048 that uses transducers


yeah I know they’re great, I just don’t like how it makes my code look and I never had any real performance issues, so I never bothered to use them 😛


probably should get over it and use transducers everywhere applicable


Even without performance issues, they come in handy.


It is a lot easier to compose transducers and pass them around


right… do you have a recommended article or something for me to get into the right mindset?


I’ve seen them around in codebases I’ve been working on, so I’ve edited them… just never ran into a situation that made me think “that calls for a transducer!”


Btw. for map-vals et al I really like to use medley library. It also has map-keys:

(map-keys #(keyword "foo" (name %)) {:a 1})
;;=> #:foo{:a 1}


@jumar it’s such a tiny bit of code that pulling in a library just seems overkill 😉 but I guess if you use medley for other things, it makes sense


yeah, I use map-vals, find-first, index-by and assoc-some most frequently. What I like about map-vals in particular is that it has a much more meaningful name than the raw idiom (although once you get used to it, the idiom is also quite readable). Another aspect could be performance, but I don't care about that in most cases.


@simongray - when transducers came out, I was a little befuddled about how they work. There’s a Rich Hickey talk where he exposes the insides of a transducer (it isn’t the StrangeLoop one, I think). After that, every time I see code that has a series of 2 or more map/filter/reduce/iterative constructs, I eagerly convert them to transducers


We once had a discussion here about any scenario where a transducer might not be preferred. While someone pointed out something around eagerness/laziness etc, I came away feeling that there’s very few downsides to using transducers at all


For instance, here’s the transducer chain for that 2048 bit

(def xform (comp (remove zero?)
                 (partition-by identity)
                 (mapcat #(partition-all 2 %))
                 (map (partial apply +))))


And I end up using it as follows

(defn move-left [row]
  (take 4 (concat (sequence xform row) (repeat 0))))
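Running those definitions on a sample row (repeated here so the snippet is self-contained):

```clojure
(def xform (comp (remove zero?)
                 (partition-by identity)
                 (mapcat #(partition-all 2 %))
                 (map (partial apply +))))

(defn move-left [row]
  (take 4 (concat (sequence xform row) (repeat 0))))

;; [2 2 0 4]: drop the 0, group (2 2) and (4), sum each group
(sequence xform [2 2 0 4]) ;; => (4 4)
(move-left [2 2 0 4])      ;; => (4 4 0 0)
```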


If you see the xform, it reads incredibly clearly from top to bottom


remove zeroes, partition equal numbers together, chunk them in 2s and sum them up


I eventually wrote my own stateful transducer that pads the ends of collection with zeroes. I stuck that transducer to the end of the chain
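The padding transducer itself isn't shown in the thread; a sketch of one (my own guess at an implementation, with the made-up name `pad-to`) could look like:

```clojure
(defn pad-to
  "Stateful transducer: after the input is exhausted, emits `pad`
  until `n` items total have been produced. Inputs longer than `n`
  pass through unpadded."
  [n pad]
  (fn [rf]
    (let [seen (volatile! 0)]
      (fn
        ([] (rf))
        ([result]
         ;; completion arity: flush any missing padding, then complete
         (rf (reduce rf result (repeat (max 0 (- n @seen)) pad))))
        ([result x]
         (vswap! seen inc)
         (rf result x))))))

(sequence (pad-to 4 0) [4 4]) ;; => (4 4 0 0)
```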


thanks a lot


I guess I just need to get into the habit of converting most functions that contain a threading macro into composed transducers


Yeah, my personal limit is more than 2. if there’s more than 2 maps/filters in sequence, then that’s an ideal candidate for transducers
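As a before/after of that conversion (a toy pipeline; same result both ways):

```clojure
;; seq version: each step allocates an intermediate lazy seq
(->> (range 10)
     (filter odd?)
     (map inc)
     (into []))
;; => [2 4 6 8 10]

;; transducer version: one pass, no intermediate sequences
(into [] (comp (filter odd?) (map inc)) (range 10))
;; => [2 4 6 8 10]
```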


It really is a shame that not enough people use it. I find it awesome


I transduced the hell out of those advent of code problems


I like this discussion about when to use transducers:!topic/clojure/JjiYPEMQK4s I think it's very helpful for understanding the use cases even without knowing how exactly they work. Especially Alex Miller's points:
> I would say transducers are preferable when:
> 1) you have reducible collections
> 2) you have a lot of pipelined transformations (transducers handle these in one pass with no intermediate data)
> 3) the number of elements is "large" (this amplifies the memory and perf savings from #2)
> 4) you want to produce a concrete output collection (seqs need an extra step to pour the seq into a collection; transducers can create it directly)
> 5) you want a reusable transformation that can be used in multiple contexts (reduction, sequence, core.async, etc)

👍 3

I rarely need reusable transformations, but maybe I just haven’t looked hard enough.


that’s what I mean about the mindset. Gonna scan that thread for some rules of thumb.


but obviously memory savings is advantageous too


and @srijayanth is right in that lazy collections are rarely called for in practice


They are useful as hell though 🙂


I can think of maybe 4-5 cases where I actually use lazy (usually infinite) collections


and I’m pretty sure I only remember them because they are noteworthy for being the rare infinite colls


I end up using cycle a fair amount. It is sometimes surprising how often that pattern shows up


You can make pretty cool spinners with a simple cycle


yeah, in one case I use cycle to take N items from a randomised colour selection


to colour some tabs in a CLJS lib I make


and then iterate is another common case of infinite colls


(cycle "|/-\\")
A nice classic spinner 🙂
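A minimal sketch of such a spinner (just a toy blocking loop; `\r` rewrites the same terminal column each tick):

```clojure
;; print 8 frames of the classic spinner, one every 100 ms
(doseq [c (take 8 (cycle "|/-\\"))]
  (print (str "\r" c))
  (flush)
  (Thread/sleep 100))
```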


hah, nice one


the other way is to have different css styles/classes cycled


You can also make those annoying prehistoric marquees from the geocities days


Transducer chains aren't much harder to read than ->> chains, so I would also prefer transducers even for not-so-performance-sensitive code. Premature optimization is one thing, but wasting cycles for no good reason is another. Like after Java 8, people realized that it makes no sense to go converting most loops to Streams, because you probably lose 20-30% on performance and simple loops read just fine

Ben Sless11:04:12

Transducers also usually get you to stop and think for a few seconds, which is always good when programming. How many times did you find in code review someone used flatten when mapcat would have been fine? I often see it pop up when there's some confusion. While worrying about optimization prematurely is silly, so is shooting oneself in the foot from the onset


Yeah, flatten is often a red flag, and not because it is slow but because you have to ask whether a deep flatten is even correct...

Ben Sless12:04:15

yup, like I said, it's usually a place where mapcat should be used, but somewhere the dev lost control of the return type. I saw it recently with a function which returned a sequence of maps: I simplified some things, switched to mapcat, and broke a test, because the test mocked the function to return a map instead of a sequence of maps. That function was then mapped over the input sequence, which made flatten work and mapcat not. It always feels good to fix wrong assumptions in tests on top of offensive functions

Ben Sless12:04:03

whenever I see flatten I read "I'm not sure what's going on here, yolo"


Personally, if there’s more than one transformation happening, I switch to transducers.

☝️ 3

@nilern you’re saying premature optimisation is a good thing?? 😛


Not being wasteful by default is quite far from needless micro-optimizing or complicated algorithms


anyway, you’re probably right


I actually thought that Java 8 streams were kinda transducer-like, i.e. they were more performant than regular loops


but obviously it’s not something I ever researched very deeply 😛


like I seem to remember something something parallelism, laziness, blablabla


so how do you deal with dependency cycles? I have this stateful "core" of what I'm writing at the moment, and an external service living two namespaces away, each depending on the previous. The service then needs to update the state. Enter dependency cycle.


you can extract everything related to state, including the functions to update it, into a separate namespace and then require it from every service and the core.


another option is to write code in terms of a protocol, have a shared protocol namespace, and a separate implementation namespace that most consumers never need to care about
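Both suggestions break the cycle by making the shared piece a leaf namespace. A sketch of the state-namespace version (all names invented):

```clojure
;; myapp/state.clj — leaf namespace, depends on nothing else in the app
(ns myapp.state)

(defonce app-state (atom {}))

(defn update-state! [f & args]
  (apply swap! app-state f args))

;; myapp.core and myapp.services.external can now both
;; (:require [myapp.state]) without requiring each other,
;; so the cycle disappears.
```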


If there is a lot of data you can utilize more cores with .parallel(). But in the single-threaded case, transducer/Stream pipelines only beat handwritten code when the loop is too gnarly to do everything in one pass, so intermediate collections get added


What’s better:

(vec (map-indexed
      (fn [idx x] [idx x])
      [:a :b :c]))

((comp vec map-indexed)
 (fn [idx x] [idx x])
 [:a :b :c])


(assuming I’m not using transducers)


beauty is in the eye of the beholder 🙂

👌 3

Goal is to have a vector at the end.


In that case comp is just pointless obfuscation

🆗 3

Imo probably the first one, but there's also

(into []
  (map-indexed (fn [idx x] [idx x]))
  [:a :b :c])
(I'd still probably default to just wrapping the map-indexed call with vec)

👌 6

user=> (reduce-kv (fn [acc i v] (conj acc [i v])) [] [:a :b :c])
[[0 :a] [1 :b] [2 :c]]


Do you prefer that over all other options, in terms of readability or even shortness? Or you’re just showing other options 🙂


Or is this faster? (I haven’t done benchmarks on reduce-kv)


In terms of readability, I would not think about it at all and just extract it into a well-named function. :)

👌 3

If you really want to optimize (persistent! (reduce-kv (fn [acc i v] (conj! acc [i v])) (transient []) [:a :b :c]))


into does exactly that. :)


It uses normal reduce but yeah


If the goal was to produce a map then reduce-kv can avoid allocating those kv pairs
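For example, building a map keyed by index without creating [k v] pair vectors first:

```clojure
;; reduce-kv passes the index and value as separate arguments,
;; so no intermediate [i v] vector is allocated per element
(reduce-kv (fn [acc i v] (assoc acc i v)) {} [:a :b :c])
;; => {0 :a, 1 :b, 2 :c}
```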

👍 3

Vectors are associative collections from an index to a value.


(into [] (map-indexed vector) [:a :b :c])


This gives a wrong result though. Ah, the edit. :)


Sorry about that


A map result could also be handy (zipmap (range) [:a :b :c])


I’m looking for the neatest code for an indexed vector for React components; map is no-go there.


Needs order.


And idx accessible on each one.


In that case, not handy


In any case, thanks for the input!


(mapv vector (range) [:a :b :c])


But beware that mapv is eager

Alex Miller (Clojure team)16:04:12

It was in #announcements yesterday


Fogus wrote this one. I am going to read tea leaves and say that Alex was busy getting spec 2 ready. Sometime this week? 😉

🙏 9

Thanks for publishing the survey and the write-up @fogus!



user=> (keyword "" (str 100))
:/100
user=> :/100
Syntax error reading source at (REPL:2:0).
Invalid token: :/100


I can obviously write a fn to normalise this, but is there any real way to deal with numeric keys?

Timur Latypoff16:04:30

I remember someone from the Clojure team was saying that keywords should not start with a digit (like symbols), but the ability to construct them has been retained for backward compatibility.


The problem I have is that I am receiving a json that I am then namespacing


As I said, I can always use a fn to prefix something
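A small illustration of the problem and the prefix workaround (the "n" prefix here is an arbitrary choice):

```clojure
;; the keyword can be created programmatically...
(name (keyword (str 100)))    ;; => "100"
;; ...but its printed form :100 can't be read back as source.
;; Prefixing sidesteps the numeric-first-character problem:
(keyword "foo" (str "n" 100)) ;; => :foo/n100
```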

Timur Latypoff16:04:40

You can use non-keyword keys (string or integer keys) just fine, if that works for you @srijayanth


yes, I’ll just add a prefix

Michael Gardner18:04:28

I seem to recall Rich saying that he might've used transducers as the basic abstraction for Clojure's collection APIs instead of seqs, if he'd thought of them back then. Am I remembering correctly? I'm very curious what that would look like, if so.


Sounds strange. Sometimes you really need first and rest -- parsing comes to mind.


Parsing-style usage is an extreme case, but stopping processing early or processing multiple collections gets awkward. There are solutions like reduced or various flatmapping and zipping operations, but even if the little extra allocations go away after inlining, the pipeline abstraction stops being a friend at some point. I have used a lot of in OCaml but Java 8+ Streams seem similar.


@U01R1SXCAUX Yes I clearly recall this from a talk; I can’t remember which one but that sounds about right;


@nilern I am not so sure about rest ; perhaps it’s a coding style, but I don’t really use it often in day-to-day code; It does feel pretty low level to me (as in, not something I’d default to);


From the History of Clojure paper: > I think transducers are a fundamental primitive that decouples critical logic from list/sequence processing and construction, and if I had Clojure to do all over I would put them at the bottom.


Already in my PHP days I noticed that web apps are mostly a bunch of map, filter and doseq; even reduce feels quite unusual. Libraries and compilers are not so straightforward.

nilern18:04:29

Pixie put transducers more at the bottom; but it still has seqs


I don’t recall him saying he wouldn’t have seqs; I remember it was a short comment/sentence rather than a fully articulated argument; Don’t want to speculate on what the exact idea of transducers vs. seqs would be;


@nilern My language before Clojure was also PHP!


The 2nd best! 😝

🏆 3

> I was taught assembler in my second year of school.
> It's kinda like construction work, with a toothpick for a tool.
> So when I made my senior year, I threw my code away,
> And learned the way to program that I still prefer today.

😄 3
Michael Gardner19:04:10

yeah, that History of Clojure quote is likely what I was thinking of. I'll take a look at pixie


I think he said something along the lines of "I wouldn't have made sequences lazy by default if I'd thought of transducers first"

Ben Sless19:04:00

dedupe is a good example of how things would have looked differently had transducers been around from the start (probably)

Ben Sless19:04:26

The arity which accepts a collection is just:

([coll] (sequence (dedupe) coll))
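For comparison, the two entry points that arity structure gives you (collection arity vs. transducer arity):

```clojure
;; collection arity: returns a lazy seq
(dedupe [1 1 2 2 3 1])           ;; => (1 2 3 1)

;; transducer arity: composes into any reducing context
(into [] (dedupe) [1 1 2 2 3 1]) ;; => [1 2 3 1]
```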


I thought it might've been in this talk, not sure


@U01R1SXCAUX regarding: "I'm very curious what that would look like, if so." IMHO: In this fantasy, not too much would be different about the Clojure core (some of it could've been elided, though). clojure.core/into with a transducer is a great way to build a collection today (perhaps even The Best Way), but seqs are still more 'within reach' in some way that involves the momentum the community has thinking in seqs, and plenty of code already written that way, more than it involves anything about the shape of the core libraries today. My intuition is that, in this fantasy world, when building collections almost everybody would be reaching for into + transducers now, instead of the ->> full of seq transformations; that seqs would be much less common; and that the core would be a little smaller without e.g. the non-transducer arities of map and filter plus the core.async versions of the same... So the core could've been a little simpler that way, but I suspect we're not missing out on much.

Jan K20:04:17

Sometimes the transducer solution forces you to hide mutable state in them, where with reduce you could just use the state argument, which feels like "cleaner" FP. I wonder if it would be possible to reinvent transducers so they would get a state argument passed in (like reduce) to somehow avoid mutation inside.
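The contrast in question, using a simplified distinct as the example (the transducer below is a stripped-down version of what clojure.core/distinct's transducer arity does internally):

```clojure
;; reduce: the state [seen out] is threaded explicitly through the fn
(second
 (reduce (fn [[seen out] x]
           (if (contains? seen x)
             [seen out]
             [(conj seen x) (conj out x)]))
         [#{} []]
         [1 2 1 3]))
;; => [1 2 3]

;; transducer: the same "seen" state hides in a volatile! inside the xform
(def distinct-xf
  (fn [rf]
    (let [seen (volatile! #{})]
      (fn
        ([] (rf))
        ([result] (rf result))
        ([result x]
         (if (contains? @seen x)
           result
           (do (vswap! seen conj x)
               (rf result x))))))))

(into [] distinct-xf [1 2 1 3]) ;; => [1 2 3]
```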


Might as well slap an arbitrary monad in there while you are at it 😜


You could replicate something like that with a transducer like the xforms library's reductions


you would need the reducing machinery to be aware of the state attached to the pipeline of transducers


If a local state is mutated in the forest and no one is around to observe it, does it make a sound?

😄 6

The state is not as encapsulated (in the sense of the ST monad) as, say, the transients inside into, but I think transduce, into, et al. do encapsulate it, so it is only a problem when calling transducers directly, which basically only happens in library code


I do think that state in transducers does kind of make it harder to have a parallel transduce, no? Similar to the reducers fold?


I guess so although things like drop are not parallelizable anyway