2024-06-06
I’m curious whether there is any documentation around the introduction of the #(...) reader macro. I’m sort of assuming that (fn [..] ..) already existed, and wondering why this extra syntax was introduced?
It was there in Clojure 1.0, so you'd have to go back further than that to find anything about its introduction.
This is an early (first?) announcement of it https://groups.google.com/g/clojure/c/uDEbsBN_HpM/m/1kVI3MPs7pUJ
My read of the thread is that (fn [...] ...) did exist first

Kind of ironic that this particular shorthand syntax was motivated by the Java side of Clojure's heritage, rather than CL (which only had lambdas, I guess?). Reverse of how I usually think about Clojure's heritage of concision.
It may help to remember, when considering the motivation for #(...), that anonymous functions were needed in more places back in the days before Clojure 1.12.
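For context, a minimal illustrative pair (my own example, not from the thread) showing that the two forms are interchangeable:
#(+ % 2)          ;; shorthand: % is the single argument
(fn [x] (+ x 2))  ;; the explicit equivalent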
I met up with @U03B2SRNYTY and his friend yesterday for coffee in London (I'm visiting there) and talked about YAMLScript. YS uses a \() syntax because using # is problematic in YAML (it's the comment char) and also because Haskell uses \ for anonymous functions (see https://wiki.haskell.org/Anonymous_function). We were agreeing that it wasn't really that useful given (fn […] …).
@U0HG4EHMH can you elaborate on the 1.12 comment, for us younger Clojurians? 🙂
@U05H8N9V0HZ
> Clojure programmers often want to use Java methods in higher-order functions (e.g. passing a Java method to map). Until now, this has required programmers to manually wrap methods in functions [i.e. #(...)]. ... With this [1.12] release, programmers can now use Java qualified method symbols as ordinary functions in value contexts - the compiler will automatically generate the wrapping function.
https://clojure.org/news/2024/02/08/1-12-alpha6#method_values
Ah, dot interop stuff?
so (map #(Integer/parseInt %) xs) used to be necessary, and now (map Integer/parseInt xs) suffices
gotcha
thank you
you're welcome 🙂
FWIW @U05H8N9V0HZ I’m in agreement with you
> We were agreeing that it wasn’t really that useful given (fn […] …).
And this was in many ways why I posted the original post: “Why, when given the (fn […] …) form, would one introduce #(…% …)?” I would not have chosen to do so, and I’m somewhat surprised that Rich chose to introduce it, as it saves a couple of chars of typing at the cost of introducing new syntax and concepts.
Fun fact: In YS you can't use % for %1 in \(…) because I wanted % to be the rem operator:
$ ys -c -e 'a =: b % (c %% d)'
(def a (rem b (mod c d)))
> saves a couple of chars of typing at the cost of introducing new syntax and concepts
:destructuring enters the chat:
IMO destructuring provides much more value, but doesn’t really introduce new syntax, as the destructuring thing is “just” a map, albeit with some specific keys and semantics.
(defn foo [{:keys [bar baz] :as qix}] ...)
tells me that bar and baz are the important bits of qix in this fn.
I think it's a matter of taste. Concision is an important Clojure value and I appreciate not needing to name my parameters in #(...).
Destructuring feels to me like one of the largest areas of Clojure syntax. There are just so many options to combine in myriad ways. (And it's not just a map — sequential destructuring is a vector.) I see the benefits but it can get gnarly.
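To make the vector point concrete, a small illustrative example (the names and data are made up):
(let [[x y & more :as all] [1 2 3 4]]   ;; sequential destructuring uses a vector, not a map
  [x y more all])
;; => [1 2 (3 4) [1 2 3 4]]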
Yesterday we really got into destructuring. I personally (having used destructuring in many langs) find
(let [{:keys [a b c]} d]
and want to be able to write
.{a b c} =: d
but then @U03B2SRNYTY introduced me to https://github.com/noprompt/meander?tab=readme-ov-file#meander%CE%B5 which is over the top!
Totally agree that destructuring gets gnarly very fast. I very seldom do much more than one level of destructuring; I find it both unreadable and a sign that you're not really encapsulating stuff well enough
Also read or watched somewhere (maybe Rich?) that destructuring can bind you too tightly to a given API.
Around here, we believe that destructuring is OK in moderation. After all, any construct can be overused to the point of illegibility... e.g., after forty levels of nested vectors, a person might lose count. Even water is bad, if you drown in it.
I use it whenever I can (shallowly) and have yet to drown.
I'm using tools.cli and have a boolean option --model-warnings with a :default true. Is it possible to set it to false at the command line? --model-warnings false or --no-model-warnings don't work. Any ideas?
[nil "--model-warnings" "Returns warnings for the loaded model" :default true]
in https://github.com/soulspace-org/overarch/blob/main/src/org/soulspace/overarch/adapter/ui/cli.clj
Overlooked it. I knew it should be possible somehow.
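(For readers of the log: the feature presumably being referred to is tools.cli's negatable long-option syntax, --[no-]option; a sketch adapted from the spec above:)
;; accepts both --model-warnings and --no-model-warnings on the command line
[nil "--[no-]model-warnings" "Returns warnings for the loaded model" :default true]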
Could anyone please let me know if there is an open-source tool to measure the Cognitive Complexity of Clojure code? https://www.sonarsource.com/blog/cognitive-complexity-because-testability-understandability/
In the TypeScript ecosystem, I used sabik https://github.com/ytetsuro/sabik
It generates a report like this
You can try searching in #C06MAR553 and with https://phronmophobic.github.io/dewey/search.html. But FWIW I've never heard about such a tool for Clojure. Maybe Sabik can be adapted to Clojure, dunno.
Thanks! There is https://github.com/lokori/uncomplexor, which measures a Cyclomatic Complexity-like metric. I may have to make it myself; it's on my "Someday/Maybe" list 😂
given how simple the rules are, it might be relatively easy to use something like rewrite-clj's https://cljdoc.org/d/rewrite-clj/rewrite-clj/1.1.47/api/rewrite-clj.zip to traverse the forms and perform the counts.
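A rough sketch of that idea (not SonarSource's actual algorithm; it just counts occurrences of a hand-picked set of branching symbols, and rough-complexity is a made-up name):
(require '[rewrite-clj.zip :as z])   ;; needs rewrite-clj on the classpath

(def branching '#{if if-not when when-not cond condp case loop recur})

(defn rough-complexity [source]
  ;; depth-first walk over every form in the source string,
  ;; incrementing the score for each branching symbol encountered
  (loop [zloc (z/of-string source) score 0]
    (if (z/end? zloc)
      score
      (recur (z/next zloc)
             (if (and (= :token (z/tag zloc))
                      (contains? branching (z/sexpr zloc)))
               (inc score)
               score)))))

(rough-complexity "(defn f [x] (if (pos? x) (when x :a) :b))")
;; => 2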
that said, I think that a purely syntactic measure of "cognitive" complexity is not likely to correlate that much with how maintainable code is, even if it may sometimes be a useful rule of thumb for refactoring.
"Incent good coding practices" => "ignore features that make code more readable"...what? > "this operator makes things immediately clear, so it's ignored" At this point, I feel like I'm missing something. As if someone's telling me that plain water always tastes salty.
I would understand if "Cognitive Complexity" meant the understanding of what's going on at the conceptual level - not at the language level. But the section in the screenshot talks about "good coding practices" and... some other things.
Yeah... it's like "we like this feature and so to promote its use we are going to ignore our criteria for it"
I don't even disagree with having a heuristic, but calling it "cognitive complexity" is too far.
?. is controversial specifically because, while it makes code easier to write, it's a very small visual signal for a branch in logic, i.e. harder to read
Ah, then my understanding of the screenshot is probably wrong.
I thought it was saying that there's no distinction of the two versions in their cognitive complexity. Whereas I myself would say that the version on the right is less complex.
?. is exactly the same as some-> and I myself don't see either of them as a branch in logic.
To me, it's more similar to multiplication - you don't expand * into things like "multiply absolute values of all the arguments, count the number of negative signs, assign the negative sign to the result if the count is odd".
In other words, I view ?. as a single operation that's conceptually different from an if even if that if is syntactically equivalent to ?.
It's the same with nil punning. Or with NULL propagation in SQL. And with probably many other things I'm not even aware of.
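To spell out the comparison (person, :address, and :city are hypothetical data, not from the thread):
;; roughly what person?.address?.city expresses: stop and return nil
;; as soon as any step yields nil
(some-> person :address :city)

;; the conditional form this particular chain is equivalent to
(when-some [address (:address person)]
  (:city address))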
Imagine if we had a Clojure dialect where for every function we could add ?. to the end to make it nil-safe.
That would be hard for me to gauge based on imagination alone, because habits and pattern recognition take time to form. It might very well be something completely fine, and maybe even something the other me would prefer over current Clojure if that dialect was the one he started with.
there is a big advantage in keeping things simple. Giving up the slightest bit of readability leads to some-> being an easily written macro as opposed to a change in the syntax specification of the language.
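As an illustration of how little is needed, a stripped-down some->-style macro (my-some-> is a made-up name; clojure.core/some-> is the real, more general one):
(defmacro my-some-> [x & forms]
  ;; thread x through forms, short-circuiting to nil when any step is nil
  (if (empty? forms)
    x
    `(let [v# ~x]
       (when (some? v#)
         (my-some-> (-> v# ~(first forms)) ~@(rest forms))))))

(my-some-> {:a {:b 1}} :a :b)   ;; => 1
(my-some-> {:a nil} :a :b)      ;; => nil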
But there are also disadvantages, otherwise we would be all spitting out 1s and 0s which are the epitome of simplicity. :)
A comment on the edit - that's a problem not of the change but of the language. We have macros, so many things are much, much easier for us.
and implementing as many language features as possible as macros rather than at what is effectively the textual level makes it much easier to keep that advantage!