
Given Rich’s apprehension about positional parameters, what’s the reason for not having keyword parameters first class with fn/`defn`? Is giving it a map + destructuring considered first class/good/obvious enough that we should be prioritizing it? Does it generate overhead? The standard library is more or less devoid of this pattern, why? (Re-watching Effective Programs, where positional parameters aren’t exactly praised)


user=> (defn foo [& {:keys [x y]}] [x y])
user=> (foo :y 1 :x 2)
[2 1]


I’m aware 👍


you can express keyword parameters this way, but explicit maps are preferred because they compose


you can (merge default x) a map, not so much with kw args. also you need apply-kv etc
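
The merge/apply point can be sketched like this (names and defaults here are illustrative; note that in Clojure 1.11+ kwargs fns also accept a trailing map):

```clojure
;; Options-map style: defaults are just another map to merge.
(def defaults {:retries 3 :timeout 1000})

(defn connect [opts]
  (let [{:keys [retries timeout]} (merge defaults opts)]
    [retries timeout]))

(connect {:timeout 50}) ;=> [3 50]

;; Kwargs style: forwarding an existing options map needs an
;; apply-kv-style flattening (pre-1.11, where trailing maps aren't accepted).
(defn connect-kw [& {:keys [retries timeout]
                     :or   {retries 3 timeout 1000}}]
  [retries timeout])

(apply connect-kw (mapcat identity {:timeout 50})) ;=> [3 50]
```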

Alex Miller (Clojure team) 15:11:47

we've talked about expanding options in this area


Color me intrigued.


@bronsa I’m not saying that there isn’t a way to express it, just that I don’t see it happening very much, and that the standard library (which I use as a personal reference for what a Good Function looks like) mostly doesn’t


clojure.core does use kw args on macros, mostly


see for/`doseq`/`core.async`/`condp` etc


there used to be a doc on confluence about why maps were (generally) preferred over kw args


no idea where that's gone

Alex Miller (Clojure team) 16:11:16

you might be thinking of this list of advice (although it doesn't contain that)


oh yeah that's the one I was thinking about


is confluence not a thing anymore then?

Alex Miller (Clojure team) 16:11:17

clearly you weren't missing it :)


once a year is about the frequency i checked it

Alex Miller (Clojure team) 16:11:49

so you were a heavy user then

Alex Miller (Clojure team) 16:11:07

the maintenance required had surpassed its usefulness :)


I used to favor "keyword arguments" (and in an ideal world I still would), but their performance is inferior, as some gist showed; it had to do with apply, IIRC. So an 'options' argument isn't tremendously different (in terms of PLOP-ness), but it's more performant. And in addition to the mentioned merge-ability, IDEs can vertically align options more easily than kwargs.


Wouldn't the performance depend entirely on how the code was processing them?


Hum.... interesting. Did some quick benchmark and they are slower.

👍 4

For most code it won't matter. But I guess if you were concerned, keeping their use to macros and using map options for functions would make sense.


> For most code it won't matter
obviously a very subjective topic :) for example at work we have essentially zero reflection warnings, other than those brought in by external deps. In other aspects we also default to performance; why would we have Ruby-like perf when we can have Java-like perf instead? Especially when it's a few keystrokes away. As mentioned in my OP I actually prefer kwargs from an aesthetic point of view, but eventually I just had to swallow the pill of performance, however bitter :)

👍 4

I don't disagree, if you have performance needs. At my work we care more about scale; a few extra ms per request doesn't matter much. But if we had a hot loop making a ton of repeated calls to such a fn, and we saw it was affecting our SLAs, we'd probably do some performance analysis, and that could show that a switch to maps would make things faster


Ruby is still orders of magnitude slower though, even compared to non-optimized Clojure code


I also don't disagree with you ;p There's some power in having defaults that completely kill a) subjective arguments, and b) the possibility of ever having to profile a specific kind of problem. i.e. I'd rather make microinvestments today than possibly have to go on a perf hunt tomorrow


I'm curious: apart from type hinting to get rid of reflection and using maps over kwargs, is there anything else you try to avoid ahead of time?


Off the top of my head, not much. We try to favor (into [] (map f) ...) for avoiding intermediate collections, but honestly only if it's not too hard for the given case. The most interesting thing we do is a perf-specific pass in the code review process. In practice we only apply it in big PRs/features, since otherwise it's too expensive to go checkbox by checkbox for smaller PRs.


Nice checklist!


Ya, I was wondering if you also tried using transducers instead of lazy seqs


By the way, though I never benchmarked it, I think for a case like (into [] (map f) ...) you can just use mapv and I think it'd be just as fast.


yes, we use mapv/filterv, the transducer fanciness is only for when those can't do


e.g. for building sets
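
For the common vector-producing cases the shorthand and the transducer form are equivalent; the transducer form earns its keep when the target isn't a vector or when stages compose. A minimal sketch:

```clojure
;; mapv/filterv cover the common vector-producing cases:
(mapv inc [1 2 3])          ;=> [2 3 4]

;; (into [] (map f) coll) does the same via a transducer:
(into [] (map inc) [1 2 3]) ;=> [2 3 4]

;; ...but into + transducers also builds other collections,
;; with the composed stages fused into a single pass:
(into #{} (comp (filter even?) (map inc)) (range 6)) ;=> #{1 3 5}
```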


> I was wondering if you also tried and used transducers instead of lazy seq
relatedly, recently I was wondering: for the following defn:

(defn foo [coll]
  (filter even? coll))
would the sane default be filter, filterv, or... no coll arg at all? i.e. return a transducer and be unopinionated, letting people be lazy or eager as they please


(of course, the snippet is trivial, but in practice many business-specific defns are like that)


Well, since by sane you mean performant, I think transducer would be best


Since they can do loop fusion when composed, which filterv won't


Even better if you support both transducer and lazy, with the overload. In case people need true laziness for computations that don't fit in memory.


Like when called with a coll it's lazy, when called without it returns a transducer
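
The clojure.core-style signature being described would look roughly like this, using the trivial foo from above:

```clojure
(defn foo
  ;; no coll: return a transducer, like (filter pred) in core
  ([] (filter even?))
  ;; with coll: behave lazily, like the seq-returning arity
  ([coll] (filter even? coll)))

(take 2 (foo (range)))     ;=> (0 2)  -- lazy, works on an infinite seq
(into [] (foo) (range 10)) ;=> [0 2 4 6 8]  -- eager, via the transducer
```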


It's more work though, so if you know one will never be needed I wouldn't bother


> Like when called with a coll its lazy, when called without it returns a transducer
ah, interesting idea on mimicking clojure.core's signatures. That way one doesn't surprise people too much


I think I'll give a shot to always returning transducers and see how that works out


In general I find functions returning functions always makes things harder


Higher-order functions overall add quite the cognitive load. I find humans aren't good at thinking in such higher orders. I think when designing functions that use them, you must be sure that their use is either self-contained, or encapsulated behind another simpler interface.


But when people get real clever with them, it can be quite challenging


Someone should coin a pasta for that kind of code... maybe Cannelloni code: code that makes use of too many higher-order functions.


There's already Spaghetti for code that's too interwoven, Lasagna for code that has too many useless layers, Ravioli for code that has too many small functions. So I think Cannelloni could work for code that gets too clever with higher-order functions 😋

🍴 4

"flour code" -- code that doesn't produce pasta. It only eventually becomes pasta when combined with other code later.


who knew coding was so isomorphic with Italian food

😄 4

functions returning functions seems a bit of a trivial case (and a design that is necessary for e.g. the middleware pattern), but I see what you mean. The usual suspect would be something like chaining juxt and comp... I think I would appreciate a detailed/nuanced essay on the topic. I have seen articles suggesting HOFs are outright wrong; quite obviously that's not going to cut it among functional programmers :)


In this case it's essentially a partial where one of the arguments is expensive to calculate
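
i.e. something along these lines (all names here are hypothetical, for illustration):

```clojure
;; `compile-rules` stands in for the expensive step: pretend this is
;; costly; here it just builds lookup predicates from a spec map.
(defn compile-rules [specs]
  (mapv (fn [[k v]] #(= v (get % k))) specs))

;; Close over the precomputed rules -- essentially a partial.
(defn make-validator [rules]
  (fn [m] (every? #(% m) rules)))

;; Pay the expensive cost once, reuse the returned fn many times.
(def order-valid? (make-validator (compile-rules {:status :open})))

(order-valid? {:status :open})   ;=> true
(order-valid? {:status :closed}) ;=> false
```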


Middleware and Interceptor are both use of them that I consider "encapsulated behind an interface". Same for transducers etc.

👍 4

Like they're a very well contained use, with rules of how to use them, so they don't leak anywhere else


And map, for, reduce are self-contained


But I've seen code that for example, should be using a multi-method or polymorphism, but instead it uses HOF.


Or say, something that takes an fn that takes an fn and returns an fn.


Mixed with some config file where you can put the fn name you want and some dynamic calls to resolve 😋


a lot of stuff written as multimethods or protocols would be better off passing a function


Hum... interesting


Any reasoning?


I thought generally the Clojure community agreed about data > functions > macros


multimethods and protocols are all entangled in global systems


passing a function is an entirely local decision


if I want to run two copies of some code and have them behave differently, it is easy to pass in a different function to each copy, it is harder to get that working when it relies on global state
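
A minimal sketch of that locality (names here are illustrative):

```clojure
;; Behavior is passed in, so each copy is configured locally:
(defn process [handle-item coll]
  (mapv handle-item coll))

(def copy-a (partial process str))  ; one copy renders items
(def copy-b (partial process inc))  ; the other transforms them

(copy-a [1 2]) ;=> ["1" "2"]
(copy-b [1 2]) ;=> [2 3]

;; With a global multimethod, both copies would share one dispatch
;; table, so making them behave differently means touching global state:
(defmulti handle :type)
(defmethod handle :a [item] (str (:val item)))
```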


Hum... that's true. But why do you need two copies that are both local only?


it happens a ton where you create something that does X, business requirements shift so now you need X and Y, and the thing that does X could also do Y if it was just run with different parameters


I find I prefer my functions to be global, have a name, a doc-string, have been tested, etc. And unlike state, they don't suffer from coordination issues.


libraries clash over who extends what protocol to what type


what the behavior of a global multimethod should be for a given dispatch value


Hum... what about metadata extends?


who calls which protocols and is responsible for aot'ing what


metadata extends is an attempt
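
For reference, metadata-based extension (Clojure 1.10+, protocols declared with :extend-via-metadata) is per-value rather than per-type, which is what makes it an attempt at the locality problem. A small sketch with illustrative names:

```clojure
(defprotocol Startable
  :extend-via-metadata true
  (start [this]))

;; Extend a plain map via metadata, keyed by the namespaced fn symbol:
(def component
  (with-meta {:name "db"}
    {`start (fn [this] (str "starting " (:name this)))}))

(start component) ;=> "starting db"
```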


but nothing is as general as a function


lambda the ultimate: ...


I mean, passing in a function to another function is just creating your own custom dispatch


your own private non-global unshared


No, the function that takes the fn as an argument is the dispatch fn in that case. It takes an fn and chooses when to call it


But anyway, I do see your point


are you saying it wouldn't choose when to call protocol functions?


I've never had clashes with protocol extensions or multi-method extensions. But I can see how it could happen


anyway, I write protocols, I write multimethods, but nothing has given me the code reuse and interoperability that passing functions does


Why do you still use multi-methods though?


I think you probably just have the intuition to know when HOFs are appropriate and when they would make things more complicated than they're worth.


generally I don't write them as external extension points, so they aren't like the "interface" for something, internally they do dispatch on input data to do whatever


It's definitely not a problem I had thought of before: that two things could extend the same protocol to a type for the same fn, or the same multi-method for the same data


I'll need to think more about it


I think my issue with using fns instead is their loose definitions. Like, what if you wanted to know all the possible fns being passed in? Say you were reading a new code base with HOFs: that's a lot harder to figure out, and then to understand the possible branches of logic.


Even Protocols and multimethods are open systems. You won't (and can't) know all implementations of the abstraction. Pattern matching is more appropriate in places where you "know" beforehand that the possible set of impls is closed.


Most of the times, when I think hard enough about it, most things I thought were closed are really open, by essence.


I'm talking when reading code. It's easy to find all defmethods.


So understanding the cases from reading a code base is easier.


You know the pattern of extension, know where to look and what to look for.


And generally, in a given code base, people will put all their multi-method extensions in the same place


But I guess with discipline, that problem could be eliminated. Like if the passed in HOF are all defined with defns in a common place, with doc and tests.

David Pham 22:11:07

Passing around lambda functions might be hard to debug though.