This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2019-11-18
Channels
- # aleph (4)
- # announcements (2)
- # babashka (48)
- # beginners (59)
- # calva (5)
- # cider (14)
- # clj-kondo (4)
- # cljs-dev (3)
- # clojure (77)
- # clojure-europe (6)
- # clojure-italy (6)
- # clojure-nl (5)
- # clojure-spec (4)
- # clojure-uk (67)
- # clojurescript (19)
- # clr (3)
- # cursive (7)
- # datomic (36)
- # duct (33)
- # events (3)
- # figwheel (1)
- # fulcro (4)
- # funcool (2)
- # graalvm (3)
- # jobs (1)
- # joker (25)
- # kaocha (1)
- # leiningen (45)
- # malli (17)
- # off-topic (103)
- # quil (1)
- # re-frame (16)
- # reitit (1)
- # rewrite-clj (27)
- # shadow-cljs (39)
- # spacemacs (3)
- # sql (11)
- # tools-deps (14)
- # vim (41)
Given Rich’s apprehension about positional parameters, what’s the reason for not having keyword parameters first class with `fn`/`defn`? Is giving it a map + destructuring considered first-class/good/obvious enough that we should be prioritizing it? Does it generate overhead? The standard library is more or less devoid of this pattern; why?
(Re-watching Effective Programs, where positional parameters aren’t exactly praised)
you can express keyword parameters this way, explicit maps are preferred because they compose
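A minimal sketch of the two styles (the `connect` functions and option keys are hypothetical, just for illustration). The point about composition is that an options map is a first-class value callers can build with `merge`:

```clojure
;; Keyword-args style:
(defn connect-kw [& {:keys [host port timeout] :or {port 5432 timeout 1000}}]
  {:host host :port port :timeout timeout})

;; Explicit-map style: options are a plain value.
(defn connect [{:keys [host port timeout] :or {port 5432 timeout 1000}}]
  {:host host :port port :timeout timeout})

(def defaults {:timeout 5000})

;; Maps compose directly:
(connect (merge defaults {:host "db.example.com"}))
;; With kwargs you'd have to unroll the map first, e.g.
;; (apply connect-kw (mapcat identity (merge defaults {:host "db.example.com"})))
```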
@bronsa I’m not saying that there isn’t a way to express it, just that I don’t see it happening very much, and that the standard library (which I use as a personal reference for what a Good Function looks like) mostly doesn’t
there used to be a doc on confluence about why maps were (generally) preferred over kw args
you might be thinking of this list of advice (although it doesn't contain that)
or something in the design wiki https://archive.clojure.org/design-wiki/display/design/Home.html
paste fail on earlier one - https://clojure.org/community/contrib_howto#_coding_guidelines
not for like a year :)
clearly you weren't missing it :)
so you were a heavy user then
the maintenance required had surpassed its usefulness :)
I used to favor "keyword arguments" (and in an ideal world I still would), but their performance is inferior as some gist showed. It had to do with apply
IIRC
So an 'options' argument isn't tremendously different (in terms of PLOP-ness), but it's more performant. And in addition to the mentioned merge-ability, IDEs can vertically align options more easily than kwargs.
For most code it won't matter. But I guess if you were concerned keeping their use to macros and using map options for functions would make sense.
> For most code it won't matter

obviously a very subjective topic :) for example at work we have essentially zero reflection warnings, other than those brought in by external deps. In other aspects we also default to performance; why would we have Ruby-like perf when we can have Java-like perf instead? Especially when it's a few keystrokes away. As mentioned in my OP I actually prefer kwargs from an aesthetic point of view, but eventually I just had to swallow the pill of performance, however bitter :)
I don't disagree, if you have performance needs. At my work we care more about scale, a few extra ms per request doesn't matter much. But if we had a hot loop making a ton of repeated calls to such fn, and we saw it was affecting our SLAs, we'd probably do some performance analysis and that could show that a switch to maps would make things faster
I also don't disagree with you ;p There's some power in having defaults that completely kill a) subjective arguments, and b) the possibility of ever having to profile a specific kind of problem. i.e. I'd rather make microinvestments today than possibly having to go on a perf hunt tomorrow
I'm curious: apart from type hinting to get rid of reflection, and using maps over kwargs, is there anything else you try to avoid ahead of time beyond these?
Off the top of my head, not much.
We try to favor (into [] (map f) ...) for avoiding intermediate collections, but honestly only if it's not too hard for the given case
The most interesting thing we do is a perf-specific pass in the code review process (https://github.com/nedap/speced.def/blob/master/.github/pull_request_template.md ). In practice we only apply it in big PRs/features, since otherwise it's too expensive to go checkbox by checkbox for smaller PRs.
By the way, though I never benchmarked it, I think for a case like (into [] (map f) ...) you can just use mapv, and I think it would be just as fast.
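For the single-stage case the two forms are indeed equivalent in result; the transducer form earns its keep once stages are combined. A small sketch:

```clojure
;; Both build a vector eagerly, with no intermediate lazy seq:
(into [] (map inc) [1 2 3])  ;=> [2 3 4]
(mapv inc [1 2 3])           ;=> [2 3 4]

;; The into/transducer form pays off when stages are fused:
(into [] (comp (map inc) (filter even?)) (range 10))
;=> [2 4 6 8 10]
```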
> I was wondering if you also tried and used transducers instead of lazy seq

relatedly, recently I was wondering: for the following defn:
(defn foo [coll]
(filter even? coll))
would the sane default be filter, filterv, or... no coll arg at all? i.e. return a transducer. Be unopinionated, letting people be lazy or eager as they please (of course, the snippet is trivial, but in practice many business-specific defns are like that)
Even better if you support both transducer and lazy arities, with the overload. In case people need true laziness for computations that don't fit in memory.
> Like when called with a coll it's lazy, when called without it returns a transducer

ah, interesting idea on mimicking clojure.core's signatures. That way one doesn't surprise people too much
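A sketch of that clojure.core-style arity overload, applied to the earlier `foo` example (here renamed `evens` for clarity): the 0-arity returns a transducer, the 1-arity stays lazy.

```clojure
(defn evens
  ([] (filter even?))            ; no coll: return a transducer
  ([coll] (filter even? coll)))  ; coll given: lazy seq, as before

(take 3 (evens (range)))      ;=> (0 2 4)  — true laziness, infinite input
(into [] (evens) (range 10))  ;=> [0 2 4 6 8]  — eager, via the transducer
```

This mirrors how `map`, `filter`, etc. are defined in clojure.core, so callers' expectations carry over.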
Higher-order functions overall add quite the cognitive load. I find humans aren't good at thinking in such higher orders. I think when designing functions that use them, you must be sure that their use is either self-contained or encapsulated behind another, simpler interface.
Someone should coin a pasta name for that kind of code... maybe Cannelloni code: code that makes use of too many higher-order functions.
There's already Spaghetti for code that's too interwoven, Lasagna for code that has too many useless layers, Ravioli for code that has too many small functions. So I think Cannelloni could work for code that gets too clever with higher-order functions 😋
"flour code" -- code that doesn't produce pasta. It only eventually becomes pasta when combined with other code later.
functions returning functions
seems a bit of a trivial case (and a design that is necessary for e.g. the middleware pattern), but I see what you mean
the usual suspect would be something like chaining juxt and comp...
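A small illustration of that kind of chaining; readable at this size, but a few more nesting levels is where the "Cannelloni" label starts to apply:

```clojure
((juxt inc dec) 5)                       ;=> [6 4]
((comp (juxt first last) sort) [3 1 2])  ;=> [1 3]
```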
I think I would appreciate a detailed/nuanced essay on the topic. I have seen articles suggesting HOFs are outright wrong. Quite obviously that's not going to cut it among functional programmers :)
In this case it's essentially a partial where one of the arguments is expensive to calculate
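A sketch of that shape, with hypothetical names (`build-index`, `lookup`): the expensive argument is computed once and closed over via `partial`:

```clojure
(defn build-index [n]
  ;; stand-in for an expensive computation
  (into {} (map (juxt identity str)) (range n)))

(defn lookup [index k]
  (get index k))

(let [index   (build-index 1000)          ; pay the cost once
      lookup* (partial lookup index)]     ; cheap fn to pass around
  (lookup* 42))                           ;=> "42"
```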
Middleware and interceptors are both uses of them that I consider "encapsulated behind an interface". Same for transducers etc.
Like they're a very well-contained use, with rules for how to use them, so they don't leak anywhere else
But I've seen code that for example, should be using a multi-method or polymorphism, but instead it uses HOF.
Mixed with some config file where you can put the fn name you want and some dynamic calls to resolve 😋
a lot of stuff written as multimethods or protocols would be better off passing a function
if I want to run two copies of some code and have them behave differently, it is easy to pass in a different function to each copy, it is harder to get that working when it relies on global state
it happens a ton where you create something that does X, business requirements shift so now you need X and Y, and the thing that does X could also do Y if it was just run with different parameters
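A minimal sketch of that point (all names hypothetical): two copies of the same machinery behave differently because each is handed a different function, with no global state involved:

```clojure
(defn make-worker [step]
  ;; same machinery; behavior injected as a function
  (fn [items] (mapv step items)))

(def worker-x (make-worker inc))       ; the original "does X" behavior
(def worker-y (make-worker #(* 2 %)))  ; new requirement Y: same code, new fn

(worker-x [1 2 3])  ;=> [2 3 4]
(worker-y [1 2 3])  ;=> [2 4 6]
```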
I find I prefer my functions to be global, have a name, a doc-string, have been tested, etc. And unlike state, they don't suffer from coordination issues.
I mean, passing in a function to another function is just creating your own custom dispatch
No, the function that takes the fn as an argument is the dispatch fn in that case. It takes an fn and chooses when to call it
I've never had clashes with protocol extensions or multi-method extensions. But I can see how it could happen
anyway, I write protocols, I write multimethods, but nothing has given me the code reuse and interoperability that passing functions does
I think you probably just have the intuition to know when HOFs are appropriate and when they would make things more complicated than they are worth.
generally I don't write them as external extension points, so they aren't like the "interface" for something, internally they do dispatch on input data to do whatever
It's definitely not a problem I had thought of before: that two things could extend the same protocol to a type for the same fn, or the same multimethod for the same data
I think my issue with using fns instead is their loose definitions. Like, if you wanted to know what all the possible fns being passed in are? Say you were reading a new code base with HOFs; that's a lot harder to figure out, and then to understand the possible branches of logic.
Even Protocols and multimethods are open systems. You won't (and can't) know all implementations of the abstraction. Pattern matching is more appropriate in places where you "know" beforehand that the possible set of impls is closed.
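A small sketch of that openness with a multimethod (hypothetical `area` example): the dispatch is extensible from anywhere, so the full set of implementations is never known up front:

```clojure
(defmulti area :shape)

(defmethod area :square [{:keys [side]}] (* side side))
(defmethod area :circle [{:keys [r]}]    (* Math/PI r r))

;; Any other namespace or library can add a case later:
(defmethod area :rect [{:keys [w h]}] (* w h))

(area {:shape :square :side 3})  ;=> 9
```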
Most of the time, when I think hard enough about it, things I thought were closed are really open, in essence.
And generally, in a given code base, people will put all their multimethod extensions in the same place
But I guess with discipline, that problem could be eliminated. Like if the passed-in fns are all defined with defns in a common place, with docs and tests.
Passing around lambda functions might be hard to debug though.