
Is there a way to transform:

(with-open [in (io/reader f)]
  (vec (csv/read-csv in)))

into something reducible, like:

(into []
      (map first)
      (reducible-csv-rows f))
Library: [org.clojure/data.csv "1.0.0"]

Alex Miller (Clojure team)01:12:03

The result of read-csv is a lazy seq iirc, which is reducible

Alex Miller (Clojure team)01:12:39

The vector is also reducible of course, just not necessary


is "reducible" a reference to the reducers library, or a shorthand for something I've forgotten?


(oh I see it mentioned but not actually defined in the page about reducers


I took the reducible naming from a blog post (see lines-reducible). Reducible here means it reifies IReduceInit. reducible-csv-rows should, when reduced, open a file, yield parsed rows on demand, and close it. Something similar to next.jdbc/plan


OK, that definitely does not exist


you can reduce / transduce on a lazy-seq, but there is a small overhead that IReduceInit could hypothetically eliminate


you would end up using some internal functions, I guess


perhaps wrapping the row parsing into a transducer?


as it's currently designed, that's not abstracted and you always get a lazy-seq


but the amount of work to do the refactor vs. the performance benefit might not be worth it(?)


It's not about performance; I have to wrap everything with with-open. It's OK... but I prefer an API like lines-reducible from the blog, instead of with-open + line-seq.


right, when the data is presented as a lazy-seq, you can't make it self-cleaning like that


I had a similar struggle with a tag-soup XML parser, where I just wanted to read until I found some payload and then move on. But because the underlying implementation used a lazy-seq abstraction plus self-cleaning of resources, it ended up being a huge memory leak: it only discarded inputs and streams when you read them to the end, and all partially read streams stayed alive in a GC root owned by a cleanup thread


either with-open or non-lazy IReduceInit would have saved me a lot of headache
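
A reducible-csv-rows along those lines can be sketched by reifying IReduceInit, so the file is opened and closed inside the reduce itself. This is a hypothetical sketch (the function name and structure are modeled on lines-reducible, not part of data.csv; assumes [org.clojure/data.csv "1.0.0"] on the classpath):

```clojure
(require '[clojure.java.io :as io]
         '[clojure.data.csv :as csv])

;; Sketch: reducing opens the reader, feeds parsed rows to the
;; reducing function, and closes the reader when the reduce finishes
;; (or terminates early via reduced), because the entire reduce runs
;; inside with-open.
(defn reducible-csv-rows [f]
  (reify clojure.lang.IReduceInit
    (reduce [_ rf init]
      (with-open [in (io/reader f)]
        (reduce rf init (csv/read-csv in))))))

;; Usage: no with-open at the call site.
(into [] (map first) (reducible-csv-rows "data.csv"))
```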


Using fdef is a bit inconvenient with named arguments, because there's no easy spec for declaring a map of some keys: s/keys forces me to s/def every value again, which is pretty annoying for an fdef with named arguments. Is there a way around that?


It's like I'd need an s/keys that lets me do: (s/keys :req-un [:arg1 string? :arg2 ::some/thing])


So I can do:

(s/fdef fn
  :args (s/cat :args (s/keys :req-un [:arg1 string? :arg2 ::some/thing])))

Alex Miller (Clojure team)04:12:01

Do you mean like kwargs? Use s/keys* as a regex op for that


Hum, maybe, I'll look at it


Ok, no I don't think so, that still requires me to define spec for the keys


@didibus keys* is what you need for named arguments but you'll still need to write a spec for each of them. You want something like what's in Spec 2 I think...


...Spec 2 allows for unqualified keys with inline predicates.


(although of course Spec 2 is not ready for use yet 🙂 )


And I gather the fdef part of Spec 2 is going to get a rewrite once Rich figures out what it should look like...


Ya, inline predicate is what I need 😄, defining a whole spec just for some small arg from a function is too much


I guess I'll wait for Spec 2 😖
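
For reference, the keys* status quo being discussed: every unqualified key still needs a registered spec for its qualified counterpart (names below are made up; ::arg2 stands in for ::some/thing):

```clojure
(require '[clojure.spec.alpha :as s])

;; Each key has to be registered separately before keys* can use it.
(s/def ::arg1 string?)
(s/def ::arg2 int?)

(s/fdef my-fn
  :args (s/keys* :req-un [::arg1 ::arg2]))

(defn my-fn [& {:keys [arg1 arg2]}]
  [arg1 arg2])
```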

Ben Sless07:12:22

Hey all, I'm getting some weird behavior with core.match and debugging. I was toying around with a port of an interpreter and tried adding let bindings to it. It should work, but when I try to evaluate even a basic expression I get a "no matching clause for ..." error. Even weirder, when I instrument the definition (with CIDER's debugger) it does work. What's going on? Code for reference:

(require '[clojure.core.match :refer [match]])

(defn eval-expr
  [expr env]
  (match expr
    (x :guard number?) x
    (x :guard symbol?) (env x)
    (['zero? x] :seq) (zero? (eval-expr x env))
    (['inc x] :seq) (inc (eval-expr x env))
    (['dec x] :seq) (dec (eval-expr x env))
    (['* x y] :seq) (* (eval-expr x env) (eval-expr y env))
    (['if pred then else] :seq) (if (eval-expr pred env)
                                  (eval-expr then env)
                                  (eval-expr else env))
    (['fn [x] body] :seq) (fn [arg] (eval-expr body (fn [y] (if (= x y) arg (env y)))))
    (['let [sym expr] body] :seq) (eval-expr `((~'fn [~sym] ~body) ~expr) env)
    ([rator rand] :seq) ((eval-expr rator env) (eval-expr rand env))))

(def environment (fn [y] (throw (ex-info "oops" {}))))

 '(let [x 1]


has anyone else been having this problem?

Error building classpath. Could not transfer artifact org.clojure:spec.alpha:jar:0.2.176 from/to central (): status code: 416,
reason phrase: Range Not Satisfiable (416)
This is with the default latest Clojure on Arch Linux, installed using the package manager.


with this in ~/.clojure/deps.edn I got clojure working again

{:deps {org.clojure/clojure          {:mvn/version "1.10.1"}
        org.clojure/core.async       {:mvn/version "1.3.610"}
        org.clojure/spec.alpha       {:mvn/version "0.2.187"}
        org.clojure/core.specs.alpha {:mvn/version "0.2.44"}}}

Alex Miller (Clojure team)09:12:40

what version is that? clj -Sdescribe

Alex Miller (Clojure team)09:12:10

that error sounds like a partial/broken download of something

Alex Miller (Clojure team)09:12:22

or the maven repo being cranky

Alex Miller (Clojure team)09:12:22

probably worth doing rm -rf ~/.m2/repository/org/clojure/spec.alpha and then clj -Sforce with your original deps.edn


is it possible to make the rlwrapped clj support auto-completion?


rebel starts too slow


basically I want a clj REPL with better tab completion support but without dragging down the startup time too much.


the rlwrap config will add things to the completion dictionary if the repl prints them or you type them in, so things like clojure.repl/dir can help


there's probably some clever way to get the repl to print a bunch of things on startup just to put them in the completion dictionary for the current repl...


(ins)user=> ;(s/uni<tab> ; no completion available
(ins)user=> (doseq [n (ns-publics 'clojure.set)] (println (str "s/" (key n))))
(ins)user=> (s/uni
s/uni<tab>  s/union     
(ins)user=> ; (s/union ; now tab completes


my scenario is that, I don’t know what functions a namespace provides. I use the repl for hints.


right, in my code I didn't need to know either


it iterates the namespace and creates the completions


but readline (because of the way it is used) is limited to analyzing what the repl prints to you and what you type in, for anything smarter I think you need an integrated tool


(for your use case, instead of just iterating the namespace, it could also iterate the ns-aliases)
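
That could look something like this (a sketch; it only works because rlwrap adds whatever the REPL prints to its completion dictionary):

```clojure
;; Print alias/var pairs for every aliased namespace in the current
;; namespace, so rlwrap picks them up as completion candidates.
(doseq [[alias ns] (ns-aliases *ns*)
        [sym _]    (ns-publics ns)]
  (println (str alias "/" sym)))
```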


Another qualified keyword question 🙂 Let's say you're wrapping an API service called "foobar" that streams event messages as maps to you with keys like :time and :amount. You have a client function that creates a client for this service and takes some settings that you expose which aren't necessarily terms the service itself uses, such as :sandbox? and :creds. I'm thinking I should qualify both these setting keys as well as the keys in the event maps themselves. Do you use the same qualifier for both, i.e. :foobar/sandbox? as a setting as well as :foobar/time in the event message? Or do you somehow differentiate between terms I make up and terms taken exactly from their own names?


@i Develop a workflow where you only (re)start a REPL very rarely? 🙂 I have my REPLs running for days, sometimes weeks...


Ran into an interesting corner today while writing a custom spec conformer, I learned that when using certain macros like if-let or when-let, one of the branches cannot be the keyword literal :clojure.spec.alpha/invalid

(when-let [foo true] :clojure.spec.alpha/invalid)

Syntax error macroexpanding clojure.core/when-let at (...).
:clojure.spec.alpha/invalid - failed: any? at: [:body]

Alex Miller (Clojure team)22:12:17

yeah, that's a known issue and probably not something we're going to fix


yeah, I figured as much

Alex Miller (Clojure team)22:12:35

easy work around is to (def invalid ::s/invalid) then use that
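
i.e. something like this minimal sketch:

```clojure
(require '[clojure.spec.alpha :as s])

;; Bind the keyword to a var; the when-let macroexpansion then sees
;; the symbol `invalid`, never the literal ::s/invalid keyword.
(def invalid ::s/invalid)

(when-let [foo true] invalid)
;; => :clojure.spec.alpha/invalid
```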


for sure


not an issue that's gonna stop me from using spec! that's for sure :-)

Alex Miller (Clojure team)22:12:26

also, you probably shouldn't use custom spec conformers :)


oh? I would be interested to hear more about why not

Alex Miller (Clojure team)22:12:44

conformers are intended to be used primarily for making new spec types, not for serving as a data meat grinder to conform/coerce your way to a particular value


> data meat grinder

That's so metal! \m/ >.< \m/


when you say new spec types, do you mean the act of def'ing a new spec? like so: (s/def ::foo (s/conform #(...)))?

Alex Miller (Clojure team)22:12:53

no, I mean making new kinds of specs

Alex Miller (Clojure team)22:12:06

for example, conformers were used to make s/keys*

Alex Miller (Clojure team)22:12:04

(I'm assuming you're talking about s/conformer but maybe I misunderstood)


I'm following, a bit. Granted I've only delved so deep into spec, surely not at the level you have, so I guess what I would like to understand is in what way is that not just a layer of indirection/abstraction?


it's very rare that a user would need to use the :clojure.spec.alpha/invalid keyword, I think.


ah yes, indeed, clojure.spec.alpha/conformer not conform

Alex Miller (Clojure team)22:12:28

"conforming" is about parsing data and explaining why choices were made in the parsing

Alex Miller (Clojure team)22:12:37

not about coercing data into a particular shape

Alex Miller (Clojure team)22:12:36

you can to some degree use it for the latter, but eventually you will start to lose s/unform reversibility, automatic generation, etc

Alex Miller (Clojure team)22:12:56

so we recommend you not do that. if you want to transform data, use clojure functions to transform it
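
One way to read that advice, as a sketch with made-up specs: let spec validate the stringly-typed input, then coerce with an ordinary function afterwards, keeping the spec free of transformation:

```clojure
(require '[clojure.spec.alpha :as s])

;; Validate the raw (string) params as they arrive...
(s/def ::amount (s/and string? #(re-matches #"\d+" %)))

;; ...then coerce with a plain function, so the original value stays
;; recoverable and the spec remains generative/unformable.
(defn coerce-params [params]
  (update params :amount #(Long/parseLong %)))

(let [params {:amount "42"}]
  (when (s/valid? (s/keys :req-un [::amount]) params)
    (coerce-params params)))
;; => {:amount 42}
```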

Alex Miller (Clojure team)22:12:16

there's a reasonable chance spec 2 will not include s/conformer


gotcha, gotcha... the way I'm finding it useful, however unidiomatic it may be, is to create specs for REST API params, conform them in the handler, check whether they are valid, and then pass the data (now type-coerced) into the core functions of the application


spec has seemed to me like a tool which fits that job nicely, particularly considering that in non-production environments, returning explain strings along with a 400 can be useful when working on distributed systems with distributed ownership (and ownership of documentation :-P)

Alex Miller (Clojure team)23:12:16

if you transform during conforming, you have thrown away information about the original value

borkdude23:12:21

I've heard good things about #malli for this kind of purpose (validating and coercing in one go)


true, but in the context of validating inputs of http requests, the type is always a string and will have to change at some point for some applications


it's not a replacement for spec, they serve a different kind of niche I believe


I read that first article on malli when it dropped a couple months back, am interested, and am planning on digging in when I have to time to


I'm curious to see where each of the approaches go in the long term


btw, I have found a nice use for the conformed output and then processing it with another pattern matching library (plucking values out of the conformed data structure) here:

borkdude23:12:15

This may also be interesting to look at:




thanks for the interesting discussion guys


and the links