This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2019-05-31
Channels
- # aleph (1)
- # announcements (2)
- # beginners (20)
- # calva (44)
- # cider (60)
- # clj-kondo (6)
- # clojure (27)
- # clojure-dev (2)
- # clojure-europe (8)
- # clojure-italy (18)
- # clojure-mexico (5)
- # clojure-nl (61)
- # clojure-spec (12)
- # clojure-uk (101)
- # clojurescript (82)
- # cursive (2)
- # data-science (21)
- # datomic (24)
- # fulcro (19)
- # graalvm (5)
- # hoplon (11)
- # jobs-discuss (35)
- # juxt (7)
- # keechma (6)
- # off-topic (21)
- # pedestal (5)
- # planck (2)
- # qa (43)
- # re-frame (3)
- # reagent (7)
- # reitit (4)
- # rewrite-clj (12)
- # sql (10)
- # testing (4)
- # tools-deps (6)
- # vim (23)
- # xtdb (3)
Hi, is there something equivalent to GOPROXY in Maven for fetching dependencies? In our CI, whenever it tries to build a jar it fetches the dependencies from Clojars every time.
It sounds like you're using Docker without keeping your .m2 cache between builds, so persist your .m2 between builds
But to build up the .m2 it will still fetch from Clojars the first time, right?
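If the goal is a GOPROXY-style internal mirror rather than a build cache, one option is pointing tools.deps at a repository manager. A minimal sketch, assuming a hypothetical internal Nexus/Artifactory instance at repo.example.com that proxies both Maven Central and Clojars:

```clojure
;; deps.edn (sketch; repo.example.com and the repo paths are illustrative)
;; Overriding the default "central" and "clojars" entries routes all
;; artifact fetches through the internal proxy, which caches them.
{:mvn/repos
 {"central" {:url "https://repo.example.com/maven-central"}
  "clojars" {:url "https://repo.example.com/clojars"}}}
```

The proxy then fetches from Clojars once and serves every subsequent CI build from its own cache.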
Anyone got any preferences for picking things out of parsed xml these days? I like the look of the approach in https://juxt.pro/blog/posts/xpath-in-transducers.html but there's no library there or anything, just some notes - wondering if maybe there's something out there now
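For simple extractions, core's `xml-seq` can go a long way without any extra library. A minimal sketch (`texts-at` is a hypothetical helper, not from any library), which collects the plain-text content of every element with a given tag:

```clojure
(require '[clojure.xml :as xml])
(import '(java.io ByteArrayInputStream))

(defn texts-at
  "Walks parsed XML depth-first and returns the string content of
  every element whose :tag matches `tag` (nested elements are skipped)."
  [tag parsed]
  (for [el (xml-seq parsed)
        :when (= tag (:tag el))]
    (apply str (filter string? (:content el)))))

;; usage: parse a small document and pull out all <title> texts
(def parsed
  (xml/parse (ByteArrayInputStream.
              (.getBytes "<feed><title>a</title><entry><title>b</title></entry></feed>"))))
```

For XPath-like navigation with axes and predicates, a zipper over the parsed tree (clojure.zip) is the usual next step up.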
Anyone here have experience with Clj on Windows through PowerShell versus Windows Subsystem for Linux?
I want to read a long file, processing the lines, without keeping all of it in memory. Is there a nice way to ensure the resource is closed once the whole line-seq is consumed? I guess I could wrap the processing in a transducer, as they have the 1-arg "completion" arity... Thoughts? Thanks!
This might help; it's from data.csv but it might apply to whatever you are doing: https://github.com/clojure/data.csv#laziness
Thank you. The proposed solution (moving with-open up) is usable but not as nice :)
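One eager variant of that idea, sketched here (`process-lines` is a hypothetical helper, not from any library): run the whole transduce inside with-open, so the reader is closed as soon as the reduction finishes, including when the reducing step throws.

```clojure
(require '[clojure.java.io :as io])

(defn process-lines
  "Eagerly reduces over the file's lines with transducer xf and
  reducing fn rf; the reader is closed when the reduction completes
  or an exception escapes."
  [path xf rf init]
  (with-open [r (io/reader path)]
    (transduce xf rf init (line-seq r))))
```

For example, `(process-lines "big.txt" (map count) + 0)` sums line lengths without ever holding the whole file in memory.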
@U0522TWDA I've had success with things like
;; assumes (:require [clojure.data.csv :as csv]
;;                   [clojure.java.io :as jio])
;;         (:import (java.io BufferedReader))
(defn csv-maps
  "Generic, low-overhead way to turn a (massive) csv file
  into a lazily evaluated stream of maps."
  [ff & opts]
  (reify clojure.lang.IReduceInit
    (reduce [_ f init]
      (with-open [^BufferedReader r (jio/reader ff)]
        ;; the first line is the header row; zipmap turns each
        ;; subsequent data row into a map keyed by the header fields
        (let [hdr (first (apply csv/read-csv (.readLine r) opts))]
          (transduce (map (partial zipmap hdr))
                     (completing f)
                     init
                     (apply csv/read-csv r opts)))))))
you can reduce over that and you get maps of the row data
Thank you! Does it close properly when there is an exception during the processing?
yeah, the with-open should take care of that
it expands to a
(let [r (jio/reader ff)]
  (try
    ...
    (finally (.close r))))
Thanks!!
I don't think it is, I have never seen it done, but I believe that is supported behavior