#clojure
2016-07-29
danielcompton04:07:28

Is there a public alternative to binding-conveyor-fn? I’m wanting to capture all the bindings into another thread, in the same way future does, without using future (because I want to control the thread it runs on)

hiredman05:07:33

bound-fn will create a function that will have the bindings from the scope the function is created in in scope when the function is invoked
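A minimal sketch of what that means in practice (hypothetical var `*x*` and function `f`, just for illustration) — `bound-fn` snapshots the bindings in scope where it is created and re-establishes them whenever the function runs, on any thread:

```clojure
(def ^:dynamic *x* :default)

(def f
  (binding [*x* :captured]
    ;; snapshot the bindings in scope right now
    (bound-fn [] *x*)))

(f)           ;; => :captured, even outside the binding scope
@(future (f)) ;; => :captured on another thread as well
```

For wrapping an already-existing function, `bound-fn*` does the same thing without the macro.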

urbanslug09:07:23

Yo dudes, why do I need do? As far as I can see, even side-effecty functions will run in order

urbanslug09:07:49

(def func []
  (first-func)
  (second-func))

urbanslug09:07:36

Compare with

(defn func []
  (do
     (first-func)
     (second-func)))

urbanslug09:07:53

Screw slack markdown btw

urbanslug09:07:01

Shift+return should send instead

urbanslug09:07:12

Like twitter

urbanslug09:07:38

Oh I see the answer from the docs. Because I’m doing it in a function ";; fn (`defn` by extension) and let have an implicit do"

pesterhazy09:07:40

there's a mistake in your snippets: it should be defn, not def

pesterhazy09:07:51

@urbanslug: most language constructs and macros in clojure have implicit do, the main exception being if
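To make that exception concrete: `if` takes exactly one expression per branch, so grouping side effects there needs an explicit `do`, while `when`, `let`, and `fn`/`defn` bodies do not. A small sketch:

```clojure
;; implicit do: every expression runs, the last one is the value
(when true
  (println "logging...")
  :result)

;; if takes one form per branch, so side effects need an explicit do
(if true
  (do (println "then branch")
      :then)
  :else)
```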

joakimmohn12:07:48

What would a forward slash as the first element in a form mean, kinda like this: (def thumb-size 150) (/ thumb-size (blah blah) (blah blah))

yrgl12:07:37

it is simple division

yrgl12:07:31

something like this (/ 1 2 3)
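In other words, `/` in operator position is just the division function, called prefix-style and variadic:

```clojure
(/ 150 2) ;; => 75
(/ 1 2 3) ;; => 1/6, i.e. ((1 / 2) / 3), returned as an exact ratio
(/ 2)     ;; => 1/2, a single argument yields the reciprocal
```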

cyppan12:07:41

Hi guys, I'm stuck reading big files. The files are about 2 to 5 GB and my script always breaks with `OutOfMemoryError: Requested array size exceeds VM limit` at `java.util.Arrays.copyOf (Arrays.java:3332)`. I read each line, parse and map it, and push it synchronously into a core.async channel. Any idea what's happening?

(with-open [rdr (io/reader file)]
      (doseq [line (line-seq rdr)
              :let [event (try
                            (-> (parse-string line true)
                                (dissoc :page :referrer :ua :geo)
                                (assoc :collection (keyword collection))
                                (update :timestamp long))
                            (catch JsonParseException _ nil)
                            (catch Exception ex (error ex) nil))]
              :when event]
        (try
          (when-let [enriched (enrich-event event)]
            (a/>!! input-channel enriched))
          (catch Exception ex (error ex)))))

gfredericks12:07:36

@cyppan: does the channel have an unbounded buffer?

cyppan12:07:20

no I want to control the ingestion (def input-channel (a/chan (a/buffer 1) (partition-all 500)))

dignati12:07:30

@cyppan: Maybe the doseq holds on to the head of the line-seq?

cyppan12:07:51

the whole code is there...

cyppan12:07:14

probably but I can't see where

dignati12:07:26

Ah no, it doesn't. Just thinking out loud here

gfredericks12:07:01

what's the stack trace?

cyppan12:07:16

events are flushed to a database and should regulate the file ingestion speed

cyppan12:07:17

a kind of back pressure

mpenet12:07:04

the async/chan is then a bit useless, no? (if it's sized it'll just allow an initial burst to fill to capacity, then go at the same pace as the db IO). Assuming your db calls are blocking and you don't parallelise them, you could just do it in the doseq

mpenet12:07:25

but anyway, not sure it's the cause of the issue

cyppan12:07:38

yes good point

gfredericks12:07:40

cyppan: based on the stack trace I'm wondering if your file is accidentally one giant line?

gfredericks12:07:02

it looks like line-seq is failing to return

cyppan12:07:17

gonna check that

cyppan13:07:17

@gfredericks I have some quite big lines indeed (5000 chars at least)

gfredericks13:07:09

5000 isn't likely larger than the maximum array size

cyppan13:07:31

using awk (trying...) to find it out
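One way to do that with awk (the file name `events.json` here is a hypothetical stand-in for the actual file):

```shell
# print the length of the longest line in the file
awk '{ if (length($0) > max) max = length($0) } END { print max }' events.json
```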

cyppan13:07:14

thanks for helping, by the way; at least I have a lead!

gfredericks14:07:55

you could truncate the file to a few megabytes and confirm that it does what you'd expect in that case

michaeldrogalis16:07:08

Can one instruct Leiningen to only use one JVM across all :prep-task tasks? I think it's booting up a new one each time.

gfredericks16:07:13

does anybody use clj-refactor and know how to keep it from analyzing every *.clj file it can find?

gfredericks16:07:29

I'd be happy with being able to specify subdirectories (`["src" "test"]`)

didibus17:07:24

I was under the impression you could call a function in a different namespace by using the fully qualified name, without the need to require it. But it doesn't seem to work? Must you always require everything?

arohner17:07:26

didibus: that only works if another namespace has already required it. Also, generally not recommended
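A sketch of the difference, assuming a fresh REPL where nothing else has loaded `clojure.set` yet:

```clojure
;; before any require, the namespace may simply not be loaded:
;; (clojure.set/union #{1} #{2})  ; can throw ClassNotFoundException

(require 'clojure.set)        ;; loads the code and interns the vars
(clojure.set/union #{1} #{2}) ;; => #{1 2}, no alias or :refer needed
```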

juhoteperi17:07:01

@michaeldrogalis: (most) lein plugins have their own deps so they need separate classpaths and lein solves this by launching separate jvms

juhoteperi17:07:46

Or if the plugin needs to run with project classpath it needs separate jvm

michaeldrogalis17:07:14

@juhoteperi: Makes sense. I think I was in the weeds, anyway.

didibus17:07:48

@arohner I see, so the vars need to be interned, and I guess there's a global intern map where the keys are fully qualified. Is there another way to quickly intern and call at the same time? Mostly for the REPL use case, where I don't want to bother doing a full-on require?

arohner17:07:38

@didibus: the code has to be loaded before you can call it

didibus17:07:06

@arohner: Oh, I see what you mean. If it's loaded it would work without a require?

didibus17:07:38

@arohner: How is the code loaded when running without a repl? Is it all loaded at once based on the class path ? Or does it load it as it encounters them through a require?

arohner17:07:03

loaded through require

arohner17:07:09

(or very rarely, load)

didibus17:07:25

@arohner: ok thanks, I understand now why it's a bad idea. REPL could actually trick you into thinking it works, which is probably what happened to me and where I had the vague memory this was possible.

ccann18:07:59

this is cool:

(let [bisect-by (juxt filter remove)]
  (bisect-by odd? [1 2 3 4 5]))
[(1 3 5) (2 4)]

shader18:07:26

what's the right way to join the results of a `for` into a single vector? Basically, flatten one layer

shader18:07:08

[[a (p :a)] [b (p :b)]] -> [a (p :a) b (p :b)]

spei18:07:05

#(apply concat %)
I think that should work

jr18:07:43

consider using mapcat as well

shader18:07:12

are there versions that return vectors or preserve collection type? Or is the best solution (apply vector ...)?

jr18:07:40

vec will transform a seq into vector

jr18:07:14

mapv will accumulate a vector but not flatten
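Putting those together for shader's case (simplified to plain keyword/value pairs rather than the original `(p :a)` lookups):

```clojure
(vec (apply concat [[:a 1] [:b 2]]))    ;; => [:a 1 :b 2]
(vec (mapcat identity [[:a 1] [:b 2]])) ;; same result via mapcat
(mapv inc [1 2 3])                      ;; => [2 3 4], a vector but not flattened
```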

mpenet19:07:06

You can abuse into + a mapcat transducer

mpenet19:07:51

(into [] (mapcat your-fn) coll)

zane19:07:49

Seems like more use than abuse. 😄

lvh19:07:33

Did anything get added in 1.9.0 to make :keys destructuring of maps with ns’d keys easier?

lvh19:07:35

Looks like ::k works, but that may not be new

anmonteiro19:07:37

@lvh:

boot.user=> (let [{:a/keys [b c]} {:a/b 1 :a/c 2}] b)
1

lvh19:07:44

ah, gotcha — I’m looking for auto-nsd keys though

lvh19:07:49

that looks like ::keys might work

lvh19:07:14

(let [{::keys [k]} {::k 1}] k) ;; => 1

borkdude21:07:10

never mind, I misread nano/micro-seconds

wei23:07:49

I’m trying to walk a nested structure and add a sequential index to all the maps. any ideas / strategies? e.g. [[{} {}] [{} {} [{}]]] => [[{:id 1} {:id 2}] [{:id 3} {:id 4} [{:id 5}]]]

lvh23:07:02

wei: walk with a side effecty counter, or a zipper could both do that

lvh23:07:10

do you care in particular about which one gets which id?

lvh23:07:25

sorry, I just noticed that you have vecs

lvh23:07:30

yeah, walk can do that.

lvh23:07:43

you could do it purely functionally too but it’ll probably be uglier.
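One possible sketch of the side-effecty-counter approach with `clojure.walk/postwalk` (the helper name `index-maps` is made up here; which map gets which id relies on postwalk's deterministic depth-first, left-to-right order):

```clojure
(require '[clojure.walk :as walk])

(defn index-maps [data]
  (let [counter (atom 0)]
    (walk/postwalk
      (fn [x]
        (if (map? x)
          (assoc x :id (swap! counter inc)) ;; tag each map as it is visited
          x))
      data)))

(index-maps [[{} {}] [{} {} [{}]]])
;; => [[{:id 1} {:id 2}] [{:id 3} {:id 4} [{:id 5}]]]
```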