#clojure
2017-07-04
neeasade00:07:16

hi all -- I'm using tools.cli and it's not clear how to pass multiple arguments to an option -- eg if I want an option to become a vector of numbers

neeasade00:07:56

nevermind -- ended up taking it as a string and doing this: :parse-fn (fn [input] (into [] (map #(Integer/parseInt (str %)) input)))
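
(For reference, a minimal sketch of a complete tools.cli option along these lines, assuming the numbers arrive as one comma-separated string; the flag name and split format here are made up for the example.)

(require '[clojure.tools.cli :refer [parse-opts]]
         '[clojure.string :as str])

(def cli-options
  [["-n" "--numbers NUMBERS" "Comma-separated numbers"
    ;; turn "1,2,3" into [1 2 3]
    :parse-fn (fn [s] (mapv #(Integer/parseInt %) (str/split s #",")))]])

(parse-opts ["-n" "1,2,3"] cli-options)
;; => {:options {:numbers [1 2 3]}, ...}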

sophiago03:07:43

This is rather odd, but has anyone experienced an issue where dereferencing agents of collections truncates some of the elements?

noisesmith03:07:51

remember that send and send-off can return before their action finishes - or even before it starts sometimes

noisesmith03:07:52

user=> (let [a (agent []) f #(send a conj nil) g (fn [] (count @a))] (dotimes [_ 10000] (f)) (repeatedly 10 g))
(6622 6687 6716 6735 6757 6776 6794 6825 6851 6880)

sophiago03:07:11

@noisesmith that's got to be it. I think I just got confused because I'm creating and returning the agent from inside a macro. That's where I see the trouble. If I create a def for the return value and then deref that it waits just long enough for the last results

noisesmith03:07:30

there's the function await
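
(A sketch building on the example above: once await returns, every send dispatched from this thread so far has been applied.)

user=> (let [a (agent [])]
         (dotimes [_ 10000] (send a conj nil))
         (await a)
         (count @a))
10000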

sophiago03:07:11

Thanks, Justin! 🙂

sophiago03:07:09

Oh...actually I didn't realize await depends on the :done key. I'm using an int-map so no keywords...

noisesmith04:07:22

there's no magic :done key, await just lines up a lock that won't unlock until every pending action is done, then waits on it

(defn await
  "Blocks the current thread (indefinitely!) until all actions
  dispatched thus far, from this thread or agent, to the agent(s) have
  occurred.  Will block on failed agents.  Will never return if
  a failed agent is restarted with :clear-actions true."
  {:added "1.0"
   :static true}
  [& agents]
  (io! "await in transaction"
    (when *agent*
      (throw (new Exception "Can't await in agent action")))
    (let [latch (new java.util.concurrent.CountDownLatch (count agents))
          count-down (fn [agent] (. latch (countDown)) agent)]
      (doseq [agent agents]
        (send agent count-down))
      (. latch (await)))))

sophiago04:07:35

Oh, ok. I was confused by Clojure Docs: https://clojuredocs.org/clojure.core/await

sophiago04:07:44

I'll have to see why it's not still working

noisesmith04:07:49

one possibility is that something is calling send after you call await?

sophiago04:07:00

I would assume

sophiago04:07:36

Here's the macro:

sophiago04:07:16

I think await isn't doing anything since it can't know whether pdiff has finished

noisesmith04:07:01

it does do something - it waits on all actions that have been submitted before it is called - but you need something different if you want to wait on things pdiff might initiate after the point await is called

noisesmith04:07:17

also calling def on the output of gensym is super weird - why not a let block?

sophiago04:07:02

I'm confused as to what you mean in your second statement

noisesmith04:07:34

def always creates namespace level bindings, using def with a gensym is weird because how would you even know which thing to look for later?

noisesmith04:07:53

let is for locals, so I would expect let instead in a scenario like this

sophiago04:07:53

Oh, the gensym isn't quasiquoted

noisesmith04:07:11

that has nothing to do with it

noisesmith04:07:18

def creates a new var in your namespace

noisesmith04:07:40

oh - I see, you create the gensym def once

noisesmith04:07:04

that guarantees that pdiff-once is a race condition if it's used in two threads though

weavejester04:07:14

What’s the purpose of the macro?

noisesmith04:07:20

that's another good question

sophiago04:07:04

The purpose of the macro is to automate three calls in one: creating the agent, calling pdiff, and returning the contents of the agent

noisesmith04:07:36

if two threads use that macro, the second one will replace the data used by the first

noisesmith04:07:46

it's extremely unsafe

weavejester04:07:55

Okay, but why not:

(defn pdiff-once [poly order]
  (let [tape (agent (i/int-map))]
    (pdiff poly tape order)
    (await tape)
    @tape))

fenton04:07:00

why this:

router=> (if-let [{params :params} {}] (str "params:" params ".") "NOT-defined")
"params:."

weavejester04:07:43

@fenton because {} is not falsey.

sophiago04:07:01

weavejester: let me try that. I'm not sure it'll work, but I'm relatively new to agents.

weavejester04:07:10

(if-let [params (:params {})] ...)

fenton04:07:26

@weavejester : thanks 🙂

weavejester04:07:53

With if-let, the expression on the right needs to be nil or false to fail
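
(A quick illustration: if-let tests the whole bound value, not the destructured parts, and {} is truthy.)

user=> (if-let [{params :params} {}] :then :else)
:then
user=> (if-let [params (:params {})] :then :else)
:else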

fenton04:07:56

trying to hit else if params not defined

weavejester04:07:47

@sophiago You could also use a promise.

weavejester04:07:09

It depends what pdiff looks like, but a promise is more usual.
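
(A sketch of the promise version, assuming pdiff can be given an extra argument to deliver once it has dispatched all of its work - that extra parameter is made up for the example.)

(defn pdiff-once [poly order]
  (let [tape (agent (i/int-map))
        done (promise)]
    (pdiff poly tape order done) ; pdiff calls (deliver done true) when it's finished
    @done                        ; block until pdiff says it's done
    (await tape)                 ; then wait for its pending sends to run
    @tape))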

noisesmith04:07:18

I assumed an agent was being used because there were multiple alterations

noisesmith04:07:37

but doing that in a new thread makes using an agent problematic...

weavejester04:07:57

The name pdiff-once also suggests a memoize, but it depends what you’re trying to do.
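
(If memoization is what's wanted, a sketch reusing the body above:)

(def pdiff-once
  (memoize
   (fn [poly order]
     (let [tape (agent (i/int-map))]
       (pdiff poly tape order)
       (await tape)
       @tape))))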

sophiago04:07:58

But await is not working

weavejester04:07:24

Could you give us an idea of what pdiff is doing, @sophiago ?

sophiago04:07:32

send calls inside pdiff have not finished by the time pdiff-once returns

sophiago04:07:57

it's just associng values with send

noisesmith04:07:02

if await returns, that means those calls weren't even made before await was called

weavejester04:07:25

pdiff is parallel diff I assume.

noisesmith04:07:27

you need some other way to know pdiff is done

sophiago04:07:48

This is exactly what I said previously...

noisesmith04:07:30

you said "await isn't working" which I might have misinterpreted

sophiago04:07:55

I said, "I think await isn't doing anything since it can't know whether pdiff has finished"

weavejester04:07:58

Would it be possible to post the definition of pdiff as well?

sophiago04:07:15

It's not going to make a difference

noisesmith04:07:59

it could, if there was a way to ensure you don't return from pdiff until it makes all its send calls, for example

weavejester04:07:11

I’m not clear on what pdiff is doing, or why you’re using an agent. I assume it’s doing something in parallel.

jgeraert07:07:54

@sophiago do you get your functionality working without all the parallel stuff? Just plain idiomatic single-threaded clojure

erwinrooijakkers08:07:32

Does anyone know how to obtain the Set-Cookie header from http-kit, when you pass the Cookie header in the call as well? E.g.,

(defn visit-url [{:keys [cookies url] :as context}]
  (let [result-chan  (chan)
        check-result (fn [{:keys [status] :as response}]
                       ;; TODO: get new cookies here....:/ not visible in response
                       (log/error "RESPONSE" response) ; => no `Set-Cookie`
                       (go (>! result-chan (= status 200))))]
    (http/get url
              {:headers          {"Accept" "text/html"
                                  "Cookie" cookies} ; cookies is string of earlier obtained cookies
               :follow-redirects false}
               check-result)
    result-chan))

elise_huard13:07:18

question: how to find a dependency a leiningen plugin pulls in at runtime? I've run lein deps :tree on the most obvious dependencies, and I don't see where a particular version of ring is coming from. I've sprinkled :exclusions all round. Yet I find the dependency in target/stale/leiningen.core.classpath.extract-native-dependencies ...

elise_huard13:07:39

(context: trying to upgrade gorilla repl to 1.9, but it barfs on an old version of ring)

elise_huard13:07:15

happens when I do lein gorilla :port 9000, yet neither gorilla-repl nor lein-gorilla seem to need it

pesterhazy13:07:10

have you tried deleting target?

elise_huard13:07:10

I've deleted all the content yes

pesterhazy13:07:56

lein deps :tree should work, unless gorilla-repl does something funky

pesterhazy13:07:22

you can issue a global exclusion for ring

elise_huard13:07:08

hm. the problem being that lein-gorilla does need ring, just not that old one ...

pesterhazy13:07:18

that's ok, you can specify the latest version

pesterhazy13:07:36

exclusion just means "ignore whatever transitive dependencies there are relating to this"
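
(A sketch of what that looks like in project.clj - the project name and versions here are illustrative, not taken from the thread:)

(defproject my-app "0.1.0"
  ;; drop ring wherever a transitive dependency drags it in...
  :exclusions [ring]
  ;; ...then depend on the version you actually want
  :dependencies [[org.clojure/clojure "1.9.0-alpha17"]
                 [ring "1.6.1"]]
  :plugins [[lein-gorilla "0.4.0"]])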

elise_huard13:07:00

thank you, will try that 🙂

noisesmith13:07:04

to look at plugin deps there's a separate plugin tree command lein deps :plugin-tree

elise_huard13:07:43

@noisesmith nice one, that should help

a1313:07:23

lein ancient
can help with outdated deps (it's a separate plugin though)

elise_huard13:07:17

well, it looks like it's something funky alright

elise_huard13:07:45

(as in it still doesn't show up)

micahasmith13:07:09

question-- does anyone have a homoiconic datetime solution/approach that they use in their projs?

mpenet13:07:09

I like my dates served as longs, nothing else (I rarely have to care about TZ)

pesterhazy13:07:42

I use only java.util.Date and js/Date

micahasmith13:07:58

just everything UTC -> epoch @mpenet ?

pesterhazy13:07:00

with the occasional goog.time and clj-time for formatting

mpenet13:07:21

date as number works well as long as you don't need to be super precise with "date math"
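
(e.g., the trivial version of that:)

(let [now (System/currentTimeMillis)]   ; epoch millis as a long
  {:created-at now
   :as-date    (java.util.Date. now)})  ; convert back only when needed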

mpenet13:07:59

otherwise java 8 api seems decent

micahasmith13:07:39

not a fan of #object[java.time.LocalDate ...]

micahasmith13:07:14

seems strange those don’t get read

noisesmith13:07:33

that would require clojure to require java 8 right?

noisesmith13:07:41

I guess it could do that conditionally?

micahasmith13:07:55

would be interesting to have those dump into a constructor/factory format like (java.time.LocalDate. & args)

noisesmith13:07:57

the more likely option would be what clojure does today with java.util.Date - using a literal representation that the reader accepts

micahasmith13:07:28

i did not know that

noisesmith13:07:26

user=> (java.util.Date.)
#inst "2017-07-04T13:25:33.376-00:00"
user=> #inst "1492-01-10T12:11:44.000-00:00"
#inst "1492-01-10T12:11:44.000-00:00"

noisesmith13:07:10

Date is a less than great API, but clojure makes it readable

noisesmith13:07:23

no, it's how Date objects are printed, it's the instant reader

elise_huard13:07:32

I confirm the fishiness, was pulling a dependency in the code facepalm

elise_huard13:07:40

so that was fun

micahasmith13:07:34

this is cool. didnt know about #inst and #uuid and stuff

plins14:07:41

hello everyone, i have 4 heavy database queries that are running sequentially

(benefit-db/transition-benefits-to-ongoing db/db-spec)
(benefit-db/transition-benefits-to-consumed db/db-spec)
(benefit-db/transition-benefits-to-ended db/db-spec)
is there an easy way to run these in parallel and do something else when they all finish?

erwinrooijakkers15:07:15

plins: (let [results (mapv deref [(future trans1) (future trans2) (future trans3)])] (do-something-else results))
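
(Spelled out with the functions from the question - a sketch, error handling omitted, and do-something-else is a placeholder:)

(let [futs    (mapv #(future (% db/db-spec))
                    [benefit-db/transition-benefits-to-ongoing
                     benefit-db/transition-benefits-to-consumed
                     benefit-db/transition-benefits-to-ended])
      results (mapv deref futs)]  ; blocks until all three have finished
  (do-something-else results))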

triss15:07:25

hey all… for a number of strange reasons I want to create an instance of a class whose name I only have as a string. How would I instantiate one?

mpenet15:07:10

either via a macro or reflection possibly

triss15:07:05

cheers @mpenet. I just found Class/forName which does what I’m after

mpenet15:07:16

(.newInstance (Class/forName "foo.bar.baz"))

mpenet15:07:21

yes 🙂

mccraigmccraig15:07:21

tho (.newInstance (Class/forName "foo.bar.baz")) only seems to have a 0-args sig

mccraigmccraig15:07:19

ah, you have to use .getDeclaredConstructor on the Class object to get the constructor method and then call that
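
(A sketch of that, using java.lang.StringBuilder purely as a stand-in class with a one-arg constructor:)

(let [ctor (.getDeclaredConstructor (Class/forName "java.lang.StringBuilder")
                                    (into-array Class [String]))]
  (.newInstance ctor (object-array ["hello"])))
;; => #object[java.lang.StringBuilder 0x... "hello"]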

lspector17:07:45

Do multiple threads just reading from (derefing) an atom that has a constant value block or cause any kind of contention?

lspector17:07:39

Awesome -- so should be the same as just reading a var?

lspector17:07:13

in terms of multithread contention

hiredman17:07:57

that is kind of complicated, they are different things. my intuition would be that vars would be ever so slightly faster before the jit has kicked in, and an atom would be ever so slightly faster after

hiredman17:07:38

but they are different things, not drop in replacements for each other

lspector17:07:10

understood. the story is that we have a system with config data in atoms, which are constant after configuration, and our multicore scaling is bad. the question came up of whether billions of accesses to constant-valued atoms might be causing contention. the alternative we're considering isn't actually to use a var (actually, we'd get rid of the vars that currently hold the atoms), but just to pass all of the config data as an argument throughout the system. it'll be a relatively big job to try this, so i'm trying to figure out if the underlying theory is even true, that reading (derefing) the atom in a var can cause contention among threads.

noisesmith17:07:41

get on an AtomicReference is the same overhead as reading a volatile, which sources claim is cheap
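
(For context: a Clojure atom holds its state in a java.util.concurrent.atomic.AtomicReference, and deref boils down to its get - roughly this, in interop terms:)

(def r (java.util.concurrent.atomic.AtomicReference. 42))
(.get r) ;; => 42, comparable in cost to a volatile read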

noisesmith17:07:49

this should also be easy to microbenchmark though

hiredman17:07:23

you should actually hook something like visualvm and look at the hot methods

hiredman17:07:29

don't just guess

hiredman17:07:41

if your configs are in global atoms, then you are dereferencing both the var containing the atom and the atom

noisesmith17:07:43

that's a good point, based on profiling I've done I'd expect the var lookup to be more expensive than the atom deref

hiredman17:07:21

I mean for the config thing

hiredman17:07:02

for vars vs. atoms, without the jit, my guess is the atoms still have to traverse a few extra pointers for access, but with the jit I expect it is a regular reference that just uses atomic instructions

hiredman17:07:22

(I don't know, I haven't profiled it or looked at the code the jit produces)

hiredman17:07:58

but I would expect just about anything else to dominate those differences

dm317:07:46

there should also be some improvement with direct linking on

hiredman17:07:00

direct linking is only for functions

noisesmith17:07:12

fwiw the differences are pretty small once the JIT has kicked in

user=> (let [a (atom 42)] (reduce (fn [_ ft] @ft) (repeatedly 10000 #(future @a))))
42
user=> (let [a (atom 42)] (crit/bench (reduce (fn [_ ft] @ft) (repeatedly 10000 #(future @a)))))
Evaluation count : 900 in 60 samples of 15 calls.
             Execution time mean : 91.621587 ms
    Execution time std-deviation : 11.931053 ms
   Execution time lower quantile : 57.841222 ms ( 2.5%)
   Execution time upper quantile : 106.129733 ms (97.5%)
                   Overhead used : 1.534327 ns

Found 4 outliers in 60 samples (6.6667 %)
        low-severe       4 (6.6667 %)
 Variance from outliers : 80.6424 % Variance is severely inflated by outliers
nil
user=> (def a (atom 42))
#'user/a
user=> (crit/bench (reduce (fn [_ ft] @ft) (repeatedly 10000 #(future @a))))
Evaluation count : 600 in 60 samples of 10 calls.
             Execution time mean : 97.601813 ms
    Execution time std-deviation : 11.193445 ms
   Execution time lower quantile : 73.574920 ms ( 2.5%)
   Execution time upper quantile : 116.281827 ms (97.5%)
                   Overhead used : 1.534327 ns

Found 1 outliers in 60 samples (1.6667 %)
        low-severe       1 (1.6667 %)
 Variance from outliers : 75.5029 % Variance is severely inflated by outliers
nil
user=> (crit/bench (reduce (fn [_ ft] @ft) (repeatedly 10000 #(future *clojure-version*))))
Evaluation count : 780 in 60 samples of 13 calls.
             Execution time mean : 99.271467 ms
    Execution time std-deviation : 8.236914 ms
   Execution time lower quantile : 78.021114 ms ( 2.5%)
   Execution time upper quantile : 111.492377 ms (97.5%)
                   Overhead used : 1.534327 ns

Found 4 outliers in 60 samples (6.6667 %)
        low-severe       1 (1.6667 %)
        low-mild         3 (5.0000 %)
 Variance from outliers : 61.8161 % Variance is severely inflated by outliers
nil

noisesmith17:07:51

that's comparing - direct access to atom and deref, access of atom in var and deref, and access to a value in a var

noisesmith17:07:28

the futures are because there was concern about contention

hiredman17:07:06

obviously what you should do is preprocess your config in to a primitive array, and generate macros for accessing each config value that expand in to agets of the right array slot

lspector18:07:07

all very interesting! point well taken about checking data before optimizing, and we have a lot of work to do on that front

lspector18:07:10

it seems from the discussion here this use of atoms in vars probably isn't producing contention, right?

noisesmith18:07:06

right- no evidence of contention there at all

lspector18:07:15

fwiw the alternative we're considering would be function args, rather than vars, with the config passed everywhere

lspector18:07:27

awesome, thx!!

noisesmith18:07:13

I realize I didn't benchmark just having a data literal - checking that now to see if there's any difference worth noticing

noisesmith18:07:31

but really profiling your own code and looking for the actual perf bottlenecks is the way to go

noisesmith18:07:25

yeah - replacing @a with a literal 42 (but still reducing over futures etc.) ends up taking the same time (actually slightly longer, but within the measurement epsilon, so not meaningful)

lspector18:07:33

great to know -- thanks!

hmaurer20:07:31

Hello! I am using Pedestal + Lacinia Pedestal to develop a GraphQL server, but I am not quite sure how to set up hot-reloading. Could anyone point me in the right direction?

hmaurer20:07:45

I would like to hot-reload the GraphQL schema and the resolvers