#clojure
2017-09-09
wei03:09:41

anyone have an easy way to convert an org.json.JSONObject to a map?

wei04:09:09

my solution:

(defn jsonobject->edn [jsobj]
  (cheshire.core/parse-string (.toString jsobj) false))
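
A hedged alternative sketch that skips the string round-trip, assuming a newer org.json that provides JSONObject.toMap (it yields nested java.util Maps/Lists), walked into Clojure data with clojure.walk/prewalk; keys stay strings, matching the parse-string ... false above:

(require '[clojure.walk :as walk])

(defn json-object->clj [^org.json.JSONObject jsobj]
  ;; .toMap returns nested java.util.Map / java.util.List values;
  ;; prewalk converts each level into Clojure maps and vectors.
  (walk/prewalk
    (fn [x]
      (cond
        (instance? java.util.Map x)  (into {} x)
        (instance? java.util.List x) (vec x)
        :else x))
    (.toMap jsobj)))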

seancorfield04:09:40

That's not really converting to EDN (which is a text format) -- it's converting to "Clojure" data structures, so maybe json-object->clj?

seancorfield04:09:57

Or perhaps read-json-object?

seancorfield04:09:31

(which is more in keeping with clojure.edn/read-string for example)

wei04:09:56

true! thanks for the feedback, I renamed it to your suggestion in my code

tagore04:09:33

Names matter.

hmaurer10:09:53

And naming is hard… I tend to spend more time figuring out a name for a function than coding the function itself 😄

sundarj16:09:57

@hmaurer hammock-driven development!

tagore04:09:03

I've been going over a bit of older code for the company I work for and...

tagore04:09:41

The same names are re-used for different levels of converting data.

tagore04:09:46

It's confusing.

didibus18:09:13

Names matter, but you'll always get them wrong. So it's good to practice reading things as if names are just random unique ids, instead of documentation.

mbjarland19:09:41

is it ok/idiomatic to use :keys destructuring in protocol method definitions as in:

(defprotocol IMyP
  (method [this & {:keys [foo bar] :or {foo false bar true}}]))

tbaldridge19:09:39

Have you tried that?

tbaldridge19:09:27

@mbjarland I don't think that code does what you think.

mbjarland08:09:26

@U07TDTQNL Tried it yes, but as you say, it does not seem to do what I would expect. So assuming I want this kind of optional parameter, am I forced to just add a ton of arities to the protocol method declarations?
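
A common workaround sketch, since defprotocol signatures support neither varargs nor destructuring: give the protocol method a single fixed-arity options map and keep the keyword-argument sugar in a plain wrapper function (the names method*/method here are illustrative):

(defprotocol IMyP
  ;; fixed arity: implementations receive a plain options map
  (method* [this opts]))

(defn method
  "Keyword-argument front end with defaults; delegates to the protocol."
  [this & {:as opts}]
  (method* this (merge {:foo false :bar true} opts)))

;; usage: (method some-impl :foo 42) calls (method* some-impl {:foo 42 :bar true})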

oskarkv21:09:12

Is the result of an eduction different from calling seq on that eduction other than the fact that the applications will be performed every time reduce/iterator is called? I'm trying to figure out if it's safe to treat an eduction as a sequence or if it will bite me later somehow.

oskarkv21:09:10

And what is the point of returning an eduction instead of a lazy seq? I don't really get it. 😛
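
A small sketch of the difference, using a counter to show when the transformation actually runs:

(def calls (atom 0))

(def ed (eduction (map (fn [x] (swap! calls inc) (inc x))) (range 5)))

(reduce + 0 ed)  ;=> 15
(reduce + 0 ed)  ;=> 15
@calls           ;=> 10 -- the work was redone on the second reduce

;; a lazy seq built with sequence caches once realized:
(def s (sequence (map (fn [x] (swap! calls inc) (inc x))) (range 5)))
(doall s)
(doall s)
@calls           ;=> 15 -- only 5 more in total; the second doall hit the cache

So the point of returning an eduction is precisely that it does not cache: it is a recipe you can reduce or iterate cheaply, possibly more than once, and you only pay for caching when you explicitly ask for a seq of it.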

hmaurer21:09:41

Should I be concerned when composing a large number of functions (potentially hundreds)? In a very much non-performance-critical setting

seancorfield23:09:34

@hmaurer If it isn't performance critical, why would you be concerned? And even if it was, try it and see whether it's fast enough.

seancorfield23:09:59

What's the problem domain for which the solution seems to be massive composition?

hmaurer23:09:23

@seancorfield call stack depth, etc? I am not familiar with those things on the JVM

noisesmith23:09:49

it's configurable, but you can compose functions without increasing the call stack
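
For reference, the "configurable" part is the JVM thread stack size (-Xss); e.g., assuming a Leiningen project (project name hypothetical):

;; project.clj
(defproject my-app "0.1.0-SNAPSHOT"
  :jvm-opts ["-Xss8m"])  ; raise the per-thread stack size above the JVM default

;; or when launching the JVM directly:
;;   java -Xss8m -cp <classpath> clojure.main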

hmaurer23:09:02

Basically I am generating a schema and found it nice to express as a composition of functions each doing a little bit of work

noisesmith23:09:02

(eg. comp or trampoline)
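
The trampoline half of that is the classic mutual-recursion pattern: functions return thunks instead of calling each other directly, and trampoline loops over them in a single stack frame. A minimal sketch:

(declare my-odd?)

(defn my-even? [n]
  (if (zero? n) true  #(my-odd? (dec n))))

(defn my-odd? [n]
  (if (zero? n) false #(my-even? (dec n))))

(trampoline my-even? 1000000)  ;=> true, with no StackOverflowError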

hmaurer23:09:19

I could use something like Pedestal for a flat interceptor chain though…

noisesmith23:09:40

that is yet another way to compose functions without making the call stack deeper, yeah

hmaurer23:09:40

good to know, thanks @noisesmith !

Lambda/Sierra23:09:55

@hmaurer If you comp thousands of functions, you will eventually reach the maximum allowed depth for a Java call stack.

Lambda/Sierra23:09:17

You could, however, write an "interpreter" that processes a list of functions without increasing the call stack.

noisesmith23:09:25

oh, right, comp does actually make the stack go deeper, right

hmaurer23:09:50

@stuartsierra thanks! however I am not quite sure I understand why comp (given that code) makes the call stack deeper; would you mind explaining?

hmaurer23:09:16

beyond the obvious one-level deeper by calling comp

noisesmith23:09:18

each call is wrapped in the ones before it in the arg list

tagore23:09:48

It kind of has to, to do what it does (I mean, you could theoretically do that sort of thing on the heap, but it's hard to see why you'd need to.)

hmaurer23:09:57

oooh, because of reduce? So it does not make it deeper for the <= 2 args case, but does make it deeper for > 2?

tagore23:09:01

Why are you composing thousands of functions?

noisesmith23:09:10

but you could definitely write a "trampolining comp" version that didn't do that (trading worse performance for not getting stack overflows for huge arg lists)

noisesmith23:09:28

@hmaurer even with two args, that's two stack frames

tagore23:09:57

I'm not saying there could never be a good reason to do that, but...

tagore23:09:11

I have yet to run into this problem.

hmaurer23:09:23

if there was an arity-3 case, like this:

([f g h] (fn [x] (f (g (h x)))))
then it would also only generate two stackframes, right @noisesmith ?

hmaurer23:09:38

but obviously you wouldn’t want to write this for every arity, thus the trampoline

noisesmith23:09:09

that would be 3 - one for f, g, h each

hmaurer23:09:13

But (h x) will be called and will return, then g will be called and return, then f will be called and return, right? that's only one level at each point

noisesmith23:09:42

when h is called, that's inside a call to g, which is inside a call to f

hmaurer23:09:21

@noisesmith how is it inside the call to g? the result of h is passed as an argument

hmaurer23:09:40

@noisesmith is argument evaluation lazy in clojure?

noisesmith23:09:58

you can see by throwing an exception, one moment, generating an example in my repl

hmaurer23:09:14

boot.user=> (defn foo [x] (println "-> foo"))
#'boot.user/foo
boot.user=> (foo (println "abc"))
abc
-> foo
nil

hmaurer23:09:31

is that what you meant? here the argument seems to be evaluated before the call

hmaurer23:09:09

or are they evaluated “just as you get into the foo stackframe”?

noisesmith23:09:15

you are right about (f (g (h x))) not needing to go three deep, my bad

noisesmith23:09:07

but comp still makes things go deeper because of the way it handles varargs with self calls that generate new functions (I think? or was stuart wrong?)

hmaurer23:09:07

@noisesmith 🎉 what code did you run to check that?

noisesmith23:09:30

+user=> (let [f inc g inc h inc i inc j inc k inc l inc m /] (f (g (h (i (j (k (l (m 1 0)))))))))
ArithmeticException Divide by zero  clojure.lang.Numbers.divide (Numbers.java:158)
+user=> (pst)
ArithmeticException Divide by zero
        clojure.lang.Numbers.divide (Numbers.java:158)
        clojure.core// (core.clj:1019)
        clojure.core// (core.clj:1012)
        user/eval21 (NO_SOURCE_FILE:9)
        user/eval21 (NO_SOURCE_FILE:9)
        clojure.lang.Compiler.eval (Compiler.java:6978)
        clojure.lang.Compiler.eval (Compiler.java:6941)
        clojure.core/eval (core.clj:3187)
        clojure.core/eval (core.clj:3183)
        clojure.main/repl/read-eval-print--9945/fn--9948 (main.clj:242)
        clojure.main/repl/read-eval-print--9945 (main.clj:242)
        clojure.main/repl/fn--9954 (main.clj:260)
nil

hmaurer23:09:37

@noisesmith yes I think that’s right; comp uses reduce after 2 args so that will generate extra stackframes, right?

noisesmith23:09:53

I think so - checking now

hmaurer23:09:16

oh wait, actually it might not…

noisesmith23:09:05

+user=> (let [f inc g inc h inc i inc j inc k inc l inc m /] ((comp f g h i j k l m) 1 0))
ArithmeticException Divide by zero  clojure.lang.Numbers.divide (Numbers.java:158)
+user=> (pst)
ArithmeticException Divide by zero
        clojure.lang.Numbers.divide (Numbers.java:158)
        clojure.core// (core.clj:1019)
        clojure.core// (core.clj:1012)
        clojure.core/comp/fn--6808 (core.clj:2543)
        user/eval30 (NO_SOURCE_FILE:13)
        user/eval30 (NO_SOURCE_FILE:13)
        clojure.lang.Compiler.eval (Compiler.java:6978)
        clojure.lang.Compiler.eval (Compiler.java:6941)
        clojure.core/eval (core.clj:3187)
        clojure.core/eval (core.clj:3183)
        clojure.main/repl/read-eval-print--9945/fn--9948 (main.clj:242)
        clojure.main/repl/read-eval-print--9945 (main.clj:242)
nil

noisesmith23:09:39

now I'm back to my original thought that comp avoids creating more stack frames, but I would be surprised if @stuartsierra was wrong about this

hmaurer23:09:53

@noisesmith

boot.user=> ((apply comp (map (fn [_] inc) (range 10000))) 1)
10001
boot.user=> ((apply comp (map (fn [_] inc) (range 100000))) 1)
java.lang.StackOverflowError:

noisesmith23:09:01

btw you can just use (repeat 100000 inc) for that, but yeah, that definitely does stack overflow
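
One way to reconcile the two traces above: comp's variadic arity builds its result by reducing comp pairwise, so each extra function adds one wrapping closure. The last argument runs first, near the top of the stack, which is why the divide-by-zero trace shows only a single comp frame; the first argument runs last, beneath every wrapper, so the depth grows with the number of composed functions and eventually overflows. A hypothetical depth-demo that makes this visible:

(defn depth-demo [n]
  ;; the first arg to comp is applied last, under all the wrappers,
  ;; so counting stack frames there approximates the chain's depth
  (let [count-frames (fn [_] (count (.getStackTrace (Thread/currentThread))))
        f (apply comp (cons count-frames (repeat n inc)))]
    (f 0)))

(depth-demo 10)    ;=> a small baseline
(depth-demo 5000)  ;=> grows roughly linearly with n -- thousands more frames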

hmaurer23:09:23

@tagore yeah I am sure it’s not desirable; I am new to clojure and had to write a sequence of transformations on some map. I started using comp and kept going with it

tagore23:09:59

OK, well what are you trying to do?

seancorfield23:09:15

@tagore Scroll up -- @hmaurer already explained that...

tagore23:09:43

I'm inclined to suspect that composition is not the tool you're looking for here.

hmaurer23:09:00

TL;DR: I am aware comp probably isn’t the wisest approach, but using it for this purpose made me wonder how far I could go with it before blowing up the JVM 😛

hmaurer23:09:51

@stuartsierra’s mention of writing an interpreter for a list of functions seems good (which, unless I am mistaken, is what Pedestal is doing with interceptors)

seancorfield23:09:59

@hmaurer as hinted in one of the responses earlier, you could create a list/vector of the functions you want to compose and then reduce over that, calling each one on the accumulating value...

tagore23:09:07

@seancorfield I've read what he wrote, but I'm still not clear about what he wants. I can be a bit slow, I'm afraid.

noisesmith23:09:17

(defn trampocomp [& fns]
  (fn [x]
    (reduce #(%2 %) x fns)))
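
A quick check of that, assuming the definition above (note it applies the fns left-to-right, the reverse of comp's order, which doesn't matter for a pile of incs):

;; stays flat: reduce calls each fn in turn from a single frame
((apply trampocomp (repeat 100000 inc)) 1)  ;=> 100001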

hmaurer23:09:44

@tagore my bad, it was poorly explained!

seancorfield23:09:45

As usual @noisesmith is a faster typist than I am 🙂

noisesmith23:09:21

@seancorfield I learned to type by being a MUD addict

seancorfield23:09:05

My MUD days are so far behind me now that I've become a slower typist. That's my story, and I'm sticking to it! 🙂

tagore23:09:29

@hmaurer Not at all.. as I said, I can be a bit slow to understand what people want as an outcome

tagore23:09:06

OTOH, I can be very persistent about it

hmaurer23:09:04

I think I got the answer(s) I was looking for, but I’ll add a bit of context here in case you have further advice: I am generating a GraphQL schema from an application schema (database schema + some extra info). I am doing this as a sequence of transformations on a “schema” map. So I start with an empty map, then generate some object types, then generate some queries, etc

hmaurer23:09:05

I wrote each transformation as a function that takes two config maps and extra options, and returns a 1-arity function which takes a schema and returns a schema

hmaurer23:09:08

and I am composing those

hmaurer23:09:56

The reason I was expecting to reach a high number of compositions (in the low hundreds) is that some of my transformation steps should be executed for each entity type

hmaurer23:09:15

So I have code like this

(apply comp
         (map #(gen-for-node hodl-config config %)
              (keys (:nodes hodl-config)))))
(gen-for-node returns a (fn [schema] ... transformed-schema))
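
Applying the earlier suggestion to that snippet (a sketch reusing the names from the code above -- gen-for-node, hodl-config, config): reduce over the node keys instead of composing the generated functions, so the call stack stays flat no matter how many entity types there are.

(defn build-schema [schema hodl-config config]
  ;; each step is ((gen-for-node ...) schema) -> schema, applied in turn
  (reduce (fn [s node-key]
            ((gen-for-node hodl-config config node-key) s))
          schema
          (keys (:nodes hodl-config))))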

hmaurer23:09:34

I’ll stop there to avoid spamming this channel with unnecessary details

tagore23:09:36

OK, I've never written a graphql api, so grain of salt, but...

hmaurer23:09:19

the fact that it’s a GraphQL schema is irrelevant tbh, I just mentioned it to give context

hmaurer23:09:59

the essence is, I think, building up a map as a sequence of (parameterized) transformations

tagore23:09:06

I've never felt the need to compose functions that deeply, nor have I seen anyone else need to.

tagore23:09:17

So I'm suspicious of the approach.

tagore23:09:51

Would a set of mutually recursive functions that built a result (much like a parser) do for you?

tagore23:09:24

You might still have to trampoline, because stack frames, but....