#beginners
2019-08-12
gagan.chohan06:08:01

Hi, so if I want to change the value of one of the :keys bindings in a let, how can I achieve it?

gagan.chohan06:08:04

(def client {:namae "Super Co."
             :location "Philadelphia"
             :description "The worldwide leader in plastic tableware."})

gagan.chohan06:08:15

(let [{:keys [namae location description] :as hello} client])

gagan.chohan06:08:30

How can I edit the value of namae in the let statement?

jumar06:08:11

@ you cannot "change" the value; you can only override it by binding it to something else. Let bindings are effectively constants:

(let [{:keys [name ,,,]} my-data
      name (str name " - edited")]
  ,,,)

gagan.chohan06:08:05

so overriding and changing both serve the same purpose, right?

jaihindhreddy06:08:59

You can also have defaults when destructuring associatively.

(let [{:keys [a b] :or {b 42}} {:a 1}] [a b])
; => [1 42]

retrogradeorbit07:08:02

also remember you can destructure without :keys

user> (let [{name :namae :keys [location description] :as hello} client]
        [name location description])
["Super Co."
 "Philadelphia"
 "The worldwide leader in plastic tableware."]

retrogradeorbit07:08:29

but be careful: name is a clojure.core function, and it is shadowed in this let
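To make that shadowing concrete, a small REPL sketch using the same destructuring style as above:

```clojure
;; clojure.core/name is normally a function
(name :foo)
;; => "foo"

;; inside this let, name is shadowed by the destructured string,
;; so calling it as a function here would fail
(let [{name :namae} {:namae "Super Co."}]
  name)
;; => "Super Co."
```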

gagan.chohan07:08:29

why doesn't this work?

gagan.chohan07:08:35

(let [{:keys [namae location description] :as hello} client
      namae "hello"]
  (println namae location "-" description))

gagan.chohan07:08:16

but Thanks, I get your point

jumar07:08:37

@ it works for me:

(let [{:keys [namae location description] :as hello} {:namae "Juraj" :location "EU" :description "<empty>"}
      namae "hello"]
  (println namae location "-" description))
;; prints: hello EU - <empty>
;; => nil

electricladyland15607:08:59

Anybody have tips and tricks on how to deal with a nested spec that has the same key but a different spec predicate?

jumar07:08:22

I guess you're talking about unqualified keys and the same key appearing at different "levels" of a map? In that case you could just use different namespaces for the key specs:

(s/def :a/my-key int?)
(s/def :b/my-key boolean?)
(s/def ::b (s/keys :req-un [:b/my-key]))
(s/def ::nested (s/keys :req-un [:a/my-key ::b]))
(s/valid? ::nested {:my-key 10 :b {:my-key true}})
;; => true
(s/valid? ::nested {:my-key false :b {:my-key true}})
;; => false

electricladyland15607:08:43

Damn, I feel like a donkey, thank you!

godwin.ko07:08:30

sorry to bother: if I have a vector [100 200 300] and I need to subtract 400 from it so that it returns [0 0 200], or subtract 200 to return [0 100 300], how can I do so?

electricladyland15607:08:27

@godwin.ko (defn subtract-every [n x] (map #(- % n) x))

godwin.ko07:08:00

(map #(- % 400) [100 200 300]) => (-300 -200 -100)

godwin.ko07:08:53

what I want is an accumulated subtraction, not subtracting from every element…

electricladyland15607:08:08

Sorry I misread the question

jumar07:08:43

@godwin.ko it's hard to read it right, but do you mean to subtract x from every element y where x <= y, such that the total of all decrements across the elements equals the number you get as input?

godwin.ko07:08:21

almost but not exactly, sorry for misleading you 🙏

andy.fingerhut07:08:14

Did you mean that if you give it [100 200 300] and you minus 400, it should return [0 0 200]? That is, subtract 100 from the first number (since 400 is bigger than 100), then subtract 200 from the next number (because you still want to subtract a total of 400 but have only subtracted 100 so far), etc.?

godwin.ko07:08:21

@andy.fingerhut ops, you’re right, my bad, typo

andy.fingerhut07:08:44

Do you always want a vector to be returned, or would a lazy sequence be better in any use case you have in mind, because the 'consumer' of the return value might in some case only need the first few elements?

jumar07:08:50

Perhaps use reduce? This is the best I could quickly come up with:

(first (reduce (fn sub [[acc x] y]
                 (let [decrement (min x y)
                       new-y (- y decrement)]
                   [(conj acc new-y)
                    (- x decrement)]))
               [[] 200]
               [100 200 300]))
;; => [0 100 300]

leonoel08:08:07

(defn cascading-subtract [x [y & ys]]
  (let [z (- y x)]
    (if (neg? z)
      (cons 0 (lazy-seq (cascading-subtract (- z) ys)))
      (cons z ys))))
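Both target outputs from the question can be checked at the REPL (the definition is repeated from the message above so the snippet stands alone):

```clojure
(defn cascading-subtract [x [y & ys]]
  (let [z (- y x)]
    (if (neg? z)
      ;; x exceeds y: zero out this element and carry the remainder forward
      (cons 0 (lazy-seq (cascading-subtract (- z) ys)))
      ;; x fits within y: subtract here and leave the rest untouched
      (cons z ys))))

(cascading-subtract 400 [100 200 300])
;; => (0 0 200)
(cascading-subtract 200 [100 200 300])
;; => (0 100 300)
```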

godwin.ko08:08:08

both works, thx a lot @jumar & @leonoel

borkdude10:08:22

it might be fun to practice a few clojure fns while shell-scripting: https://github.com/borkdude/babashka

chase-lambert16:08:42

an exercise I just did had me create my own filter function and I came up with:

(defn my-filter
  [pred coll]
  (loop [pred pred
         coll coll
         new-coll []]
    (if (empty? coll)
      (seq new-coll)
      (if (pred (first coll))
        (recur pred (rest coll) (conj new-coll (first coll)))
        (recur pred (rest coll) new-coll)))))
Is it bad form to have an if nested as the else branch of another if? I'm wondering how else I could have terminated this, because without that empty? check I was getting errors when executing. I feel like I should be using reduce somehow to do all this but haven't figured it out yet.

toby92416:08:10

You could use cond in place of the nested if

toby92416:08:55

(cond
  (empty? coll) (seq new-coll)
  (pred (first coll)) (recur pred (rest coll) (conj new-coll (first coll)))
  :else (recur pred (rest coll) new-coll))

chase-lambert16:08:55

But when I think about it, for me reduce means taking a collection down to one value, so maybe it's not appropriate here since I'm forming a whole new collection. Would you agree, or am I thinking about that wrong?

toby92416:08:39

If you think of the 'one value' as another collection then it works just fine.

toby92416:08:00

So you could reduce one vector into another vector with fewer entries

chase-lambert16:08:16

that's what I was just concluding too. I just haven't trained my brain to see that pattern and approach with reduce yet. I'll need to track down more examples of it in use.

chase-lambert16:08:14

having the (fn ...)'s within another function starts being a little difficult for me to keep track of in my head still

toby92416:08:27

Actually most collection functions can be implemented using a reduce, since it gives you complete flexibility of the 'shape' of the thing you end up with after the iteration.

chase-lambert16:08:29

loop/recur seems to come more naturally for my understanding

toby92416:08:56

A reduce could yield a larger collection even
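A quick sketch of that last point, reduce producing a collection larger than its input:

```clojure
;; duplicate every element: the result has twice the input's size
(reduce (fn [acc x] (conj acc x x)) [] [1 2 3])
;; => [1 1 2 2 3 3]
```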

chase-lambert16:08:20

I would love to see a good example of this being the right approach if you can think of something.

toby92416:08:55

(defn my-filter [pred coll]
  (reduce
    (fn [acc item]
      (if (pred item)
        (conj acc item)
        acc))
    (empty coll)
    coll))

toby92416:08:02

In this case (empty coll) gives you an empty instance of whatever collection coll is
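One caveat worth flagging with the (empty coll) trick: conj prepends on lists, so the reduce-based my-filter preserves order for vectors but returns a reversed result for lists:

```clojure
(empty [1 2 3])  ;; => []
(empty '(1 2 3)) ;; => ()
(conj [1] 2)     ;; => [1 2]  (vectors conj at the end)
(conj '(1) 2)    ;; => (2 1)  (lists conj at the front)
```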

chase-lambert16:08:20

i like this! this helps a lot, thanks.

toby92416:08:21

The function to reduce takes an accumulator and each item

toby92416:08:51

And if the item should be removed, just returns the current acc

toby92416:08:05

Otherwise adds the item to acc

chase-lambert16:08:22

I think what tripped me up is that I could keep using pred in that inner function without explicitly making it an argument of the inner function, if that makes sense. That was tripping me up when I was trying to do this.

chase-lambert16:08:27

so the inner functions have access to all the parameters passed in to the original function

toby92416:08:30

Yeah, you can rely on the lexical scope of the outer function so that you effectively 'close over' its arguments and they remain accessible

toby92416:08:59

This is a 'closure'

toby92416:08:09

(Hence Clojure :))

chase-lambert16:08:49

haha, good stuff. and I'm definitely drinking the kool-aid because I was like hmmm, Hence isn't a function

toby92416:08:21

You can implement reduce using loop/recur - the reason the reduce version is more succinct is because it's a higher level of abstraction than loop/recur and so requires less work.
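A minimal sketch of that claim: reduce built from loop/recur (ignoring the no-init arity and the reduced short-circuiting that clojure.core/reduce supports):

```clojure
(defn my-reduce [f init coll]
  (loop [acc init
         s   (seq coll)]
    (if s
      ;; fold the next element into the accumulator and move on
      (recur (f acc (first s)) (next s))
      acc)))

(my-reduce + 0 [1 2 3 4])
;; => 10
```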

chase-lambert16:08:18

and with your reduce version you don't have to explicitly check for an empty list, right? reduce knows to end at that point, whereas in mine I have to check explicitly or it doesn't terminate. How would you put that in a clearer explanation?

toby92416:08:13

Yes, you're right. reduce already does the work of checking if coll is empty and just returns it in that case.

toby92416:08:36

Actually, it returns the initial value passed to reduce but that's equivalent in this case.

chase-lambert16:08:29

awesome stuff. thanks a lot!

toby92416:08:30

So (reduce (fn [acc item] ...) "Wat!" []) => "Wat!"

toby92416:08:41

No problem 🙂

chase-lambert16:08:00

so say you have an initial hunch to use loop/recur like I often do in these situations. Is there some kind of rule of thumb where you are like "hmm, no this is a better use of reduce"?

chase-lambert16:08:26

I'm struggling how to ask this.

chase-lambert16:08:17

or do you always just default to reduce and then think maybe it's a loop/recur thing. yeah, I haven't thought this question through yet

chase-lambert16:08:51

and you don't have to even use first and rest and stuff because it's kind of built in to what reduce is doing working through the collection. You just have (pred item) (conj acc item). It's so damned elegant. I love it!

seancorfield17:08:07

I think, after a certain amount of just using Clojure for stuff, you just start to reach for reduce first and don't think of loop/`recur`...

toby92417:08:01

Well, as I said, most of the other collection manipulation functions are built on top of something like reduce but provide an even higher level of abstraction. So I personally don't use reduce directly as much; instead I try to find another function that achieves my goal first.

toby92417:08:17

But as @ says, it's rare to use loop/`recur` for me these days unless implementing new recursive algorithms. The other day, I used it for a custom graph traversal in loom (https://github.com/aysylu/loom)

seancorfield17:08:31

It's like transducers. At work we're pretty leading edge in terms of Clojure uptake -- we take alpha releases to production all the time so we can start to leverage new features earlier. But we didn't really start using transducers "naturally" until fairly recently (like, the last year or two) even tho' they've been available since 2015.

toby92417:08:39

If you can't find any appropriate collection manipulation functions you can always rely on reduce so for me the hierarchy is: specific function > reduce > loop/`recur`.

chase-lambert17:08:45

this is great. I feel like getting an intuitive understanding of these things can really take me to the next level. I haven't wrapped my head around transducers yet.

toby92417:08:04

@ nor have I 😄

toby92417:08:54

Tim Baldridge has put together some nice videos on them though, when you want to learn more: https://tbaldridge.pivotshare.com/categories/transducers/2426/media

chase-lambert17:08:14

I might have to resubscribe to those. I had stepped away from clojure for a bit (to my regret) but now I'm back with a vengeance.

chase-lambert17:08:06

when exploring the Clojure cheatsheet for inspiration I oftentimes see "if not provided with a collection, this returns a transducer", so I've been thinking of them as lazy transformation functions, if that makes any sense. hahaha

toby92417:08:40

Yeah, that's how I'm seeing them too - they are composable transformations

seancorfield17:08:24

It may help to think of them as separating the "processing" from the source and sink, especially when looking at into: (into sink process source) -- process is some data transformation that can be applied in other contexts such as on a channel -- it's not tied to collections.

seancorfield17:08:29

(and if you find yourself using ->> to create pipelines of transformation, maybe using transducers would be more efficient)
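As a small illustration of that point, here is the same transformation written as a ->> pipeline (one intermediate lazy seq per step) and as a transducer handed to into (a single pass, no intermediates):

```clojure
;; thread-last pipeline: filter and map each build a lazy seq
(->> (range 10) (filter even?) (map inc))
;; => (1 3 5 7 9)

;; transducer: the same "process" composed once and applied in one pass
(into [] (comp (filter even?) (map inc)) (range 10))
;; => [1 3 5 7 9]
```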

chase-lambert17:08:38

interesting because I absolutely love ->> but I am definitely looking forward to taking my understanding into these higher abstractions.

chase-lambert17:08:29

i don't seem to develop using ->> as much as I like to refactor into it after "solving" the problem first using normal structural editing/repl development. I find it easier to understand when reading other people's code (and my old code)

kevin.zemon18:08:43

So, the code base I'm working in has a bunch of defmacros that seem to be used identically to functions. What's everyone's rule of thumb for when to create a macro instead of a function?

noisesmith18:08:08

I use them only if: 1) the correctness of code requires evaluation while compiling and not later or 2) the explicit goal of the macro is to provide a new syntax
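A hypothetical example of rule 2 (the name unless- is made up for illustration, not from this thread): a macro can decide whether its body is evaluated at all, which no function can, since function arguments are always evaluated before the call.

```clojure
;; unless-: evaluate body only when test is falsey (hypothetical macro)
(defmacro unless- [test & body]
  `(if ~test nil (do ~@body)))

(unless- false :evaluated)
;; => :evaluated
(unless- true (throw (ex-info "never runs" {})))
;; => nil  (the body is never evaluated)
```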

bherrmann21:08:46

(= java.lang.String java.lang.String)
(case "a"
  "a" "fruit")
(case java.lang.String
  java.lang.String "fruit")
;; 1. Unhandled java.lang.IllegalArgumentException
;;    No matching clause: class java.lang.String

seancorfield21:08:35

I think case catches everyone out at least once...
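The reason, for anyone else bitten: case does not evaluate its test constants, so java.lang.String in a case clause is the literal symbol, never the class object. Feeding case the symbol itself makes it match:

```clojure
;; the clause matches the symbol java.lang.String, not the class
(case 'java.lang.String
  java.lang.String "fruit")
;; => "fruit"
```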

christian.gonzalez21:08:05

I opted for multimethods when checking class, you can also use :default to make what is essentially a case
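A sketch of that multimethod approach, dispatching on class with a :default (the name describe is made up for illustration):

```clojure
(defmulti describe class)
(defmethod describe String   [_] "a string")
(defmethod describe Long     [_] "a long")
(defmethod describe :default [_] "something else")

(describe "hi") ;; => "a string"
(describe 42)   ;; => "a long"
(describe 1.5)  ;; => "something else"
```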

bherrmann00:08:54

I don't feel enlightened, but condp for the win

user> (condp = java.lang.String
        java.lang.String "fruit")
"fruit"

csd21:08:52

I'm getting a reflection warning about an unresolved constructor from

(HashMap. {"foo" "bar"})

Adding a hint like

(HashMap. ^Map {"foo" "bar"})

doesn't seem to address it. Is there anything else I should do instead?

hiredman21:08:41

hints on literal collections tend to get eaten in the compiler

hiredman21:08:51

although, actually, even without the hint in that case you shouldn't get a reflection warning

seancorfield21:08:38

I can't repro the reflection warning on that either. Maybe the real code has the map in a var @csd?

csd21:08:04

here's the actual snippet

(defn put-scalar-into-map [value]
  (if (instance? Map value)
    value
    (HashMap. {"value" value})))

csd21:08:55

call to java.util.HashMap ctor can't be resolved.

seancorfield21:08:16

(defn put-scalar-into-map [value]
  (if (instance? Map value)
    value
    (HashMap. ^Map (hash-map "value" value))))
I think @hiredman might be right about the type hint getting swallowed on the literal.

hiredman21:08:52

ah, yeah, the literal that isn't a literal case

ghadi21:08:40

Clojure maps are already Java Maps
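Which is easy to verify, and means the HashMap wrapping above may be unnecessary unless a mutable map is specifically required:

```clojure
;; persistent Clojure maps implement java.util.Map directly
(instance? java.util.Map {"value" 1})
;; => true
```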

neo255121:08:55

Are there any places where we can practice the use of core.async? Some repository with exercises? I more or less understand the concept, but the library has a lot of functionality beyond the basics, and it would be great if there were a place to learn them.

christian.gonzalez21:08:40

@neo2551 this is a pretty helpful exercise to follow https://www.braveclojure.com/core-async/

vachichng21:08:19

hi all, any recommendation for a deep data-oriented design book?

neo255121:08:30

@christian.gonzalez thanks a lot for the website, I love it :). My issue is that it does not go into all the pipelines of core.async

christian.gonzalez21:08:12

Yeah, this is pretty introductory. I'm sure there are other resources out there depending on what you're trying to achieve