
Still working on AOC's day 23. When my terminating condition is met, if I just return the value without closing the channels, it returns fine. If I try to close the channels with doseq, the function finishes executing (based on println s) but waits for something and doesn't quit. Does anyone have any tips for debugging in core.async?


if you use clojure's internal threadpools you need to run shutdown-agents to get timely shutdown at exit


don't run that until you are ready to shut down the vm though


otherwise the vm doesn't exit until the threadpools reclaim any cached threads


Oops I meant quit as in function exiting. I'm running a let expression on the repl in emacs, that let expression doesn't finish running.


then something happens after your last println and the function is still running


I added more printlns and it seems like all the channels closed, but the function is still running after the last println


with jstack (comes with the jdk) you can get the stack traces of all running code in the vm


outside emacs you can use Control-\ to get the same result


I have no idea how to get that info via emacs


(let [from-router (vec (repeatedly 50 #(a/chan 10)))
      to-router   (a/chan 100)]

  ;; Boot all computers
  (doseq [[addr c] (zipmap (range) from-router)]
    (intcode-computer addr c to-router))

  ;; Do routing
  (a/go-loop []
    (let [[addr x y] (a/<! to-router)]
      (if (= addr 255)
        (do
          (println "received value at 255")
          (doseq [c from-router] (a/close! c))
          (println "closed from-router")
          (a/close! to-router)
          (println "closed to-router")
          ;; function continues running after the last println
          y)
        (do
          (println "routed" [x y] "to" addr)
          (let [c (nth from-router addr)]
            (a/>! c x)
            (a/>! c y))
          (recur))))))


thanks, let me look up jstack


Control - \ in what context?


also, adding doseq at the end of your function means it will return nil (not sure if that could confuse things for you)


when running clojure in a terminal, Control-\ makes java dump all running stack traces


so you know which code is running


of course emacs isn't a terminal, and has its own way of interpreting Control-\, and won't pass that to the jvm


There's a tiny y at the end of the function!


sure, that means it's returning y from the cond block


y, without parens to invoke some action, can't make anything hang


and this should make your let block return y


err, make it return (<!! y)


so that might be your issue - y not having any data ready


intcode-computer is actually a go-loop that reads from the channels in from-router if the state machine requires input and writes to to-router


In this case y is the third element of the vector, right? Does destructuring work after <!?


yes, destructuring isn't affected by the action that fetches the data it uses, only the data fetched


the problem here has to be that nothing wrote to y


so <!! doesn't return


I have a <!! outside the go-loop, in this case


right, that's what's blocking


try taking it out


I'll try it in a bit


Removing the doseq lets the function exit nicely though


perhaps somehow closing those channels prevents y from ever receiving data, so the take blocks instead of providing a return value


I'd need to read your code a lot more closely (and probably see all the async code) to really know the answer to that though


Is destructuring lazy? Maybe it tries to read only when y needs to be output?


no, destructuring is just a macro that expands to a bunch of let bindings


Oh but reading from a closed channel doesn't block unless there's nothing in the channel


reading from a closed channel doesn't block - you get buffered contents, or nil


if y isn't closed, but you prevented it from receiving data, reading from it will block
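A minimal sketch of those two behaviors (assuming clojure.core.async is on the classpath; the channel and values are made up):

```clojure
(require '[clojure.core.async :as a])

;; put one buffered value, then close the channel
(def c (a/chan 10))
(a/>!! c :buffered)
(a/close! c)

;; takes drain the buffer first, then return nil without blocking
(def first-take  (a/<!! c))   ;; => :buffered
(def second-take (a/<!! c))   ;; => nil
```

Whereas a take on an open channel that never receives a value blocks forever, which is exactly the symptom described above.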


oh yup thanks


without the <!! outside the go-loop, it appears to exit immediately, but actually it's running in the background - it just returns a channel


which is the right behavior i guess
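That return-a-channel behavior can be sketched like this (again assuming core.async is available):

```clojure
(require '[clojure.core.async :as a])

;; a go block returns immediately; its value is a channel that
;; will eventually yield the block's result
(def result-chan (a/go (+ 1 2)))

;; a blocking take outside the go block waits for that result
(def result (a/<!! result-chan))   ;; => 3
```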

Alper Cugun 09:12:09

In the loop docs there are a bunch of examples but is it documented anywhere what the binding syntax is? Like here you can use commas? (loop [n (bigint 5), accumulator 1]

Alper Cugun 09:12:38

I’m especially interested in usage of :as there but can’t find anything about it.

Alper Cugun 20:12:13

Cool. I’ve asked for those to be linked up in the relevant places.

Bobbi Towers 09:12:49

commas are always whitespace

Bobbi Towers 09:12:57

loop uses the same binding syntax as let

Alper Cugun 09:12:59

Because I followed the trail to the binding doc but that seems to be something else again.


it's literally the same syntax and rules as let, and mostly the same as function args - this is intentional
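A small illustration of that shared binding syntax, including :as (the bindings here are made up):

```clojure
;; loop takes the same binding vector as let, including
;; destructuring; the comma is whitespace and purely optional
(def factorial-of-5
  (loop [n (bigint 5), accumulator 1]
    (if (zero? n)
      accumulator
      (recur (dec n) (* n accumulator)))))   ;; => 120N

;; :as binds the whole collection alongside the destructured parts
(def destructured
  (let [[x y :as pair] [1 2]]
    [x y pair]))   ;; => [1 2 [1 2]]
```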


Also, commas are less common. I guess with time you will also start seeing commas as unnecessary. I have nothing against commas, I just don't see them much in day-to-day code.


Hi all, what are some good system/service design resources in Clojure?


Thanks for the book suggestion. My office provides subscription to Oreilly. Will check there.

👍 4

A caveat on Clojure Applied: Alex has said that if he was writing the book today, it would use records a lot less. So bear in mind when reading it that you should probably use plain old hash maps in many places where the book encourages records.


Hi Sean, do you also know what changed between the time of writing the book and now?


Best practices in Clojure evolve over time 🙂


That book was written 4-5 years ago. Even with Clojure's stability as a language, that's still a long time in terms of best practices.


I think Alex said one of the main differences in "best practices" is around Spec but I can't find his exact comment right now.


April 25th, 2019, in #clojure-spec "Slack: alexmiller: fwiw, I would probably de-emphasize records in a 2nd ed of Clojure Applied" -- still looking for reasoning but that was in response to a comment about modeling the domain using records instead of maps.


Thanks, and if I may ask one more question: is there anything like the AOSA book for Clojure? It need not be a book. Maybe articles, case studies.


Since that's an architecture book, I'd expect quite a bit of it to apply to Clojure (although I suspect a heavy OOP slant in that book? I've never read it, nor even heard about it before now)


Maybe search for "domain-driven design clojure" and see what turns up -- since Clojure is about focusing on the data in your domain.

👍 4

(and you may ask as many questions as you want! 🙂 )


I have found this book: Alas, it's not free.


It started out as a series of blog posts and those are free. Also note that the book is only half complete and last updated three and a half years ago so...


Given that it uses top-level defs with side-effecting code (reading a config file, building a Component system), I would be very cautious about treating it as any sort of "best practice"...


Thanks for the input. Appreciate it.


Why is it faster accessing values in a record than a map?


A map is a map. A record is closer to a Java class, and the getters are actually offsets into the array-like data store of the class.


ahh ok, Thanks for the answer. I was trying to locate this in the source code


Hey, but I am not a full 100% sure. Let me dig up more info.


Okay, let me rephrase that. defrecord creates an actual Java class under the hood.


But still you want to stick with hash maps in nearly all cases. Records are great when you a) always have a small fixed set of keys that you know will always be present and b) more importantly, you want fast dispatch by type, i.e., polymorphism. So don't think of records as "fast maps" -- there are a bunch of trade offs around the choice.
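To make the distinction concrete (Account and its fields are made-up names for illustration):

```clojure
;; each record key becomes a Java field, so keyword lookup on a
;; record can compile to a direct field access
(defrecord Account [id name])

(def r (->Account 1 "Account1"))
(def m {:id 1 :name "Account1"})

(:name r)   ;; => "Account1" (field read)
(:name m)   ;; => "Account1" (hash lookup)

;; records still behave like maps for assoc; keys not declared in
;; the record go into an ordinary extension map behind the scenes
(def extended (assoc r :extra true))
```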


(map #(Integer/parseInt %) ["1" "2" "3"])
evaluates to (1 2 3) but
(-> ["1" "2" "3"] (map #(Integer/parseInt %)))
Error printing return value (IllegalArgumentException) at clojure.lang.RT/seqFrom (
Don't know how to create ISeq from: aoc.core$eval1810$fn__1811
Why is this? :thinking_face:

Chris O’Donnell 17:12:01

In the second case your arguments are in the wrong order.

Chris O’Donnell 17:12:35

You probably want to use ->> instead of ->


to be explicit, with the thread first, you end up with (map coll function) rather than (map function coll). The error is saying it doesn't know how to create a seq from a function.


since your function is where the collection normally goes, map is trying to get it into a seq so it can map over it and has no idea how to turn #(Integer/parseInt %) into a seq
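The argument-position difference can be shown with macroexpansion (f below is just a placeholder symbol):

```clojure
;; thread-first puts the value in the FIRST argument slot
(macroexpand-1 '(-> ["1" "2" "3"] (map f)))
;; => (map ["1" "2" "3"] f)   ;; the coll lands where the fn goes

;; thread-last puts it in the LAST argument slot
(macroexpand-1 '(->> ["1" "2" "3"] (map f)))
;; => (map f ["1" "2" "3"])

(->> ["1" "2" "3"] (map #(Integer/parseInt %)))   ;; => (1 2 3)
```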


Oh, I see! Thread-first vs Thread-last. Thank you!


I have defined this vector:

(def accounts (ref
			[{:uuid_account "745286b0-24d3-4b17-ab24-d1265e9fb8d1" :identification_id "33333333333" :name "Account1" :transactions [{:amount 1000.00M :created_at "2019-12-27"}]}
				{:uuid_account "234de110-ec07-4568-884c-8aad330c24eb" :identification_id "44444444444" :name "Account2" :transactions [{:amount 1500.00M :created_at "2019-12-26"}]}
				{:uuid_account "e1255330-0f63-42cd-b7c8-acde1915f885" :identification_id "55555555555" :name "Account3" :transactions [{:amount 2000.00M :created_at "2019-12-25"}]}]))
I want to include a new transaction inside the third account. What would be the best way to do this? Using clojure.walk perhaps?


Don't use a vector, use a map


sorry @U0NCTKEV8 but I didn’t understand very well. My ref is constantly changing because users can include new accounts and transactions inside the accounts. Even so, could I use something like your example?


I forget the exact syntax because I never use refs, but with a map instead of a vector for accounts you would do something like (dosync (alter accounts update-in [account-id :transactions] conj new-tx)) to add a new transaction


Also, I dunno what the rest of your code is doing, but if you only have one mutable reference you can just use an atom; refs are only useful when you need to coordinate changes to multiple mutable references at once, which most people don't end up doing


And as for what I said about normalization: you may not think of it that way, but what you are creating is an in-memory database, so all the database techniques and ideas (indexing, normalization, etc.) apply

👍 4

In your case you have a vector which is equivalent to an auto incrementing key and that is the only index


So you can only look things up quickly based on the position in the vector


But the shape of your data suggests you need to look things up by account id, which a map will let you do


Normalization in database terms is basically the process of pulling apart and flattening your data model until all of it can be easily indexed for efficient retrieval


Like, what if you just have a transaction id and want to find the account for it


With your model you need to scan all transactions in all accounts to find it (full table scan in DB land)


In a more normalized schema transactions wouldn't be part of accounts, but would be their own thing with their own index on their ids


Then to make looking up the transactions for an account fast, each tx is also indexed by the account it belongs to


I see… so I could actually have something like this so it would be easier to look into the data I want?

(def accounts (ref {"745286b0-24d3-4b17-ab24-d1265e9fb8d1" {:identification_id "33333333333" :name "Account1" :transactions [{:amount 1000.00M :created_at "2019-12-27"}]}
                    "234de110-ec07-4568-884c-8aad330c24eb" {:identification_id "44444444444" :name "Account2" :transactions [{:amount 1500.00M :created_at "2019-12-26"}]}
                    "e1255330-0f63-42cd-b7c8-acde1915f885" {:identification_id "55555555555" :name "Account3" :transactions [{:amount 2000.00M :created_at "2019-12-25"}]}}))


Yes, you can also duplicate the uuid if you want it returned on lookup:

(def accounts (ref {"745286b0-24d3-4b17-ab24-d1265e9fb8d1" {:uuid_account "745286b0-24d3-4b17-ab24-d1265e9fb8d1" :identification_id "33333333333" :name "Account1" :transactions [{:amount 1000.00M :created_at "2019-12-27"}]}}))


Also, while you should most likely use a map, if you insisted on using a vector, you can also update it the same way:

(dosync (alter accounts update-in [2 :transactions] conj new-tx))
Where you just give update-in the index into the vector (zero based), so for the third element, it is index 2. This works because vectors are associative, and so you can use most map functions on them given indices as keys.


thanks @U0K064KQV and @U0NCTKEV8 I changed to use maps and now it works perfectly :parrot:


{accountid accountinfo}


Then continue to normalize and build yourself a little in-memory database: use clojure.set/index so you can look things up by account id, transaction id, or anything else
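A small sketch of clojure.set/index on made-up transaction data:

```clojure
(require '[clojure.set :as set])

;; hypothetical transactions, stored as a set of flat maps
(def txs #{{:tx-id 1 :account "a" :amount 10}
           {:tx-id 2 :account "a" :amount 20}
           {:tx-id 3 :account "b" :amount 5}})

;; index returns a map from selected-key values to the set of
;; matching rows, like a database index over a relation
(def by-account (set/index txs [:account]))

(get by-account {:account "a"})
;; => both of account "a"'s transactions, with no scanning
```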

Mario C. 18:12:28

Is it possible to combine a map-filter process? As in filter for certain criteria and if it meets said criteria then transform the value into this. Instead of filtering and then mapping.


(into orig-coll (map fn (filter ffn orig-coll)))


I'm assuming you mean you want to keep the values that don't meet the filter criteria


if you specifically want to avoid the intermediate lazy sequences. then transducers can do that. but that's more an optimisation than anything different

Mario C. 18:12:44

I always thought that the map and filter, how it is written in your example is considered two walks


Technically, yes, it will create extra intermediate lazy sequences, but whether that matters in practice is a different question.


it's lazy. so it's sort of two walks. if you want to make it "definitely only a single walk" you can compose transducers like (into [] (comp (filter ffn) (map fn)) orig-coll)


Is your code performance/memory sensitive? i.e., do you already know that map over filter over coll is too slow/uses too much memory?

Mario C. 19:12:26

This part of the code needs to be performant as much as I can get it to be

Mario C. 19:12:50

its not dire


imo I would measure the difference with real data with a real use case. but transducers are in general going to be faster


No, it is not two walks


It gets a bit complicated, but the overhead isn't in having to do more looping, but in having to create intermediate objects.


And that overhead is further reduced by the use of chunking


So in practice, lazy-seqs will often be just as fast


Which is why people say you should measure it, because sometimes the non lazy-seq version is actually slower


Think of it as a pull model


You want the first element that matches the criteria, and you want it transformed


So it will start looping on the collection until it finds the first element that matches the filter predicate. When it finds it, it will stop looping, and return the found element and apply the map function to it. Done


When you then ask for the second element, it will resume looping where it last left off


But to do this "resuming", you need to keep track of additional data, that tracking is the overhead which makes lazy-seq potentially slower


To reduce the amount of tracking required, Clojure pulls in chunks of 32 at a time. So it only needs to track what is left every 32 elements.


That means even when you ask for the first element only, it will actually pull in the first 32 elements: it loops over the first 32, stops there, and remembers that it must resume at element 33 next.


Those first 32 elements will be cached. So if you ask for the second element, it is already available in the cache, and no more looping, filtering or mapping is needed.


Only once you ask for an element greater than 32 will it resume the loop, and filter and map another chunk, etc.


This snippet explains it. It creates an unchunked infinite sequence from 0 upward, so 0 -> 1 -> 2 -> 3 -> 4 ... The sequence prints the index it is currently iterating over as f: i. Now I take the first, and you see it prints f: 0. If I take the second, it prints f: 2. And when I take the third, f: 3, etc. See how it didn't have to loop over 1 and 2 again? It just continued from where we last left off, even though I am doing a filter followed by a map. The filter and the map happen one after the other per element. Each element is only visited once.


Look at this snippet as well; it might help you understand. See how, when using the eager filterv and mapv, it first filters everything, which is one loop over 4 elements, and then maps everything, which is another loop over the 4 elements returned by filterv. But when using the lazy filter and map, it grabs the first element, filters and maps it, and then moves on to the next element. Thus it is a single pass. The transducer version similarly does a single pass, filtering and mapping element by element, but it all happens before the result is consumed, because it is eager. The only difference from the lazy-seq is that the transducer never has to remember where it was: the lazy-seq had to create a new lazy seq of the remainder, an intermediate checkpoint if you want, after each iteration.
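A condensed sketch of the three approaches described above (the data is made up; the original snippets themselves are not reproduced here):

```clojure
;; eager: two full walks, one per step
(def eager (->> [1 2 3 4] (filterv even?) (mapv inc)))   ;; => [3 5]

;; transducer: filter and map fused into a single eager pass
(def xduced (into [] (comp (filter even?) (map inc)) [1 2 3 4]))  ;; => [3 5]

;; lazy: also filters and maps per element, but only on demand,
;; remembering where it left off between takes
(def lazy-xs (map inc (filter even? (range 1 100))))
(first lazy-xs)   ;; => 3
```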

Mario C. 17:12:08

@U0K064KQV Thank you for this! It actually completely changed my understanding (lack thereof) of lazy-seq's! Just copied and pasted this onto my bear notes 😄


😄, glad I could help


Yes, you can use mapcat, but map and filter are lazy, so calling one after the other effectively applies both at once when you walk the result

Mario C. 18:12:02

so when they are lazy they are combined into a single walk?


If you don't care about laziness, using transducers will give you a single pass: (into [] (comp (filter ,,,) (map ,,,)) coll)


When you call first on the result of the map, it calls first on the result of the filter and filters it then maps it then returns it


assuming you aren't forcing intermediate results, which is kind of the point of lazy seqs

Mario C. 19:12:26

Thanks guys, didn't know that


I'm having some trouble adding keys to this nested vector. I need to keep the outermost keys intact. zipmap works for (second (first foo)) but not for all the nested vectors in this map. I'll then filter the inner vectors and keep only those without "". Any ideas?

{:23529ff0 ["AD444D"


@slack.jcpsantiago Not sure what you mean by "adding keys to this nested vector"?


the vector doesn't have any keys it's just strings and numbers. I want to turn it into a map with keywords (not keys sorry)


so instead of ["AD444D"] I have {:id "AD444D"} to make it easier to select and filter


What about the rest of the data in the vector? What should happen to that?


same thing, I have a list of keywords to add to each element


(zipmap [:id :lat :lon :track :altitude :speed :unknown1 :unknown2 :aircraft :unknown3 :unknown4 :start :finish :flight :onground :rateofclimb :unknown5 :unknown6 :unknown7] (second (first bar)))


☝️ works for one vector, but I lose the top-level keyword 😞


OK... So reduce-kv is good for processing a map (eagerly producing a new map)


(reduce-kv (fn [m k v] (assoc m k (zipmap the-keys v))) {} your-map)
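To make that concrete (the keys and data below are made up for illustration, assuming each value vector lines up positionally with the keyword list):

```clojure
;; hypothetical keyword list and flight data
(def the-keys [:id :lat :lon])
(def flights {:abc123 ["AD444D" 52.3 4.76]})

;; reduce-kv walks the map, zipmapping each value vector while
;; keeping the top-level key intact
(reduce-kv (fn [m k v] (assoc m k (zipmap the-keys v)))
           {}
           flights)
;; => {:abc123 {:id "AD444D" :lat 52.3 :lon 4.76}}
```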


using first / second etc. to process a hash-map is pretty much never the right thing


that's what I thought 😓 but didn't know any better


@seancorfield’s solution is perfect here, but in other cases you can use key and val to get the two parts of a map entry


and you should usually either 1) know the key you want or 2) want to do the same thing to every key


In my case the top keywords are always different, but I want to apply the same thing to each one so it's ok


in fact, in the rare case I want the first entry in a hash map I'd do something like (get m (first (keys m))) to hang a lampshade on the fact that I'm doing it on purpose


right - in a map you can never know for sure what the top key will be, except in very rare circumstances (and even then it's easy to mess it up)


thanks for the insights @noisesmith and the solution @seancorfield


I remember there being a clojurians log, does it still exist?


google cache ftw 😉


what I last heard is that the logs are out of date

👍 4

Even on Zulip?


your info is likely more up to date than mine


If a channel here has the @zulip-mirror-bot in it, all messages are mirrored in real-time to


If a channel here has the @logbot in it, all messages are logged to a system that is behind the clojurians log on ClojureVerse -- however, the indexing/display engine on ClojureVerse was lagging behind, last I checked.