abhinav omprakash09:10:21

hi, I had a question about parallelism. Suppose I have a bunch of functions, and I want to run all of them on separate threads (1 thread per fn):
• I don't care about the return value; I'm running them for their side effects.
• There's no shared state between the fns.
• I want the thread to clean up after itself.
• I need it to run immediately when called.
What do I need? A thread, future, promise, or something else?


Clojure fns implement Runnable, which means you can construct a Thread with one and call .start on it:

(defn run-in-thread [f]
  (.start (Thread. f)))

(defn print-after [n]
  #(do (Thread/sleep (* n 1000))
       (println n)))

(run! run-in-thread (map print-after (range 8)))
run! is map but purely for side effects, discarding return values. Edit: (map print-after (range 8)) is an example bunch of functions.
> I want the thread to clean up after itself.
What do you mean by this? Another thing to note is that you'll of course be creating n threads, where n is the number of functions you have. That may or may not be a good idea.

abhinav omprakash15:10:32

> I want the thread to clean up after itself.
I meant that it shouldn't just stay there consuming resources (I'm not fully aware of how threads work). I didn't know about run!. Also, wouldn't futures do the same thing? I did a little reading and I think futures might be what I need. Is there a more elegant solution than using n threads? I want all n functions to run concurrently irrespective of how long they take to run, i.e. I would want them all to start running at a given time. That's why n threads.


You can very well use futures. Futures are tasks (Callables, like functions) submitted to a threadpool (Executors.newCachedThreadPool to be specific). This threadpool will reuse previously spawned threads if they haven't already been cleaned up, but won't limit the number of threads, so none of the tasks will be waiting in the pool. Futures are slightly more "managed" than raw threads. If you don't care about their return values (fire and forget) and the number of functions is large, then using a capped threadpool is better; otherwise you'll get OOM errors if you're spawning a lot of threads. Take a look at or simply the Java ExecutorService, which the former library is built on.
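
A minimal sketch of the capped-pool idea using plain java.util.concurrent (the pool size of 4 and the example task fns are made up for illustration, not from this thread):

```clojure
(import '(java.util.concurrent Executors TimeUnit))

;; Cap the pool at 4 threads; tasks beyond that queue up instead of
;; each getting a fresh thread.
(let [pool    (Executors/newFixedThreadPool 4)
      results (atom #{})]
  ;; Submit 8 side-effecting fns. Clojure fns are Runnable, so .submit
  ;; accepts them; the hint picks the Runnable overload.
  (run! (fn [f] (.submit pool ^Runnable f))
        (map (fn [n] #(swap! results conj n)) (range 8)))
  (.shutdown pool)
  ;; Wait for everything to finish before reading the atom.
  (.awaitTermination pool 5 TimeUnit/SECONDS)
  @results)
;; @results now equals (set (range 8))
```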

💯 4
abhinav omprakash04:10:58

@UMPJRJU9E thanks for the explanation, I appreciate it.

abhinav omprakash04:10:35

what would be a good estimate of the number of threads that would most likely cause me an error? I know that it is dependent on the CPU, but a rough estimate would be nice


That would depend on multiple factors, like the configured max heap size, the size of the objects the threads are dealing with, and how quickly the objects can be garbage collected. Better if you experiment with the numbers yourself by looking at the heap usage, garbage collection logs/graph, and the state of the threads (waiting, parked or running) using tools like VisualVM (free) or YourKit (rave reviews, expensive, haven't tried it myself).

💯 2
Andrei Stan10:10:06

hello everyone, i have two vectors of maps, and i want to return one of the vectors, but with the values of the second for IP, or whatever i need...

(let [vecA [{:name "Home Desktop" :IP "" :id "hhhhhhhhh"} {:name "Work Station" :IP "" :id "ggg"}]
      vecB [{:name "Home Desktop" :IP ""} {:name "Work Station" :IP "" }]]
The expected result:
[{:name "Home Desktop" :IP "" :id "hhhhhhhhh"} {:name "Work Station" :IP "" :id "ggg"}]
I tried to use map function in some ways without success.

abhinav omprakash10:10:10

you can use update-in: (update-in vecA [1 :IP] (fn [_] "")). 1 refers to the index of the map you want to change, :IP refers to the key of the map you want to change. update-in takes a function which it calls to get the new value; replace "" with whichever value you need.
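
Since the function there ignores its argument, assoc-in does the same job with less ceremony (a sketch; the IP value is made up):

```clojure
;; assoc-in sets the value at a nested path directly,
;; without needing a dummy updating function.
(assoc-in [{:name "Home Desktop" :IP ""} {:name "Work Station" :IP ""}]
          [1 :IP] "10.0.0.2")
;; => [{:name "Home Desktop", :IP ""} {:name "Work Station", :IP "10.0.0.2"}]
```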


map with multiple arguments will do the trick


(map (fn [a b]
       (assoc a :IP (:IP b)))
     vecA vecB)


(let [vecA [{:name "Home Desktop" :IP "" :id "hhhhhhhhh"} {:name "Work Station" :IP "" :id "ggg"}]
      vecB [{:name "Home Desktop" :IP ""} {:name "Work Station" :IP ""}]]
  (map (fn [a b]
         (assoc a :IP (:IP b)))
       vecA vecB))
=> ({:name "Home Desktop", :IP "", :id "hhhhhhhhh"} {:name "Work Station", :IP "", :id "ggg"})


mapv if you want to preserve the vector-ness


if they are ordered

(let [vecA [{:name "Home Desktop" :IP "" :id "hhhhhhhhh"} {:name "Work Station" :IP "" :id "ggg"}]
      vecB [{:name "Home Desktop" :IP ""} {:name "Work Station" :IP "" }]]
  (map into vecA vecB))

Marius Kreis06:10:48

@U3JH98J4R This only works if both vectors are complete (contain all items in the same order), right?
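Right. If the ordering (or completeness) isn't guaranteed, one option is to index the second vector by a shared key first — a sketch assuming :name uniquely identifies each entry (names and IPs here are made up):

```clojure
;; Build a lookup map keyed by :name, then merge the matching entry
;; from vecB into each map of vecA, regardless of order.
(let [vecA [{:name "Home Desktop" :id "hhhhhhhhh"} {:name "Work Station" :id "ggg"}]
      vecB [{:name "Work Station" :IP "10.0.0.2"} {:name "Home Desktop" :IP "10.0.0.1"}]
      by-name (into {} (map (juxt :name identity)) vecB)]
  (mapv #(merge % (by-name (:name %))) vecA))
;; => [{:name "Home Desktop", :id "hhhhhhhhh", :IP "10.0.0.1"}
;;     {:name "Work Station", :id "ggg", :IP "10.0.0.2"}]
```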


I'm trying to use httpkit to post to an Amazon lambda url and it throws this:

Unrecognized record version (D)TLS-0.0 , plaintext connection?
do you know?


doing the same with curl works for some reason

Ben Sless11:10:30

which version of httpkit?



Ben Sless11:10:08

are you using the sni client or regular client?


hm I require like this [org.httpkit.client :as client]


probably regular if I don't know

Ben Sless11:10:18

try to require org.httpkit.sni-client as sni

Ben Sless11:10:50

then assoc to the request in :client @sni/default-client
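
Putting those two steps together (an untested sketch; the lambda URL is a placeholder):

```clojure
(require '[org.httpkit.client :as client]
         '[org.httpkit.sni-client :as sni])

;; Use the SNI-capable client for this one request by passing it
;; under :client. sni/default-client is a delay, hence the deref.
@(client/post "https://example-id.lambda-url.us-east-1.on.aws/"
              {:client @sni/default-client
               :body   "{}"})
```

If every request should use SNI, you can instead rebind org.httpkit.client/*default-client* once at startup rather than passing :client each time.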


yeee bois that works thanks 🎉 😄

🎉 1
Ben Sless11:10:22

which JVM version were you using?

Ben Sless11:10:54

Also, a brief note on how I recognized the issue, for other readers and our future selves: TLS issues can occur if the client has no ssl configurer. Docstring from the namespace:

Provides an SNI-capable SSL configurer and client, Ref. #335.
  In a separate namespace from `org.httpkit.client` so that
  http-kit can retain backwards-compatibility with JVM < 8.
The sni client:
Like `org.httpkit.client/default-client`, but provides SNI support using `ssl-configurer`. NB Hostname verification currently requires Java version >= 11.

👍 1
Ben Sless11:10:11

so the first thing to check with TLS is the SNI client. If you have more issues with TLS it might be the list of ciphers you're using

Ben Sless11:10:32

Also, you should probably switch to Java 17, it's LTS

👀 1
Ben Sless11:10:42

but that's besides the point


Hi, I'm currently working on transforming deeply nested maps into hiccup-html. Input:

{:val "Tree 1", :children [{:val "var1", :children
                             [{:val "nested1", :children []}
                              {:val "nested2" :children []}]}
                           {:val "var2", :children
                             [{:val "child of var2", :children []}]}]}
Expected output:
[:ul
 [:li "Tree1"]
 [:ul
  [:li "var1"]
  [:ul
   [:li "nested1"]
   [:li "nested2"]]
  [:li "var2"]
  [:ul
   [:li "child of var2"]]]]
Current solution:
(defn render-tree [{:keys [val children]}]
  [:ul
   [:li val]
   (for [child children]
     (render-tree child))])
The above solution does exactly what I want it to, but -- if I understand recursion in Clojure correctly -- will cause a stack overflow on deeply nested lists. What should I read up on to be able to design a better solution than this?


i’m not certain there is a tail recursive version of this to be made without getting creative (in any language)


buuuut, one idea could be to “explode” your nested data


lazy comes to mind...


{ [0]     "Tree 1"
  [0 0]   "var1"
  [0 0 0] "nested1"
  [0 0 1] "nested2"
  [0 1]   "var2"
  [0 1 0] "child of var2" }


like if you could get it to this (which should be possible) you could be more clever about building the data structure - maybe building "inside out"


but for reasonable data sizes idk if its gonna be worth it (or possible, i’m just guessing here). Even clojure.walk doesn’t do anything like this


(defn explode-nested-data [data]
  (let [data [data]]
    (loop [[position & todo] [[0]]
           result {}]
      (if (nil? position)
        result
        (let [node (get-in data (interpose :children position))]
          (recur (concat todo
                         (map (fn [n] (conj position n))
                              (range (count (:children node)))))
                 (assoc result position {:val (:val node)})))))))
=> #'dev.mccue.mtgbot/explode-nested-data
{[0] {:val "Tree 1"},
 [0 0] {:val "var1"},
 [0 0 0] {:val "nested1"},
 [0 0 1] {:val "nested2"},
 [0 1] {:val "var2"},
 [0 1 0] {:val "child of var2"}}


here is a crack at the first one - i get nervous with concat but I think this is a valid use of it


I think you're right and exploding the data is probably the most feasible way to do this (with the added benefit of making it simpler to update the structure). Thank you for your input 🙂


(def exploded {[0] {:val "Tree 1"},
               [0 0] {:val "var1"},
               [0 0 0] {:val "nested1"},
               [0 0 1] {:val "nested2"},
               [0 1] {:val "var2"},
               [0 1 0] {:val "child of var2"}})
(defn implode-data-back [exploded]
  (let [sorted-by-nesting (sort-by (fn [[k _]] (count k)) exploded)]
    (loop [[[nest {:keys [val]}] & remaining] sorted-by-nesting
           built-up-data [[]]]
      (if (nil? nest)
        (first built-up-data)
        (recur remaining
               (update-in built-up-data
                          (cons (first nest) (map #(+ 2 %) (rest nest)))
                          (fn [data]
                            (conj (or data []) :ul [:li val]))))))))

(implode-data-back exploded)
[:ul
 [:li "Tree 1"]
 [:ul [:li "var1"] [:ul [:li "nested1"]] [:ul [:li "nested2"]]]
 [:ul [:li "var2"] [:ul [:li "child of var2"]]]]


this….might do it


That is a lot more sophisticated than what I was doing. And more necessary than I thought. (I tried just making [:ul>ul>ul>li val] constructs with the times ul appears being the length of the vector, but I need to connect the list items with lines so…) Thank you so much for this solution.


definitely ask a smarter person - i’m sure there is a way to do this that isn’t as horrible


I guarantee you that it is of higher quality than the rest of the codebase XD


more clever != higher quality


also true. I'm still struggling to understand how implode-data-back guarantees the correct order of the nested items 😅


And why map + 2? Is that relative to the maximal nesting?


because [:ul [:li "…"] <HERE>]


so if its 0 0, you need to put the data in slot 2 there


making nested1 and nested2 be in the same ul is going to be fun.


I'm not totally convinced that you'll get a stack overflow from your original code @U02FM0NNZAB... for is lazy (in fact what you've written there is the same as (map render-tree children)) and you're returning a lazy data structure that will call render tree when realised. I would think that means that you're only calling into render-tree from outside render-tree ... if that makes sense? ..


(defn render-tree [{:keys [val children]}]
  [:ul
   [:li val]
   (for [child children]
     (render-tree child))])

(defn build-tree [acc n]
  (if (pos? n)
    (recur {:val "val" :children [acc]} (dec n))
    acc))

(count (render-tree (build-tree {:val "end"} 1000000)))
that takes a few seconds to run on my machine, but it succeeds


I think that's what @UP82LQR9N was alluding to earlier in the thread


@U0P0TMEFJ Thank you for testing that. I'm still struggling to understand the practical implications of laziness…


👍 ... it's easy enough to test at the repl 😉


yeah, but I was too caught up in thinking that it wouldn't work anyway that I never actually tested whether it fails so, thank you for that


I tested it now and since I need to render something from the output it causes a stackoverflow. The maximum n that I can pass to build-tree is around 192-193. Not sure if I can somehow split it up into chunks to prevent that. (or if that is even relevant for my use case)


I was able to increase n to 217 by wrapping it in lazy-seq:

(defn render-tree [{:keys [val children]}]
  (into [] (lazy-seq
             [:ul
              [:li val]
              (map render-tree children)])))


I would think that that call to lazy-seq isn't doing anything, because you're calling into which will eagerly consume the lazy seq. The lazyness in there is due to map returning a lazy seq. So that code should be equivalent to

(defn render-tree [{:keys [val children]}]
  [:ul
   [:li val]
   (map render-tree children)])


You're right. Without the into it goes up to 223. (but it won't render correctly without the into since hiccup can't render lazy-seqs)


I don't think that the into + lazy-seq is doing anything different to just returning a vector with [:ul ,,,] ... lazy-seq will just turn the vector it's given into a lazy seq and the (into [] ,,,) will turn that lazy-seq back into a vector. These are not recursive things, and will not change the individual elements, so the third element will still be a lazy seq (returned by map) either way. If it needs to be a vector, then you can use mapv which will eagerly generate a vector instead of returning a lazy-seq and will probably introduce the recursion problem you were worried about 😉


I really don't know what conclusion I should draw from this. Pray no user ever generates a tree larger than 217 nodes?


I think I'll just use the recursive solution for now since it at least works and lets me test the rest of my structure. I suspect that it's possible to implement a more robust solution with clojure.walk/postwalk so I'll put that in my todos.


I'm not sure where your 217 limit is coming from ... The example code I posted above has a nesting of 1000000, so if you're getting a stack overflow I'd suggest it's from something else?


Or you've simplified the code posted here too much?


I haven't simplified it but count does not realize the entire seq. <-- at least it doesn't in cljs


Huh. It's cljs specific


So js allows less linear recursion than the jvm. I don't think I'll be able to outsource generating the html to the backend since it needs to incorporate re-frame functions in the final form.


I'd be surprised if count didn't realise the whole seq ... either it knows how big the seq is cos it's already realised, like a vector, or it needs to find out how many elements are in the seq by realising ... right?


But it's nested, right? And count doesn't count down to the last level.

(count [1 [2 3 [4]]])
;; => 2


So it's inefficient to realize the entire seq just to count its elements


It only needs to know how many are in the top level.


yeah .. count realised the whole seq ... but doesn't touch any of its elements


I don't think we've got to the bottom of where the limit is coming from


what's the actual error you're getting?


too much recursion in the js/console


is there a stack trace to go along with that?


... sorry I don't do a lot of cljs, so I may well be asking a dumb question ...


action-request-action failed 
Object { meta: null, cnt: 3, shift: 5, root: {…}, tail: (3) […], __hash: null, "cljs$lang$protocol_mask$partition0$": 167666463, "cljs$lang$protocol_mask$partition1$": 139268 }
 InternalError: too much recursion
    cljs$core$array_map_index_of core.cljs:6631
    cljs$core$ILookup$_lookup$arity$3 core.cljs:6916
    cljs$core$ILookup$_lookup$arity$2 core.cljs:6913
    cljs$core$IFn$_invoke$arity$2 core.cljs:1955
    cljs$core$IFn$_invoke$arity$1 core.cljs:3331
    cljs$core$pr_sequential_writer core.cljs:10053
    cljs$core$IPrintWithWriter$_pr_writer$arity$3 core.cljs:10378
    cljs$core$_pr_writer core.cljs:776
    cljs$core$pr_writer_impl core.cljs:10124


I hate cljs errors


😉 ... people say that about the clj errors too ... but tbh I used to work with jboss, so I kinda find them an improvement over that


I'm okay with Java Stacktraces since I learned programming with Java but js is something else


so the error is coming from the printer?


because I generated it from the repl probably


can you print the incoming data structure?


I mean the thing you pass to render-tree


it should have the same depth of data structure ... right?


no I can't same error


but build-tree isn't mundane recursion right?


does cljs have tco?


the build-tree fn I posted above uses recur to do tail call recursion which basically transforms the recursive form into a loop so you don't consume the stack when you use it ... I think that's supported in cljs


I think that's the only way to do tco in both clj and cljs


not sure if mutual recursion via trampoline counts as well, but that's the only one I know of as well
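
For completeness, trampoline turns mutual recursion into a loop by having each fn return a thunk instead of calling the next fn directly (a textbook sketch, not specific to the tree problem above):

```clojure
(declare my-odd?)

(defn my-even? [n]
  (if (zero? n) true #(my-odd? (dec n))))

(defn my-odd? [n]
  (if (zero? n) false #(my-even? (dec n))))

;; trampoline keeps calling returned fns until a non-fn value comes
;; back, so the stack stays flat even for very large n.
(trampoline my-even? 1000000)
;; => true
```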


but build-tree failing seems to suggest that even if I managed to get an iterative implementation of render-tree it would still cause the same error…


I wonder if it's the printer that's the problem


the data returned by build-tree is eagerly generated - there's no lazyness there


if instead of trying to print the result, you render it in the browser or something does the error go away?


error when calling lifecycle function teapot.core/mount-components InternalError: too much recursion
    cljs$core$array_index_of_keyword_QMARK_ core.cljs:6581
    cljs$core$array_index_of core.cljs:6618
    cljs$core$array_map_index_of core.cljs:6631
    cljs$core$ILookup$_lookup$arity$3 core.cljs:6916
    cljs$core$ILookup$_lookup$arity$2 core.cljs:6913
    cljs$core$IFn$_invoke$arity$2 core.cljs:1955
    cljs$core$IFn$_invoke$arity$1 core.cljs:3331
    cljs$core$pr_writer core.cljs:10204
    print_prefix_map core.cljs:10327


I'll try to reproduce this with less dependencies in a new project. Maybe it's a library in between.


although if it fails in the repl


(defn build-tree [acc n]
  (if (pos? n)
    (recur {:val "val" :children [acc]} (dec n))
    acc))

(defn consume-tree
  ([t]
   (consume-tree t 0))
  ([t c]
   (if (:children t)
     (recur (first (:children t)) (inc c))
     c)))

(consume-tree (build-tree {:val "end"} 1000000))
If I run that at a cljs repl, it prints the number 1000000 ... which suggests to me that there's not too much recursion in generating and consuming the data structure. However,
cljs.user=> (build-tree {:val "end"} 1000000)
Execution error (RangeError) at (<cljs repl>:1).
Maximum call stack size exceeded
that suggests to me that the cljs printer is using recursion to print the nested data structure at the repl ... so I'd suggest the problem isn't in the code we've been talking about 😉


And something inside my frontend stack uses the printer to produce html leading to the error.


But someone else must have had this problem before, right? I can't be the only one working with deeply nested trees in cljs.


I would think so ... maybe ask in #clojurescript???


I'll ask in #clojurescript tomorrow how I should go about debugging this. I think I'm done for today, my head hurts. Thank you for your help!


👍 ... I think you have a pretty minimal case ... good luck 😉

Max Deineko22:10:48

You got me interested 🙂. Please keep in mind that I'm perpetual clojure beginner, so everything I write here should be taken with caution. First, I cannot currently see how lazyness would possibly avoid potential stack overflows here -- afaics as soon as you need to realize/process the tree structure the stack will be used just as without lazyness, since there are no parts of lazy sequences one can possibly discard. So I started out with

(ns my.scratchpad
  (:require
   [clojure.pprint :as pp]
   [clojure.walk :as walk]))

(def small-tree
  {:val "Tree 1", :children [{:val "var1", :children
                              [{:val "nested1", :children []}
                               {:val "nested2" :children []}]}
                             {:val "var2", :children
                              [{:val "child of var2", :children []}]}]})

(defn build-tree [acc n]
  (if (pos? n)
    (recur {:val "val" :children [acc]} (dec n))
    acc))

(def big-tree (build-tree {:val "end"}  100000))
and a barebones tail recursive depth first search -- which would only visit and output the nodes without care for nested structure:
(defn dfs
  "Tail recursive depth first search.
  Value of `trees` represents stack of not yet visited (sub)trees."
  [out trees]
  (if (empty? trees) out
      (let [t    (peek trees)
            ts   (pop trees)
            v    (:val t)
            cs   (:children t)
            ts'  (into ts (reverse cs))
            out' (conj out v)]
        (recur out' ts'))))

(dfs '() [small-tree])
;; => ("child of var2" "var2" "nested2" "nested1" "var1" "Tree 1")
Now we only need to add the nesting info to above traversal and build the output accordingly:
(defn transform-tree
  "Depth first search with structure-preserving output construction."
  [tree]
  (letfn [(annotate [cs] (conj (into [:start] (map #(do {:tree %}) cs)) :end))
          (dfs [result stack]
            (if (empty? stack) result
                (let [e (peek stack)
                      stack' (pop stack)]
                  (cond (= e :start)
                        (recur (conj result [:ul]) stack')

                        (= e :end)
                        (let [[v2 v1] (into [] (take 2 result))]
                          (recur (conj (pop (pop result)) (conj v1 v2)) stack'))

                        :else
                        (let [t (:tree e)
                              cs (:children t)
                              result' (conj (pop result) (conj (peek result) [:li (:val t)]))
                              stack'' (if (empty? cs) stack' (into stack' (reverse (annotate cs))))]
                          (recur result' stack''))))))]
    (dfs '() (list :start {:tree tree}))))

(pp/pprint small-tree)
;; {:val "Tree 1",
;;  :children
;;  [{:val "var1",
;;    :children
;;    [{:val "nested1", :children []} {:val "nested2", :children []}]}
;;   {:val "var2", :children [{:val "child of var2", :children []}]}]}

(first
 (transform-tree small-tree))
;; [:ul
;;  [:li "Tree 1"]
;;  [:ul
;;   [:li "var1"]
;;   [:ul [:li "nested1"] [:li "nested2"]]
;;   [:li "var2"]
;;   [:ul [:li "child of var2"]]]]

(time (let [_ (transform-tree big-tree)] nil))
;; "Elapsed time: 220.219634 msecs"
Afaics this builds the tree as needed. The problem now is that neither clojure's print nor pprint can process the big tree:
;; (pp/pprint (transform-tree big-tree))
;; StackOverflowError
so you'd probably need to roll your own as well, as in e.g.
(defn tree->string
  "Don't build it, print it."
  [tree]
  (letfn [(annotate [cs] (conj (into [:start] (map #(do {:tree %}) cs)) :end))
          (dfs [result stack]
            (if (empty? stack) result
                (let [e (peek stack)
                      stack' (pop stack)]
                  (cond (= e :start)
                        (recur (str result "[:ul") stack')

                        (= e :end)
                        (recur (str result "]") stack')

                        :else
                        (let [t (:tree e)
                              cs (:children t)
                              result' (str result "[:li " (:val t) "]")
                              stack'' (if (empty? cs) stack' (into stack' (reverse (annotate cs))))]
                          (recur result' stack''))))))]
    (dfs ""  (list :start {:tree tree} :end))))

(println
 (tree->string small-tree))
;; [:ul[:li Tree 1][:ul[:li var1][:ul[:li nested1][:li nested2]][:li var2][:ul[:li child of var2]]]]

(time
 (count (tree->string big-tree)))
;; => 1400014
;; Elapsed time: 75058.648122 msecs
As far as I can see clojure.walk also uses call stack for recursion
(walk/postwalk (fn [_] {}) small-tree)
;; => {}

;; (walk/postwalk (fn [_] {}) big-tree)
;; StackOverflowError
This was at one point also my (limited) experience with most popular clojure json libraries, which choked on deeply nested structures. While I have no experience with clojurescript, my takeaway for now would generally be that if you really care about possible breakage on deeply recursive data in clojure & friends, you'll need to test for it and expect that not only your code will fail but possibly common libraries as well. Now I'm in no way an expert on parsing or tree manipulation, so maybe there are very elegant and/or powerful solutions to the problem which are not subject to clojure's call stack limitation; it's also possible that I made some horrible mistakes above. In any case I'd be delighted to learn more on the topic.


> afaics as soon as you need to realize/process the tree structure the stack will be used just as without lazyness, since there are no parts of lazy sequences one can possibly discard. When we call render-tree, we get a vector back containing [:ul [:li "something"] *lazy-seq*] where the third element is a lazily realised thing. render-tree is no longer on the stack, so when we try to realise the third element, it doesn't matter that it calls render-tree again because we've already returned from the previous call. You could prove that by printing the stack trace every time you call render-tree.

(defn build-tree [acc n]
  (if (pos? n)
    (recur {:val "val" :children [acc]} (dec n))
    acc))

(defn render-tree [{:keys [val children]}]
  (.printStackTrace (Exception. ""))
  [:ul
   [:li val]
   (map render-tree children)])

(defn consume-rendered-tree
  ([t]
   (consume-rendered-tree t 0))
  ([t c]
   (if (nth t 2)
     (recur (first (nth t 2)) (inc c))
     c)))

(consume-rendered-tree (render-tree (build-tree {:val "end"} 3)))
The relevant bit of the stack trace is
	at user$render_tree.invokeStatic(NO_SOURCE_FILE:79)
	at user$render_tree.invoke(NO_SOURCE_FILE:79)
	at clojure.core$map$fn__5866.invoke(core.clj:2753)
	at clojure.lang.LazySeq.sval(
	at clojure.lang.LazySeq.seq(
	at clojure.lang.LazySeq.first(
	at clojure.lang.RT.first(
	at clojure.core$first__5384.invokeStatic(core.clj:55)
	at clojure.core$first__5384.invoke(core.clj:55)
	at user$consume_rendered_tree.invokeStatic(NO_SOURCE_FILE:90)
	at user$consume_rendered_tree.invoke(NO_SOURCE_FILE:85)
	at user$consume_rendered_tree.invokeStatic(NO_SOURCE_FILE:87)
	at user$consume_rendered_tree.invoke(NO_SOURCE_FILE:85)
where you see only one call to render-tree as it's realising the lazy seq. Does that make sense?

Max Deineko14:10:25

@U0P0TMEFJ I should have said «there are no parts of lazy sequences one can possibly discard in general» -- consume-rendered-tree above is very specific in that it goes down one tree branch without caring about the rest. If we need a more general consumer -- say, if we wanted to print the tree (as described in original post or as xml structure), concatenate all contained values etc -- then lazyness does not give us any benefit, the evaluation will still overflow the stack if applied to the nested lazy structure naively. But maybe we're solving different problems: @U02FM0NNZAB note that

[:ul
 [:li "Tree1"]
 [:ul
  [:li "var1"]
  [:ul
   [:li "nested1"]
   [:li "nested2"]]
  [:li "var2"]
  [:ul
   [:li "child of var2"]]]]
and what
(defn render-tree [{:keys [val children]}]
  [:ul
   [:li val]
   (for [child children]
     (render-tree child))])
(or the lazy map variant) yields are not the same. Now, just computing render-tree will not result in a stack overflow. But if you want to process the tree structure further, generally laziness will not help with the particular problem of stack exhaustion afaics -- for example, if you wanted to print the structure as in the former snippet -- you'll need to move the nesting from the stack to memory somehow, e.g. like in tail recursive tree traversal.


ah yes ... right ... getcha ... 😉 ... I think I failed to understand that you were talking purely about the rendering part of the problem ... apologies ... I think I read the original question as "generating this data structure gets me a stack overflow" and I think that's not true, it's consuming that data structure that creates the stack overflow, and I think your point is that the only sensible way to consume such a datastructure is recursive and that seems fair enough 😉

Max Deineko17:10:42

@U0P0TMEFJ yes, above I compute "Output" from "Input" of the original post avoiding possible stack overflow caused by recursion -- the potential overflow problem stays the same whether we compute "Output" from "Input" or from "current solution". Hope it helped @U02FM0NNZAB 🙂

👍 1

I sadly don't have time right now to continue working on this particular problem, but thank you for your input. I'll save it for later when I have time for it. (For now I just imposed a limit on the depth of the nesting to keep the site from crashing)

👍 1
🤞 1

how do I escape newlines in format like this? use case is to have small lines in code but the format string should be 1 line

(format "lorem lorem lorem lorem lorem lorem
lorem lorem lorem lorem lorem ")

Martin Půda14:10:29

(clojure.pprint/cl-format nil "lorem lorem ~
lorem lorem ~
lorem lorem")


or uuhh

(format (str "lorem lorem lorem lorem lorem " 
             "lorem "
             "lorem lorem lorem lorem"))

☝️ 1

as long as you don't mind the str conversion.


trying to read .edn with #time/instant data in it, receiving "No reader function for tag time/instant", any ideas fellow Clojurians?


Do both clojure.core/read and clojure.edn/read fail?

Darin Douglass16:10:32

the built-in reader tag for dates is #inst which maps to java.util.Date. #time/instant must be provided by another library that's not on your classpath


Yes, it’s converted using tick.alpha.api/instant


core/read fails as well


read and edn/read both provide a way to handle unknown tagged readers, I forget the details, check their doc strings


Reader tags without namespace qualifiers are reserved for Clojure. Default reader tags are defined in default-data-readers but may be overridden in `data_readers.clj` or by rebinding *data-readers*. If no data reader is found for a tag, the function bound in *default-data-reader-fn* will be invoked with the tag and value to produce a value. If *default-data-reader-fn* is nil (the default), a RuntimeException will be thrown.




Solved with a custom reader and read-string instead of edn/read-string

(binding [*data-readers* {'time/instant t/instant}]
    (read-string ...))
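
For edn specifically, clojure.edn/read-string takes a :readers map directly, which avoids both the eval-capable read-string and the dynamic binding (a sketch using java.time.Instant/parse in place of tick's reader fn, which is an assumption on my part):

```clojure
(require '[clojure.edn :as edn])
(import 'java.time.Instant)

;; The reader fn receives the already-read form (here, the string
;; following the tag) and returns the value to substitute.
(edn/read-string
  {:readers {'time/instant (fn [s] (Instant/parse s))}}
  "#time/instant \"2021-10-12T00:00:00Z\"")
;; => the parsed java.time.Instant
```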


Thanks guys!


What's gross?


Using tag literals for type specific stuff, the way it just sort of exposes the java.time types as tag literals


Instant should absolutely just use the inst tag


I think the case could be made that some of the zoned stuff should as well, but it is admittedly murky there


The reader functions appear to be bad as well, they return forms that eval to the data structure instead of the data structure, so don't work entirely in the reader and need to pass through eval to work


Agree there is some overlap, but just for instant vs date. And even then one might want to distinguish.


Outside of being broken for pure serialization, you'll get weird results in things like macros (which are a thorny issue with nonedn literals anyway)


The readme talks about the problem... A bug in cljs.


Which I have a patch for.. but haven't managed to get much interest in


Does that answer the question of why I called it gross?


The alternative is to have 2 artifacts for this lib. One for jvm and one for cljs


It seems like you are trying to convince me it isn't gross


So your point re serialisation is valid


Meaning, there is an issue that should be fixed


At the time I said gross I hadn't even really seen any of that


Ok so just having some overlap between instant and date tag renders whole library gross


I called it gross pretty much entirely based on the fact it is basically exposing type specific tags, instead of creating / reusing tags based on the meaning of the information (using inst)


@eulered გაუმარჯოს! (Georgian: "hello!")


სალამი გიგა 😀 (Georgian: "hi, Giga 😀")

Sebastian Allard20:10:32

Are there any web based data browsers? Like REBL but starting an HTTP server instead. I found a project called which seems to be what I'm looking for, but it is not actively maintained. Do you know of any alternatives?

R.A. Porter20:10:38

Portal might fit your needs -

👍 3

@U02HNEY0CPN I used to use REBL all the time, then switched to Reveal, but now I use Portal all the time for this. And if you're using VS Code, there's a Portal extension for it so you can run it inside VS Code in a webview so you don't need an external browser window.

👍 2