2023-12-20
Channels
- # adventofcode (23)
- # announcements (4)
- # babashka (1)
- # beginners (37)
- # biff (2)
- # calva (1)
- # cider (19)
- # clj-kondo (11)
- # clojure (45)
- # clojure-bay-area (2)
- # clojure-europe (12)
- # clojure-nl (1)
- # clojure-norway (15)
- # clojure-uk (2)
- # clojurescript (8)
- # conjure (1)
- # cursive (17)
- # datomic (11)
- # garden (1)
- # graalvm (4)
- # hyperfiddle (21)
- # java (10)
- # jobs (3)
- # lsp (23)
- # off-topic (18)
- # polylith (2)
- # re-frame (4)
- # releases (1)
- # remote-jobs (3)
- # rewrite-clj (4)
- # squint (44)
- # uncomplicate (1)
- # xtdb (84)
Is there a way to make prn-str deterministic, so that it returns the same string when e.g. an equal map is passed? I observed that the order of keys in a map might switch.
Or is there another way/library to serialize Clojure maps to strings that would meet this criterion?
(Background: I have serialized Clojure maps in DB columns and would need to check if they are equal in SQL).
serializing to json or jsonb, which are pretty widely supported in SQL implementations, may be a better fit for k/v data than strings
the comparison for equality may also be more efficient if you preserve the structure of the data than if you're doing string comparisons
depending on your impl and what modules you have enabled, there may also be non-json column types that support k/v data (e.g. hstore in https://www.postgresql.org/docs/current/hstore.html)
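For illustration, a minimal sketch of that structural comparison, assuming a PostgreSQL datasource ds reached via next.jdbc and jsonista for the JSON strings (neither library is mentioned in the thread):
(require '[next.jdbc :as jdbc]
         '[jsonista.core :as j])

;; jsonb compares structurally, so equal maps compare equal even when
;; their serializations differ in key order
(let [a (j/write-value-as-string {:a 1 :b 2})
      b (j/write-value-as-string {:b 2 :a 1})]
  ;; ds is an assumed PostgreSQL datasource (e.g. from jdbc/get-datasource)
  (jdbc/execute-one! ds ["select ?::jsonb = ?::jsonb as equal" a b]))
;; => {:equal true}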
https://github.com/replikativ/hasch is another library for canonical serialization and hashing of Clojure values, but you will likely not be able to rely very much on your SQL engine for equality comparisons if you use it
https://github.com/DotFox/jsonista.jcs this extension for jsonista ensures canonical JSON serialisation. It's not exactly what you asked for, but I still want to share it in case you can switch to JSON as a storage format
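If switching libraries is not an option, a hand-rolled approximation of the same idea (not the jsonista.jcs API) is to sort map keys recursively before serializing, assuming the keys are mutually comparable:
(require '[clojure.walk :as walk]
         '[jsonista.core :as j])

;; recursively turn maps into sorted maps so equal maps always
;; serialize to the same JSON string
(defn canonical-json [x]
  (j/write-value-as-string
    (walk/postwalk #(if (map? %) (into (sorted-map) %) %) x)))

(= (canonical-json {:a 1 :b {:d 4 :c 3}})
   (canonical-json {:b {:c 3 :d 4} :a 1}))
;; => true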
@UCCHXTXV4 it's not the identical map, but a map with equal content. Regarding JSONB: it's an option, but sometimes some things get lost "in translation" during a JSON roundtrip (Clojure maps are more powerful than JSON maps), which is why I serialized to a TEXT field. Also I don't have the need to use SQL to dig into the serialized field.
another way to get an equality token is to use a hashing library, for example https://github.com/arachne-framework/valuehash Useful if you don't need to store a textual representation for postprocessing
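A minimal sketch of that hashing idea using only the JDK (valuehash's own API may differ): print a key-sorted version of the value and hash the resulting string:
(require '[clojure.walk :as walk])
(import 'java.security.MessageDigest)

;; assumes map keys are mutually comparable (e.g. all keywords)
(defn equality-token [x]
  (let [canonical (pr-str (walk/postwalk #(if (map? %) (into (sorted-map) %) %) x))
        digest    (.digest (MessageDigest/getInstance "SHA-256")
                           (.getBytes canonical "UTF-8"))]
    (apply str (map #(format "%02x" %) digest))))

(= (equality-token {:a 1 :b 2}) (equality-token {:b 2 :a 1}))
;; => true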
Yeah I think I’m going to use a hash value then. Thanks to all of you who have responded so quickly, I very much appreciate that and it’s what makes this community so awesome!
Isn’t puget deterministic? https://github.com/greglook/puget#canonical-representation
True, that might work as well! Thanks @U051H1KL1
Ah, Puget is just pretty printing, so only serialization; I would also need deserialization of the values stored in the DB.
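For the read side, if the stored values are plain EDN data, a canonical printer only needs to cover serialization; clojure.edn handles deserialization (a small sketch with an example string):
(require '[clojure.edn :as edn])

;; reading a canonically printed map back from the DB column
(edn/read-string "{:a 1, :b {:c 3, :d 4}}")
;; => {:a 1, :b {:c 3, :d 4}}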
(defn foo []
  @(future
     (throw (ex-info "foo" {:bar :baz}))))

(try (foo)
     (catch Exception e
       (type e)))
;; java.util.concurrent.ExecutionException
Hey team, a bit of a noob question.
I am writing a system which bubbles up ExceptionInfo errors. At the top level, I catch ExceptionInfo and, based on ex-data, I provide some user-friendly error messages.
I was surprised to see that if I used futures, exceptions would end up wrapped in an ExecutionException, so my top-level try-catch would not work.
To solve this, my current solution is to unwrap ExecutionException at the top level too.
But this made me think maybe I was thinking about things the wrong way. What is the Clojurian way to "bubble up" certain kinds of exceptions which we can show to users? What's the right way to do exception handling when dealing with futures et al., which wrap exceptions?
Your exception is wrapped in a java.util.concurrent.ExecutionException, but it is not lost. Try calling ex-cause
imho, it is better to always catch Exception instead of ExceptionInfo. Then you can analyse it to react according to the logic of your application.
(let [chain (Exception. "outer" (Exception. "middle" (ex-info "my error" {:data :stuff})))]
  (last (take-while some? (iterate ex-cause chain))))
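To tie it together, a minimal sketch of a top-level handler built on the same idea: catch Exception, walk the cause chain for ex-data, and fall back to a generic message (user-message is a hypothetical helper, and foo is the example from above):
(defn user-message [^Throwable t]
  ;; walk t, its cause, its cause's cause, ... and grab the first ex-data found
  (if-let [data (some ex-data (take-while some? (iterate ex-cause t)))]
    (str "Something went wrong: " (pr-str data))
    "An unexpected error occurred"))

(try (foo)
     (catch Exception e
       (user-message e)))
;; => "Something went wrong: {:bar :baz}"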
I'm trying to use babashka/http-client to test the performance of my endpoints. I'm not sure I trust the results.
(def request-times (atom []))

(defn perform-request
  [request-fn]
  (let [start (System/currentTimeMillis)
        response-future (request-fn)]
    (future
      (let [_response (deref response-future)
            end (System/currentTimeMillis)]
        (swap! request-times conj (- end start))))))
(mapv perform-request (repeat 200 my-req-fn))
Does this seem like a reasonable approach to measuring how long all of the individual requests take?
I'd change request-times to an atom+map and then use a unique id for each request, add the start time, and use a callback to add the end time. Then at the end, collect all the deltas.
@U04V15CAJ for the callback part, do you mean through :async-then? If I'm passing in the collection of req-fns using map-indexed to generate the unique id, I'm not sure how I should get the unique ids into the :async-then callback fn.
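One way to make that work is to close over the id when the request is built, rather than passing pre-made req-fns. A rough sketch, assuming :async-then is called with the response map; the endpoint URL and the timings atom are placeholders:
(require '[babashka.http-client :as http])

(def timings (atom {}))

(defn timed-request! [id base-request]
  ;; record the start, then let the callback (which closes over id) record the end
  (swap! timings assoc-in [id :start] (System/currentTimeMillis))
  (http/request (assoc base-request
                       :async true
                       :async-then (fn [resp]
                                     (swap! timings assoc-in [id :end] (System/currentTimeMillis))
                                     resp))))

;; fire 200 requests, wait for all of them, then compute the deltas
(run! deref (mapv #(timed-request! % {:uri "http://localhost:8080/endpoint" :method :get})
                  (range 200)))

(map (fn [{:keys [start end]}] (- end start)) (vals @timings))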
Ah, okay. Yes, I can make it work like that. I do wonder if the cost of the extra future is significant enough to merit re-writing it. Could the extra future skew the results?
@U04V15CAJ Alright. Thanks for suggesting an alternate implementation here!