This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
I have a question about spec. I'd like to check whether an infinite lazy seq is valid, but this code loops forever. What should I do?
(s/valid? (s/coll-of int?) (range))
how could you check an infinite sequence? you’d have to check every element, which by definition would take an infinite amount of time
I just want to check 100 of them. So I've used it like this so far, but it's too messy.
(s/def ::coll #(s/valid? (s/coll-of int?) (take 100 %)))
(s/fdef func :args (s/cat :coll ::coll) :ret ::coll)
So just (s/def ::coll (s/every int?)) is exactly the same as above (actually better for gen etc.)
That's interesting. I thought it was the same thing because s/coll-of calls s/every. I didn't know there was a ::conform-all in coll-of. Thanks
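A minimal sketch of why this works (the sample values are illustrative): `s/every` only samples a bounded prefix of the collection, governed by `s/*coll-check-limit*` (101 by default), so validating an infinite lazy seq terminates, whereas `s/coll-of` conforms every element and never returns on `(range)`.

```clojure
(require '[clojure.spec.alpha :as s])

;; s/every checks at most s/*coll-check-limit* elements (101 by default),
;; so it is safe to call on an infinite lazy seq:
(s/valid? (s/every int?) (range))            ;; terminates, samples a prefix
(s/valid? (s/every int?) (cons :a (range)))  ;; false: first sampled element fails
```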
This sample REPL session uses integers instead of maps, but the code works the same regardless of whether the inner collections contain maps or any other type:
user=> (mapcat identity [[1 2] [3 4] [5 6]])
(1 2 3 4 5 6)
user=> (vec (mapcat identity [[1 2] [3 4] [5 6]]))
[1 2 3 4 5 6]
I'm having issues converting char to int in my cljs REPL. This should not result in 0, right?
cljs::formula.events=> (int (char 97))
0
> (doc int)
> Coerce to int by stripping decimal places.
(int "a")
WARNING: cljs.core/bit-or, all arguments must be numbers, got [string number] instead at line 1 <cljs repl>
0
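What's happening: ClojureScript has no character type, so `(char 97)` returns the one-character string `"a"`, and `(int "a")` falls through to a bitwise coercion that yields 0 along with the warning above. A small sketch of the contrast (the `.charCodeAt` workaround assumes a JS host such as a browser or Node):

```clojure
;; JVM Clojure: \a is a java.lang.Character, and int coerces it numerically.
(int \a)  ;; => 97

;; ClojureScript: (char 97) returns the string "a"; (int "a") bit-ors the
;; string with 0 and yields 0 (plus the warning above). A common workaround
;; on a JS host is to ask the string for its code unit:
;;   (.charCodeAt "a" 0)  ;; => 97
```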
> An int value represents all Unicode code points, including supplementary code points.
The JVM type system has distinct types for char and short, even though the values of those two types have a one-to-one correspondence to each other.
on that note, how does it store chars with codepoints >64k? I know that Java uses UTF-16 encoding, but I guess you can't store an (e.g.) emoji in a single char
there are apis that understand this (and some older ones that do not, so some care is required)
I haven't read this full wikipedia page on utf-16 to see how good of a job it does explaining this, but there is a range of 16-bit values called "surrogates" in UTF-16 encoding of Unicode, such that a pair of surrogates can represent all Unicode code points that do not fit in a single 16-bit value: https://en.wikipedia.org/wiki/UTF-16
Yeah right. Reading up on it (and what Alex says), if you charAt a char requiring 4 bytes to represent, you get half of it. Thanks 🙂!
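A quick JVM REPL illustration of that half-a-pair behavior, using U+1F600 (😀), which needs two UTF-16 code units:

```clojure
(count "😀")            ;; => 2, String length counts UTF-16 code units
(int (.charAt "😀" 0))  ;; => 55357, the high surrogate 0xD83D — half the pair
(.codePointAt "😀" 0)   ;; => 128512, i.e. 0x1F600, the full code point
```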
Wondering how that came to be, but apparently surrogate pairs were not a thing when Java was first released. The more you know!
My understanding is that when Unicode first started, they thought 16 bits would be enough for all code points. Java and a few other software systems started with UTF-16 with no need for surrogate pairs, then later added them when 16 bits was no longer enough.
The "History" section of the Wikipedia article on UTF-16 summarizes a few main points of who did what, although not necessarily exactly when (although the cited references might)