#beginners
2020-10-25
lech43 10:10:39

I have a question about spec. I'd like to check whether an infinite lazy seq is valid, but this code loops forever. What should I do?

(s/valid? (s/coll-of int?) (range)) ; never returns: coll-of conforms every element of the infinite seq

schmee 10:10:15

how could you check an infinite sequence? you’d have to check every element, which by definition would take an infinite amount of time

lech43 10:10:16

I just want to check the first 100 of them. So I've used it like this so far, but it's too messy.

(s/def ::coll #(s/valid? (s/coll-of int?) (take 100 %)))

(s/fdef func
  :args (s/cat :coll ::coll)
  :ret ::coll)

Alex Miller (Clojure team) 13:10:29

s/every and s/every-kv do this already

lech43 22:10:14

That's interesting. I thought it was the same thing because s/coll-of calls s/every. I didn't know there was a ::conform-all in coll-of. Thanks

Alex Miller (Clojure team) 13:10:12

They will check a bounded sample (up to 100 by default)

Alex Miller (Clojure team) 13:10:39

So just (s/def ::coll (s/every int?)) is exactly the same as above (and actually better for gen etc.)
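A minimal REPL sketch of the difference (assuming clojure.spec.alpha is aliased as s, as in the snippets above): s/every only validates a bounded sample, so it terminates on infinite seqs where s/coll-of does not.

user=> (require '[clojure.spec.alpha :as s])
nil
user=> (s/valid? (s/every int?) (range)) ; checks only a bounded prefix (see s/*coll-check-limit*)
true
user=> (s/valid? (s/every int?) (cons :a (range))) ; a bad element within the sample is still caught
false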

Marek Jovic 17:10:38

Hello, how do I flatten a vector of vectors of maps into a vector of those maps?

andy.fingerhut 17:10:13

This sample REPL session has integers instead of maps, but the code works regardless of whether the innermost elements are maps or any other type:

user=> (mapcat identity [[1 2] [3 4] [5 6]])
(1 2 3 4 5 6)
user=> (vec (mapcat identity [[1 2] [3 4] [5 6]]))
[1 2 3 4 5 6]

👍 3
jsn 17:10:53

apply concat does that
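For completeness, a sketch of the apply concat version of the same session (same result; mapcat identity and apply concat are interchangeable here):

user=> (vec (apply concat [[1 2] [3 4] [5 6]]))
[1 2 3 4 5 6]
user=> (vec (apply concat [[{:a 1}] [{:b 2} {:c 3}]])) ; with maps at the bottom, as asked
[{:a 1} {:b 2} {:c 3}]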

dumrat 18:10:09

I'm having issues converting char to int on my cljs REPL. This shouldn't result in 0, right?

cljs::formula.events=> (int (char 97))
0

dpsutton 18:10:34

(int "a")
WARNING: cljs.core/bit-or, all arguments must be numbers, got [string number] instead at line 1 <cljs repl>
0

> (doc int): Coerce to int by stripping decimal places.

hiredman 18:10:58

(yet another clj and cljs difference)

dpsutton 19:10:04

This is more of a host language or VM difference to me

dpsutton 19:10:33

There is no char type in js. And on the jvm chars exist and are ints

hiredman 19:10:08

they are not ints

dpsutton 19:10:01

> An int value represents all Unicode code points, including supplementary code points.

dpsutton 19:10:06

Is what I’m going from

andy.fingerhut 19:10:56

The JVM type system has distinct types for char and short, even though the values of those two types have a one-to-one correspondence to each other.
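A small sketch contrasting the two hosts (the ClojureScript side uses JavaScript's standard String.prototype.charCodeAt, since cljs has no char type and \a is just a one-character string):

;; Clojure on the JVM: \a is a java.lang.Character
user=> (int \a)
97
user=> (instance? Character \a)
true

;; ClojureScript: ask the string for its UTF-16 code unit instead
cljs.user=> (.charCodeAt "a" 0)
97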

Lennart Buit 20:10:18

on that note, how does Java store chars with code points >64k? I know that Java uses UTF-16 encoding, but I guess you can't store an emoji (for example) in a char?

Lennart Buit 20:10:04

(more just wonderment than a real problem I'm struggling with :p)

Alex Miller (Clojure team) 20:10:14

there are apis that understand this (and some older ones that do not, so some care is required)

andy.fingerhut 20:10:34

I haven't read this full wikipedia page on utf-16 to see how good of a job it does explaining this, but there is a range of 16-bit values called "surrogates" in UTF-16 encoding of Unicode, such that a pair of surrogates can represent all Unicode code points that do not fit in a single 16-bit value: https://en.wikipedia.org/wiki/UTF-16

Lennart Buit 20:10:14

Yeah right. Reading up on it (and what Alex says), if you charAt a char requiring 4 bytes to represent, you get half of it. Thanks 🙂!

Alex Miller (Clojure team) 20:10:57

the codepoint apis understand that stuff
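For example, a REPL sketch with U+1F600 (😀), which needs a surrogate pair in UTF-16:

user=> (count "😀") ; two UTF-16 code units
2
user=> (int (.charAt "😀" 0)) ; charAt hands back half of it: the high surrogate
55357
user=> (.codePointAt "😀" 0) ; the codepoint API reassembles the full code point
128512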

Lennart Buit 20:10:31

Wondering how that came to be, but apparently surrogate pairs were not a thing when Java was first released. The more you know!

Alex Miller (Clojure team) 20:10:22

yeah, wasn't introduced until much later

andy.fingerhut 20:10:26

My understanding is that when Unicode first started, they thought 16 bits would be enough for all code points. Java and a few other software systems started with UTF-16 with no need for surrogate pairs, then later added them when 16 bits was no longer enough.

andy.fingerhut 20:10:09

The "History" section of the Wikipedia article on UTF-16 summarizes a few main points of who did what, although not necessarily exactly when (though the cited references might)

Lennart Buit 20:10:52

The history of computing is often fascinating 🙂 E.g.: we tried x, which turned out to be an oversimplification, then y and z, also oversimplifications, so now we've started over with a, with little bits of legacy x, y, and z in there.