#clojure-spec
2017-02-06
Yehonathan Sharvit06:02:18

I’m trying to understand the difference between s/coll-of and s/every. > Note that 'every' does not do exhaustive checking, rather it samples coll-check-limit elements. Nor (as a result) does it do any conforming of elements. 'explain' will report at most coll-error-limit problems. Thus 'every' should be suitable for potentially large collections.

Yehonathan Sharvit06:02:07

What does it mean that every doesn’t do any conforming of elements?

Yehonathan Sharvit06:02:41

(s/conform (s/every number?) (range 1000)) seems to do the conforming pretty well!

dergutemoritz08:02:07

@viebel Try with a spec that conforms to a different value

Yehonathan Sharvit08:02:22

Like this:

(s/conform (s/every (s/or :n number?
                          :s string?))
           (concat (range 1000)))

dergutemoritz08:02:15

Yeah that should work

Yehonathan Sharvit08:02:28

If I understand correctly s/every leaves the data as-is

Yehonathan Sharvit08:02:45

while s/coll-of conforms it

dergutemoritz08:02:56

It will only check whether the values match the spec
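
A minimal sketch of that difference, assuming the clojure.spec namespace of the time (the collection and s/or tags here are just illustrative):

(require '[clojure.spec :as s])

;; coll-of conforms every element, so the s/or tags show up in the result
(s/conform (s/coll-of (s/or :n number? :s string?)) [1 "a" 2])
;;=> [[:n 1] [:s "a"] [:n 2]]

;; every only validates (sampling up to *coll-check-limit* elements) and
;; returns the input collection unchanged
(s/conform (s/every (s/or :n number? :s string?)) [1 "a" 2])
;;=> [1 "a" 2]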

kassapo09:02:38

I cannot find the correct way to call clojure.spec.test/check on my function that might read a file

kassapo09:02:44

(require '[clojure.spec :as s]
         '[clojure.spec.gen :as gen])
(import '(java.io File))

(s/def :read/file #(instance? File %))
(s/def :read/content string?)
(s/def :read/fail nil?)

;; returns the file's contents, or nil when f is not a regular file
(defn read-file [^File f] (when (.isFile f) (slurp f)))

(s/fdef read-file
  :args (s/cat :f :read/file)
  :ret (s/or :content :read/content :fail :read/fail)
  :fn (fn [{{f :f} :args [_ ret] :ret}]
        (if (.isFile f)
          (s/valid? :read/content ret)
          (s/valid? :read/fail ret))))

(defn gen-file []
  (gen/elements (map (fn [^String f] (File. f))
                     ["./project.clj" "foo"])))

(s/exercise :read/file 4 {:read/file gen-file})
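
For reference, a plausible way to wire that generator into check: an untested sketch assuming the clojure.spec.test alpha API, where check takes an opts map with a :gen map of spec name to generator-returning fn.

(require '[clojure.spec.test :as stest])

;; generatively test read-file, overriding the generator for :read/file
(stest/check `read-file {:gen {:read/file gen-file}})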

Yehonathan Sharvit10:02:07

I’m reading (with enthusiasm) “Creating a spec for destructuring” (http://blog.cognitect.com/blog/2017/1/3/spec-destructuring). Everything is pretty clear except one point related to s/merge at the end of the article.

Yehonathan Sharvit10:02:27

;; collection of tuple forms
(s/def ::map-bindings
  (s/every (s/or :mb ::map-binding
                 :nsk ::ns-keys
                 :msb (s/tuple #{:as :or :keys :syms :strs} any?)) :into {}))

(s/def ::map-binding-form (s/merge ::map-bindings ::map-special-binding))

Yehonathan Sharvit10:02:13

I don’t understand why inside ::map-bindings, we have the :msb part

Yehonathan Sharvit10:02:45

AFAIU, the :msb part relates only to ::map-special-binding

dergutemoritz12:02:00

@viebel It makes sure that no keywords other than the ones in the set are present

dergutemoritz12:02:31

::map-special-binding is defined in terms of s/keys, which has an open key set by design

dergutemoritz12:02:07

At least that's what I surmise is the motivation. Being the author of that post, @alexmiller might have more insight 🙂

Yehonathan Sharvit12:02:59

That makes sense. But I think it would be clearer if (s/tuple #{:as :or :keys :syms :strs} any?) were part of ::map-special-binding. What do you think @dergutemoritz?

dergutemoritz12:02:45

Not sure I would find it clearer that way

dergutemoritz12:02:42

The way it is now makes the ::map-special-binding spec a tad more widely usable, I guess

Alex Miller (Clojure team)18:02:08

making ::map-special-binding reusable is not really a goal

Alex Miller (Clojure team)18:02:38

the tuple check can’t be part of ::map-special-binding because that spec is a map spec, which is merged with the rest

Alex Miller (Clojure team)18:02:39

I think it’s probably a reasonable criticism that this spec is too restrictive (from a future-looking point of view) in narrowing to just this set of options as we know it now.

Alex Miller (Clojure team)18:02:38

the intention is really to segment the space of possible tuple forms into bindings (which are simple symbols, vectors, or maps), qualified keywords whose names are “keys” or “syms”, or special options. Here I’ve listed a concrete set of options but lost the openness of something like keys. That could be addressed by replacing the #{:as …} with something more generic like simple-keyword?.
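
A rough sketch of that replacement, keeping the shape of the blog post's spec (::map-binding and ::ns-keys as defined there), with simple-keyword? standing in for the fixed option set:

(s/def ::map-bindings
  (s/every (s/or :mb ::map-binding
                 :nsk ::ns-keys
                 :msb (s/tuple simple-keyword? any?))
           :into {}))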

bbloom19:02:35

that makes some sense to me ^^ another option would be to use conformers to rewrite the :as into a ::as node, so that its value can be properly checked to be a simple-symbol, for example

bbloom19:02:32

that seems like the way to recover the openness with all the benefits of the global namespaced keys validation

Yehonathan Sharvit19:02:12

@alexmiller I don’t understand why, in this case, the openness is desirable. I mean, if someone passes :wrong-keys (instead of :keys) to a let statement, it should be invalidated...

Alex Miller (Clojure team)19:02:22

@bbloom no desire to use conformers for something like this

Alex Miller (Clojure team)19:02:51

@viebel the same reason openness is desirable elsewhere (see Rich’s keynote from the conj)

Alex Miller (Clojure team)19:02:19

that is, if you specify a constraint, then removing the constraint later is a breaking change, not an additive change

bbloom19:02:34

@viebel just to make up some straw man extension, consider if you wanted to add some metadata to the match

bbloom19:02:57

{:keys [x y] :types {x int}} ;-)

bbloom19:02:10

you don’t want the spec to reject that :types entry

bbloom19:02:35

@alexmiller sorry, i was thinking about :as in vectors - i think conformers make sense there

bbloom19:02:44

for maps, it’s already a map 😛

bbloom19:02:50

the :msb form is interesting tho - why not just use a :req-un for the :as key?

bbloom19:02:12

i must be missing some subtlety

bbloom19:02:43

oooh, you also do that, this is about accumulating extra information in the parse - ok, i think that makes sense.... sorry for thinking aloud here

Alex Miller (Clojure team)19:02:11

this is a particularly tricky instance of what has been called a hybrid map

Alex Miller (Clojure team)19:02:25

which has both a k/v spec + a keys-type spec

Alex Miller (Clojure team)19:02:05

the tricky part being the :<ns>/keys and :<ns>/syms which are like keys but only semi-fixed

bbloom19:02:08

yeah, my experience w/ hybrid maps has been that they are more trouble than they are worth except for very limited syntax use cases

Alex Miller (Clojure team)19:02:19

nonetheless, they exist :)

bbloom19:02:27

i’ve run into the ::keys bug

bbloom19:02:53

it is not obvious what to do about that 🙂 so i just renamed my thing, heh

Alex Miller (Clojure team)19:02:07

well, I have some ideas about that but haven’t had a chance to put a patch on there

Alex Miller (Clojure team)19:02:16

but basically it needs to not use s/keys

Alex Miller (Clojure team)19:02:39

as that has behavior that we don’t want in this particular case

Alex Miller (Clojure team)19:02:01

namely, matching all keys as specs

bbloom19:02:10

yeah, i think the patch i saw there was like a flag to disable the nice open validation of keys - which seems like the wrong way to go about it. easier to factor out the parts you want and then call the underlying part directly

bbloom19:02:24

ie composition over parameterization

Alex Miller (Clojure team)19:02:28

that patch is dead, not doing that

bbloom19:02:48

i assumed that 🙂

Alex Miller (Clojure team)19:02:57

and I think I rewrote the ticket to remove that as an option

bbloom19:02:08

cool - i have only been following at a distance

hiredman21:02:55

it seems like you can't use spec to spec a map like {::results ... ::type ...} where the value associated with the key ::type determines what spec is used against the value associated with the ::results key

hiredman21:02:16

it seems like the best you could do would be something with multi-spec, with a different ::results spec (registered under a different key) for each possible value of ::type

hiredman21:02:16

I am trying to spec what I have been thinking of as a result set, a map with a structure like {::results [...] ::next ...} where ::next is used for pagination, and the items under ::results will vary with the query that was run. My first thought was to slap a type tag on the result set and use a multi-spec, but that doesn't work.

bbloom22:02:26

hiredman: we’ve talked about context sensitivity a bunch of times here in the past - you have a number of options

bbloom22:02:35

none of them particularly great

bbloom22:02:13

the simplest one is to just call the spec you want yourself

bbloom22:02:41

well it depends on your use case

hiredman22:02:51

yeah, for mine it is too opaque 🙂

bbloom22:02:03

i mean, it’s just an s/conform call

hiredman22:02:27

I think I will just have a bunch of slightly different specs, remove any kind of polymorphism

bbloom22:02:38

yeah, so that’s the next simplest solution

hiredman22:02:43

for now, or until something better comes up

bbloom22:02:48

one key for each possible type and then use multi-spec

bbloom22:02:01

so you have like ::foo-results and ::bar-results
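
One way that suggestion might look, sketched with hypothetical ::foo-results / ::bar-results specs and a ::type tag to dispatch on, per the earlier messages:

;; hypothetical per-type result specs
(s/def ::type keyword?)
(s/def ::next any?)
(s/def ::foo-results (s/coll-of map?))
(s/def ::bar-results (s/coll-of string?))

;; dispatch the whole result-set map on its ::type tag
(defmulti result-set-type ::type)
(defmethod result-set-type :foo [_]
  (s/keys :req [::type ::foo-results] :opt [::next]))
(defmethod result-set-type :bar [_]
  (s/keys :req [::type ::bar-results] :opt [::next]))

(s/def ::result-set (s/multi-spec result-set-type ::type))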

hiredman22:02:03

I am generating documentation and json schema from these specs

hiredman22:02:41

well, easiest, since consumers consume JSON, is to differentiate based on namespace, which just disappears in the JSON encoding

bbloom22:02:59

heh, well i guess that works 😛

bronsa22:02:12

what about something like

(require '[clojure.spec :as s])

(s/def ::type keyword?)
(s/def ::value any?)

(defmulti t-val identity)
(defmethod t-val :a [_] int?)
(defmethod t-val :b [_] string?)

(s/def ::tagged (s/and (s/keys :req [::type ::value])
                       #((t-val (::type %)) (::value %))))

(def foo {::type :a ::value 1})
(def bar {::type :b ::value "2"})
(def baz {::type :b ::value 1})

(s/valid? ::tagged foo) ;-> true
(s/valid? ::tagged bar) ;-> true
(s/valid? ::tagged baz) ;-> false

hiredman22:02:59

once you start building using functions instead of the combinators (like s/keys), explain is less useful, and any kind of treatment of specs as data becomes more of a pain (like if you are walking them to generate something else)
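
To illustrate that with the ::tagged spec above: explain reports the opaque anonymous predicate failing over the whole map, rather than a per-key problem located at ::value.

(s/explain ::tagged {::type :a ::value "not-an-int"})
;; the reported problem is the #(...) predicate over the whole map,
;; not a failure at ::value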