This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2017-02-06
I’m trying to understand the difference between s/coll-of and s/every.
> Note that 'every' does not do exhaustive checking, rather it samples coll-check-limit elements. Nor (as a result) does it do any conforming of elements. 'explain' will report at most coll-error-limit problems. Thus 'every' should be suitable for potentially large collections.
What does it mean that every doesn’t do any conforming of elements?
(s/conform (s/every number?) (range 1000)) seems to do the conforming pretty well!
@viebel Try with a spec that conforms to a different value
Like s/or
Like this:
(s/conform (s/every (s/or :n number?
                          :s string?))
           (concat (range 1000)))
Yeah that should work
If I understand correctly s/every leaves the data as-is while s/coll-of conforms it
It will only check whether the values match the spec
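[editor’s note] The distinction is easy to see at a REPL. A minimal sketch, using the same s/or spec discussed above but a small input for brevity:

```clojure
(require '[clojure.spec :as s])

;; coll-of conforms each element to the [tag value] form of s/or
(s/conform (s/coll-of (s/or :n number? :s string?)) [1 "a"])
;-> [[:n 1] [:s "a"]]

;; every only validates (by sampling); the data comes back as-is
(s/conform (s/every (s/or :n number? :s string?)) [1 "a"])
;-> [1 "a"]
```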
I cannot find the correct way to call clojure.spec.test/check on my function that might read a file:
(require '[clojure.spec :as s]
         '[clojure.spec.gen :as gen])
(import 'java.io.File)

(s/def :read/file #(instance? File %))
(s/def :read/content string?)
(s/def :read/fail nil?)

(defn read-file [^File f] (when (.isFile f) (slurp f)))

;; s/or takes alternating tag/spec pairs
(s/fdef read-file
  :args (s/cat :f :read/file)
  :ret (s/or :content :read/content :fail :read/fail)
  ;; :ret arrives here already conformed, i.e. as a [tag value] pair
  :fn (fn [{{f :f} :args [tag _] :ret}]
        (if (.isFile f)
          (= tag :content)
          (= tag :fail))))

(defn gen-file []
  (gen/elements (map (fn [^String f] (File. f))
                     ["./project.clj" "foo"])))

(s/exercise :read/file 4 {:read/file gen-file})
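[editor’s note] check takes generator overrides in an opts map under :gen, keyed by spec name. A sketch, assuming the fdef and gen-file from the snippet above:

```clojure
(require '[clojure.spec.test :as stest])

;; override the generator for :read/file so check only produces
;; the two File instances from gen-file
(stest/check `read-file {:gen {:read/file gen-file}})
```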
I’m reading (with enthusiasm) "CREATING A SPEC FOR DESTRUCTURING” http://blog.cognitect.com/blog/2017/1/3/spec-destructuring. Everything is pretty clear except one point related to s/merge at the end of the article
;; collection of tuple forms
(s/def ::map-bindings
(s/every (s/or :mb ::map-binding
:nsk ::ns-keys
:msb (s/tuple #{:as :or :keys :syms :strs} any?)) :into {}))
(s/def ::map-binding-form (s/merge ::map-bindings ::map-special-binding))
I don’t understand why inside ::map-bindings, we have the :msb part
AFAIU, the :msb part relates only to ::map-special-binding
any idea @dergutemoritz or @alexmiller ?
@viebel It makes sure that no keywords other than the ones in the set are present
::map-special-binding is defined in terms of s/keys, which is an open key set by design
At least that's what I surmise is the motivation. Being the author of that post, @alexmiller might have more insight 🙂
That makes sense. But I think it would be clearer if (s/tuple #{:as :or :keys :syms :strs} any?) was part of ::map-special-binding. What do u think @dergutemoritz ?
Not sure I would find it clearer that way
The way it is now makes the ::map-special-binding spec a tad more widely usable I guess
ok. makes sense
making ::map-special-binding reusable is not really a goal
the tuple check can’t be part of ::map-special-binding b/c that spec is a map spec which is merged with the rest
I think it’s probably a reasonable criticism that this spec is too restrictive (from a future-looking point of view) in narrowing to just this set of options as we know it now.
the intention is really to segment the space of possible tuple forms to bindings (which are simple symbols, vectors, or maps), qualified keywords whose names are “keys” or “syms”, or special options. Here I’ve listed a concrete set of options but lost the open-ness of something like keys. That could be addressed by replacing the #{:as …} with something more generic like simple-keyword?.
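[editor’s note] A sketch of that more open variant, assuming the ::map-binding and ::ns-keys specs from the blog post are in scope:

```clojure
;; same shape as the post's ::map-bindings, but the special-option
;; tuples now accept any simple keyword rather than a fixed set
(s/def ::map-bindings
  (s/every (s/or :mb ::map-binding
                 :nsk ::ns-keys
                 :msb (s/tuple simple-keyword? any?)) :into {}))
```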
that makes some sense to me ^^ another option would be to use conformers to rewrite the :as into a ::as node, so that its value can be properly checked to be a simple-symbol, for example
that seems like the way to recover the openness with all the benefits of the global namespaced keys validation
@alexmiller I don’t understand why in this case, the open-ness is desirable. I mean if someone passes :wrong-keys (instead of :keys) to a let statement, it should be invalidated...
@bbloom no desire to use conformers for something like this
@viebel the same reason openness is desirable elsewhere (see Rich’s keynote from the conj)
that is, if you specify a constraint, then removing the constraint later is a breaking change, not an additive change
@viebel just to make up some straw man extension, consider if you wanted to add some metadata to the match
@alexmiller sorry, i was thinking about :as in vectors - i think conformers make sense there
oooh, you also do that, this is about accumulating extra information in the parse - ok, i think that makes sense.... sorry for thinking aloud here
this is a particularly tricky instance of what has been called a hybrid map
which has both a k/v spec + a keys-type spec
the tricky part being the :<ns>/keys and :<ns>/syms which are like keys but only semi-fixed
yeah, my experience w/ hybrid maps has been that they are more trouble than they are worth except for very limited syntax use cases
nonetheless, they exist :)
well, I have some ideas about that but haven’t had a chance to put a patch on there
but basically it needs to not use s/keys
as that has behavior that we don’t want in this particular case
namely, matching all keys as specs
yeah, i think the patch i saw there was like a flag to disable the nice open validation of keys - which seems like the wrong way to go about it. easier to factor out the parts you want and then call the underlying part directly
that patch is dead, not doing that
and I think I rewrote the ticket to remove that as an option
it seems like you can't use spec to spec a map like {::results ... ::type ...} where the value associated with the key ::type determines what spec is used against the value associated with the ::results key
it seems like the best you could do would be something with multi-spec, with a different ::results spec (registered under a different key) for each possible value of ::type
I am trying to spec what I have been thinking of as a result set, a map with a structure like {::results [...] ::next ...} where ::next is used for pagination, and the items under ::results will vary with the query that was run. my first thought was to slap a type tag on the result set, and use a multi-spec, but that doesn't work.
hiredman: we’ve talked about context sensitivity a bunch of times here in the past - you have a number of options
I think I will just have a bunch of slightly different specs, remove any kind of polymorphism
well, easiest, since consumers consume json, is to differ based on namespace, which just disappears in the json encoding
what about something like
(require '[clojure.spec :as s])

(s/def ::type keyword?)
(s/def ::value any?)

(defmulti t-val identity)
(defmethod t-val :a [_] int?)
(defmethod t-val :b [_] string?)

;; s/keys takes :req/:opt vectors, not bare keys
(s/def ::tagged (s/and (s/keys :req [::type ::value])
                       #((t-val (::type %)) (::value %))))

(def foo {::type :a ::value 1})
(def bar {::type :b ::value "2"})
(def baz {::type :b ::value 1})

(s/valid? ::tagged foo) ;-> true
(s/valid? ::tagged bar) ;-> true
(s/valid? ::tagged baz) ;-> false
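[editor’s note] The multi-spec route hiredman mentions could look like this. A sketch reusing ::type and ::value from the example above; tagged-mm and ::tagged-ms are made-up names:

```clojure
(require '[clojure.spec :as s])

;; dispatch on the ::type key of the map being checked
(defmulti tagged-mm ::type)
(defmethod tagged-mm :a [_]
  (s/and (s/keys :req [::type ::value]) #(int? (::value %))))
(defmethod tagged-mm :b [_]
  (s/and (s/keys :req [::type ::value]) #(string? (::value %))))

;; second arg is the retag key, used when generating values
(s/def ::tagged-ms (s/multi-spec tagged-mm ::type))

(s/valid? ::tagged-ms {::type :a ::value 1}) ;-> true
(s/valid? ::tagged-ms {::type :b ::value 1}) ;-> false
```

Unlike the s/and + predicate version, this keeps s/explain and generation aware of which branch was chosen.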