
How are you guys spec'ing recursive functions where intermediate return values have a different spec than the final return value? Are you just adding an or to the :ret or are you writing your function in such a way that that isn't possible (e.g. bundling the recursion in the function by creating an anonymous fn inside your function)?


@kenny: Can you give an example of such a scenario?


(I generally try to avoid functions that can return multiple types although I’ll often have functions that return something-or-nil)


(defn new-matrix-nd
  "Returns a new general matrix of the given shape. Shape can be any sequence of
  integer dimension sizes (including 0 dimensions)."
  [dims]
  (if-let [dims (seq dims)]
    (vec (repeat (first dims) (new-matrix-nd (next dims))))
    0.0))


(s/fdef new-matrix-nd
        :args (s/cat :dims (s/nilable coll?))
        ;; ret is not a matrix because new-matrix-nd is a recursive fn
        :ret (s/or :m vector? :n number?))


Ideally the spec should be

(s/fdef new-matrix-nd
        :args (s/cat :dims (s/nilable coll?))
        :ret matrix?)


so (new-matrix-nd []) produces 0.0?


But the final return value is always a matrix


Unless you accidentally call it with nil or an empty sequence of dimensions...


So a zero-dimensional "matrix" is a scalar… hmm… that’s an interesting one...


I can wrap up the recursion inside the function


That would probably be cleaner to the outside world


I’m trying to think how specs are going to be much use to you here tho’… since any call site is going to get "vector or scalar" as a result...


Hmm.. Yeah it might not make sense.


even your :args spec doesn’t buy you much here: nil or an arbitrary collection — you probably want (s/coll-of integer? []) to give you more checking — and I’d be tempted to disallow nil and require an empty sequence of dimensions… unless you have a really good reason for allowing nil there?


Maybe the Clojure/core folks need to give a bit more guidance in the Rationale as to how they expect spec to be used in the real world? /cc @alexmiller


I’d imagined it as a set of specifications for some pretty high-level parts of your code, around your domain objects, or possibly as the definition of a library API… but I’m not sure about the latter yet...


nil is just the terminating value. It seems wrapping the recursion inside the fn may solve this.


At work we’re looking at spec to write a specification of our domain model and some of the high-level business logic that operates on that. I don’t know how far "down" the call tree we’ll go...


The Rationale says "the supply chain isn’t burdened with correctness proof. Instead we check at the edges and run tests" … "Invariably, people will try to use a specification system to detail implementation decisions, but they do so to their detriment."


This cleans up the spec...

(defn new-matrix-nd
  "Returns a new general matrix of the given shape. Shape can be any sequence of
  integer dimension sizes (including 0 dimensions)."
  [dims]
  (letfn [(new-matrix-nd' [dims]
            (if-let [dims (seq dims)]
              (vec (repeat (first dims) (new-matrix-nd' (next dims))))
              0.0))]
    (new-matrix-nd' dims)))


(s/fdef new-matrix-nd
        :args (s/cat :dims (s/coll-of integer? []))
        :ret (s/or :m matrix? :s scalar?))


But matrix? still allows a scalar, right?


Otherwise (new-matrix-nd []) will pass the :args check but fail the :ret spec.


Edited to fix...


Your refactoring hasn’t changed the spec tho’...


No nil allowed in the fn


Well, the spec disallows it, but that was true of the earlier function as well.


You could pass nil to the previous fn. That should not be allowed.


Though that spec actually isn't working. (s/valid? (s/coll-of integer? []) nil) => true


(then I was wrong, s/coll-of allows nil?)


I guess. That doesn't seem right though


(s/valid? (s/* integer?) nil) => true


boot.user=> (s/conform (s/* integer?) nil)
boot.user=> (s/conform (s/coll-of integer? []) nil)


boot.user=> (s/conform (s/* integer?) (list 1 2 3))
[1 2 3]
boot.user=> (s/conform (s/coll-of integer? []) (list 1 2 3))
(1 2 3)


So coll-of is the spec for a (possibly nil) collection — sequence of values — and * is a regex to match zero or more items and returns a vector.


Was coll-of designed such that a possibly nil collection of values is valid?


Not sure how that can be an artifact as nil is inherently false so it would seem to be by design?


boot.user=> (source s/coll-checker)
(defn coll-checker
  "returns a predicate function that checks *coll-check-limit* items in a collection with pred"
  [pred]
  (let [check? #(valid? pred %)]
    (fn [coll]
      (c/or (nil? coll)
            (c/and (coll? coll)
                   (every? check? (take *coll-check-limit* coll)))))))
^ this is what coll-of calls and it has a very specific check for nil


Hmm. Wonder why


Because nil-punning is idiomatic and encouraged?


I thought the idea was to be explicit about nil with nilable


(that’s a question for Clojure/core really but…)


Maybe we can get some input tomorrow on this.


I suppose it does make sense coming from the Java static typing world because you could pass null in place of any argument. We don't need to adhere to that though.


But any function that accepts a (general) collection is almost certainly going to accept nil I’d say…?


Why make any assumptions?


As noted above/elsewhere, clojure.spec is deliberately opinionated 🙂


(not saying I agree, just observing)


Very different idioms. If null isn’t treated as falsey, then you have a very different style of programming.


For example, in some languages you’d have a Maybe monad and you would chain operations using monadic functions. Scala has Option[T] and flatMap, as I recall. Haskell has Maybe t and fmap (yes?). But in Clojure you’d chain such operations using some-> or some->> and they’d just be regular functions.
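For instance, a nil-punning chain with some-> might look like this (the lookup helper and data here are made up for illustration):

```clojure
;; Hypothetical example: some-> stops threading as soon as any step
;; returns nil, so no explicit nil checks are needed along the way.
(defn lookup-email [users id]
  (some-> users
          (get id)    ; nil when the id is unknown
          :contact
          :email))

(lookup-email {1 {:contact {:email "a@b.c"}}} 1) ;; => "a@b.c"
(lookup-email {} 1)                              ;; => nil
```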


Or you use if-let / when-let


I was chatting with someone the other day and saying that I almost never hit a NPE in Clojure these days. About the only time I still do is in Java interop code (usually calling into the String class for something!).


True (though we can follow that style of programming in Clojure, e.g. cats). Anyways, I suppose that is a tangent and doesn't really pertain to this situation.


Yeah you're totally right. I can't recall the last NPE I got.


(the irony of this conversation today is that someone reported an error against java.jdbc that a particular erroneous call to one of the functions produces a NPE — for invalid arguments)


The latest version of java.jdbc accepts those calls without error tho’...


Anyways, the primary reason I don't want nil to exist in this particular library is there is no such idea in the math world.


My feelings about clojure.spec are pretty similar to yours @seancorfield... mostly for library public APIs (think Pedestal, with namespaced keywords and data-based DSLs) or for general domain definition. Following the guidelines presented in Clojure Applied, I imagine using specs at domain edges and possibly Records only as an implementation detail of that domain (as suggested in the book).


I still need more thought (or some good rules of thumb) on the usage of namespaced keywords per se as a design tool. When to use a fully qualified one (by definition more coupled to the implementation) vs. just a simple namespace, aka :foo/bar. And when it will be fine to use unqualified ones as envisioned by the Clojure/core style


Records require non-namespaced keys tho’ so I wonder how that sits with specs? (and I don’t think clojure.spec actually supports records?)


Yeah, as far as I can tell it does not. Considering this style of "Records as an implementation detail", that makes sense, no? I was just following through the documentation snippet you shared


I’m interested to know how spec handles records, too.


I guess you treat the fields as non-namespaced keys?


I'm considering spec from an Information Model point of view. Let's say I wanted to create a spec for car. It might have keys like colour and year. That part is easy. But what about manufacturer where I don't want containment, and instead I want a reference - an id? I could say the spec for manufacturer was integer?, but how should I be more specific and say it is a reference to another spec. My mind is wandering forward and imagining a tool which can read the central registry and produce a nice SVG Information Model. But it can't do that unless it knows that the manufacturer in car is more than an integer? ... it is actually a reference to another entity.


@mikethompson: Sure you can have ::manufacturer-id that specifies actual valid manufacturer IDs (or whatever you needed) and for testing you could specify a generator that produced (a small subset of) manufacturer IDs.
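A minimal sketch of that idea (the predicate and ID set are made up; I'm assuming the clojure.spec.gen namespace from the current alphas):

```clojure
(require '[clojure.spec :as s]
         '[clojure.spec.gen :as gen])

;; Hypothetical: valid manufacturer IDs are positive integers, but the
;; attached generator only draws from a small known subset for testing.
(s/def ::manufacturer-id
  (s/with-gen (s/and integer? pos?)
    #(gen/elements [1001 1002 1003])))
```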


> I guess I would have to question your desire to define an API based on maps with non-keyword keys?

@seancorfield: Oh, believe me: If we were in control of this data we would use keyword keys. We're trying to enforce / check our assumptions about data we don't control.


@seancorfield: Yep, understand that. I'm just wondering about how that SVG Information Model Diagram tool could work. By looking in the central registry, how does it know that ::manufacturer-id is a reference to ::manufacturer. I can't see how that can be captured.


Almost seems like metadata on ::manufacturer-id


@mikethompson: maybe :manufacturer/id and :manufacturer/spec?


Conventions are needed somewhere, somehow.


Ahh. Metadata via naming convention 🙂


@zane: If the keys are strings when you get them from "the outside" then maybe keywordize them for use "inside"?


Anyway, all good. I was just wondering if I was missing something. Thanks


@mikethompson: No idea what Clojure/core might recommend here. I was trying to follow the Datomic lead there but I haven't looked at how referential specifications might work yet... Definitely something we'll run into at work as we move forward with this...


@zane: If you keywordize them "as-is" (maybe putting a standard namespace qualification on them too?), then you can validate with the full force of spec? At work, we're considering using a naming strategy with java.jdbc to create namespaced keys from the DB rather than just column names, so we can move straight to spec. Not sure how that will work out yet but we're planning a spike along those lines next week.


I'm still a bit new and wanting to bring in some typing. I'm not familiar with the respective libs, is spec effectively replacing schema and typed.clojure?


Hi @josh.freckleton -- have you read the Rationale for spec on the site?


(it tries to clarify the spec vs type system position)


@seancorfield: I have, I'm just trying to clear things up a bit, and I guess i'm wondering where I should invest my time 🙂


spec is different to both core.typed and Schema...


@seancorfield: oh, can I ask another question while we're talking about this? in schema or spec, could I have a case (switch) according to types, or defmethods on different types? Would there be a preferred method?


core.typed is intended to provide a type checking system based on static analysis of the code with annotations. Schema is a runtime validation system.


Not sure what you're asking... I think I'd have to see an example of what you're trying to do...?


(ahh, your typed vs. schema example makes sense. I still need to digest this) So say I have a few different "types", and I want to map a fn over them which is specific to their type. I've just started to learn about defmethods, and of course there are switch statements (`case`) switching on the type of the object under consideration. If I, say, had a list of things that could be all different types, what would be the best way to switch on their types?


Sorry, I still don't understand...


my fault, I'll be more specific...


I'm working with the free monad + interpreter pattern, trying to build some code that takes a data model and can interpret it in various ways. The important part is that the data model, essentially the Free Monad instance, is a recursive "type" where each "node" of this data model can be a different type. Currently, I signify their "type" with a :type key in a map, and I (case (:type obj) ...) to decide what to do with each "node". For example, one node can be a "User", another can be "Owned Books", and another can be "Email", "Password", etc. As I interpret this data model, I will want to treat Users differently from Emails, Passwords, etc. The 3 options I know of are to 1. defmethod according to different, real types, 2. case over their type if spec or schema allows me to switch on types, or 3. what I have now with a map and a :type key. Maybe I'm completely misguided in how I'm trying to solve this though, and I'm completely open to you annihilating my idea if it's far out in left field! Does that make more sense?


So you want polymorphic dispatch on the :type key in the hash map?


defmulti / defmethod would work for that. Or use a protocol with multiple defrecord implementations.


(the latter would be actual types and would no longer need the :type key)


defmulti would use a dynamic dispatch based on aspects of the data structure, in your case the :type key.


So, basically, defmulti is a sort of case statement dispatch 🙂


Ok, I'm glad that way would work, thx. And would there be an idiomatic way to do it without declaring polymorphic methods, like: (case (spec/type obj) :a "a" :b "b") or (case (schema/type obj) :a "a" :b "b")? I haven't seen anything in the docs that fits that ability to check the type of an obj, but maybe I've missed it?


Yeah, I think you're misunderstanding what spec and Schema are about.


Why wouldn't you just do (case (:type obj) :a "a" :b "b") at that point? The :type key gives you what you need.


Or (defmulti foo :type) (defmethod foo :a [obj] "a") (defmethod foo :b [obj] "b")


That's probably why I couldn't ask it clearly 😒 I was just curious if that was an option; it seems similar-ish to Haskell's matching on different types. I think that's the "feel" I was going for, but again I could be wrong since I'm new to the functional world.


for example:

f Nothing = foo
f (Just x) = bar
You've been a big help sean, helping me prune my search tree of what I need to study, haha. Thank you so much, and thanks for staying up late to help us noobs out!


In Clojure, Maybe would more likely just be nil or not-`nil` and you wouldn't pattern match, just (defn f [x] (if x 'bar 'foo))


oh sure, and there's also a maybe monad, I'm thinking for matching on many (10-100) custom types.


Right, so if you want polymorphic dispatch on a single argument (type), you probably want a protocol and defrecord.
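A sketch of that shape, using made-up node types from the interpreter example above (dispatch happens on the record's actual class, so no :type key is needed):

```clojure
;; Hypothetical protocol + record types standing in for the "nodes"
;; of the data model; each type supplies its own interpret behavior.
(defprotocol Interpret
  (interpret [node]))

(defrecord User [name]
  Interpret
  (interpret [_] (str "user: " name)))

(defrecord Email [address]
  Interpret
  (interpret [_] (str "email: " address)))

(map interpret [(->User "kenny") (->Email "a@b.c")])
;; => ("user: kenny" "email: a@b.c")
```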


If you want more ad hoc polymorphism, defmulti is probably your tool


Clojure is very different from things like Haskell, since there's no static type system, even tho' Clojure is strongly typed (at runtime).


So the idioms are very different.


Clojure's polymorphism is very different from OOP languages as well.


K, I had been considering these different options, and with your suggestion I'll zoom in on defrecord/`defmulti`. That helps me cut out tomorrow's work 🙂


@mikethompson @seancorfield: were you also thinking of doing registry visualization to learn things? Did a schema->graphviz tool some time ago, which helps a lot. Are you already working on this?


@ikitommi: thought experiment for me. So, no, not working on it.


I'm not a very visual person so it's a "no" on my end too.


I'm having difficulty spec-ing a map. Would like to re-use spec for the keys and values. In particular, I'm trying to spec this bit of Datomic Pull: map-spec = { ((attr-name | limit-expr) (pattern | recursion-limit))+ }. I've already got specs for ::attr-name, ::limit-expr, ::pattern and ::recursion-limit, but I can't figure out how to re-use them in the spec for ::map-spec


I thought s/map-of would be the ticket, but it works with predicates not specs.


@nwjsmith: See the Entity Maps section of the guide


you want s/keys basically


AFAICT s/keys is too strict, e.g. I can't see how to specify that all keys conform to (s/or :attr-name ::attr-name :limit-expr ::limit-expr). It will only let me specify particular keys.


I think what I'm looking for is something half-way between s/keys and s/map-of.


@seancorfield: Those suggestions are great. Thanks! Full support for maps with keys that aren't keywords would still make sense to me, though. There are plenty of domains where what you're modeling with data really does require keys of other types.


@zane: I suspect maps with "other types" of keys are more likely to be homogeneous since you wouldn’t be able to enumerate all the keys (i.e., to list which are required and which are optional).


We have some maps from strings to hash maps, and some from longs to strings etc. But those don’t have specific required / optional keys — they’re "lookup tables".


(it’s an interesting area of discussion tho’)


That makes sense, and matches my experience.


@zane reading through your conversation w/ @seancorfield it looks like you and I have the same issue. Have you looked into implementing clojure.spec/Spec?


@nwjsmith: Only superficially.


@seancorfield: I'm realizing that one problem with the keywordize-before-running-specs approach is that any resulting error messages will refer to the keyword names rather than the string ones. That limits their utility significantly (makes them less useful to the calling client, for example).


Maybe validating JSON data via spec just isn't an intended use case?


How are you getting the JSON data? When we get JSON from a web service etc, we have clj-http keywordize the result, and we do the same with Cheshire as well. Basically, if we ingest JSON in any manner, we always convert to keywords at the boundary and validate afterward.


(Aside: I'm learning a lot from your responses! Thanks!)


@seancorfield: Right, right. But our intent was to then return any error messages explain-data produces from speccing the request to the calling client.


explain-data's output is going to refer to the keywordized request, which could be confusing to the client if they're, say, unfamiliar with Clojure.


Hmm, I’m not convinced explain-data is a good format for clients since it refers to the internals of your specs … and what about localization of messages, and error codes for lookup in a reference manual for your API?


Hmm. I suppose I didn't notice anything about specs that made me think they were fundamentally internally-facing. I'm curious what your thinking is there.


I had imagined that error codes could be generated via a transformation of the explain-data output. (Localization isn't necessary in this particular case since this API is only used internally.)


If you’re going to transform the explain-data output, wouldn’t you then want to convert the keywords (including the spec names) to strings anyway?


Taking this example from the Guide:

(s/explain-data ::name-or-id :foo)
;;=> {:clojure.spec/problems
;;    {[:name] {:pred string?, :val :foo, :via []},
;;     [:id] {:pred integer?, :val :foo, :via []}}}


You’d want to convert :name and :id and whatever was in :via since those are keywords labeling parts of specs and paths through things — so why not convert the actual keys as well?
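One way to do that wholesale is a recursive walk (just a sketch using clojure.walk; note that name drops keyword namespaces, which may or may not be what a client wants):

```clojure
(require '[clojure.walk :as walk])

;; Recursively turn every keyword (spec names, path segments, and map
;; keys alike) into its name string before returning data to a client.
(defn stringify-keywords [data]
  (walk/postwalk #(if (keyword? %) (name %) %) data))

(stringify-keywords {:id 42 :via [:name]})
;; => {"id" 42, "via" ["name"]}
```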


(having said that, we haven’t gotten as far as trying this sort of thing to generate responses to clients — we currently have custom validation… and that validation checks the types of values passed in as well as the structure so I’m not sure how much we could delegate to spec… but it’s an interesting option)


Yes, you could totally convert the keywords back to strings, but you wouldn't have to if spec had better support for non-keyword keys in the first place. That's my whole point.


But this is totally something that could be added via an extension library, so I guess I should roll up my sleeves. ☺️


(BTW, Alex Miller just replied on clojure-dev that he’s on vacation this week which is why he’s not participating here right now — expect more feedback next week!)


Is it possible to dynamically generate specs from some sort of schema? I have a rest api which exposes a schema like this /contacts/schema (json)

{ "type": "Struct",
  "keys": {
    "first_name": {"type": "String"},
    "last_name": {"type": "String"},
    "is_organization": {"type": "Boolean"},
    "updated_at": {"type": "DateTime", "options": {"format": "%FT%TZ"}},
    "emails": {
      "type": "List",
      "schema": {
        "type": "Struct",
        "keys": {
          "label": {"type": "String"},
          "email": {"type": "String"}}}}}}
From reading the docs, it wasn't clear if I could generate dynamic specs from the above schema.


@iwankaramazow: Given that clojure.spec's API is pretty much all macros, my initial reaction would be "No". I would expect you could generate a Clojure source file from your JSON and use that to define the specs, however.


I’m not sure how much of the underlying spec representation is publicly exposed to support dynamic generation of specs...


@seancorfield: Thanks for the response! Feared this, I'll experiment a bit


I’ll take the contrary stance


it is possible to build a spec from that, but it requires building some new ‘primitives'


it is possible to build a fn that validates a map containing strings, and it is possible to build a test.check generator for that schema


it’ll take a bit of work, but it’s doable


@seancorfield: spec’s two primitives are predicates and test.check generators. If you can build a predicate for it, you can use it in spec. If you can build a test.check generator, you can use it to generate


s/keys is big because it generates a bunch of predicates, one for map? and then one for each key/value. there’s nothing special about it


@arohner: sounds logical, I'll brew me some algorithm. Thanks for the input


@arohner: What about dynamically (programmatically) registering those freshly built specs? Is enough of the machinery exposed for that?


(and, yes, I know it’s all feasible since even private functions are accessible and spec is all built in Clojure — but my "No" was meant to indicate whether it is a realistic goal to attempt and I still maintain the answer is negative there)


Sure, we have the tools to rewrite spec. I didn’t mean to go that far


My point is just that 1) any predicate is usable in spec 2) s/keys uses no special machinery 3) you can build a predicate to validate maps of strings 4) you can build a test.check generator to generate maps of strings


s/keys is the way it is because Rich is being opinionated, not that other map predicates shouldn’t exist


Some of the machinery you’d need is private in clojure.spec (e.g., res) and some of the macros that you’d need to replicate as functions are a lot of code. So, yeah, "it’ll take a bit of work" is certainly true.


ok, you’re right that you’d lose the per-field error messages


because currently, a predicate would look like (defn struct? [m] (and (map? m) (contains? m "type") …))


(s/conform (s/and map? (fn [m] (contains? m "type"))) {"type" "struct"})


s/keys just builds up a set of #(contains? % <key>) predicates. So you just (s/and map? <chain-of-key-checking-preds>)
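For example, a hand-rolled predicate along those lines (the names and required keys here are made up):

```clojure
(require '[clojure.spec :as s])

;; Hypothetical: validate a string-keyed "Struct" map by composing
;; plain predicates -- any predicate is usable as a spec.
(defn has-keys? [ks]
  (fn [m] (every? #(contains? m %) ks)))

(s/def ::struct (s/and map? (has-keys? ["type" "keys"])))

(s/valid? ::struct {"type" "Struct" "keys" {}}) ;; => true
(s/valid? ::struct {"type" "Struct"})           ;; => false
```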


hrm, so that gets you key checking, but doesn’t conform values yet


you could use more ands to check values, but ideally we could reuse the pass-specs-to-check-values thing


☝️ the sign of a Clojure developer with too much time on his hands… 😆 It’s easy to get drawn into the "Hmm, that’s an interesting problem!" rabbit-hole!


Which is preferred?

(s/conform (s/cat :v (s/spec (s/* number?))) [[1 2 3]])
=> {:v [1 2 3]}
(s/conform (s/cat :v (s/coll-of number? [])) [[1 2 3]])
=> {:v [1 2 3]}
I assume the latter as it feels cleaner.


@kenny: As far as I understand, the former will allow any seq, while the latter will only allow vectors. So depends on what you want.


@mario: Nope:

(s/valid? (s/cat :v (s/coll-of number? [])) '((1 2 3))) => true


The vec at the end is just the value s/conform conforms to.


I see. Then I don’t know 🙂


@kenny: why are you using the s/cat there? Do you really want to spec that it's a collection with only one other nested collection (of numbers)?


is there any way to say that I want 2 optional elements at the end of a cat but they have to be both present or not at all?

(s/conform (s/cat :e1 string? :e2 (s/? keyword?) :e3 (s/? keyword?)) ["foo"])


a bit like that, but if e2 is there, e3 should be there as well


@kenny @mario That second argument to s/coll-of is only used when working with the generator for that spec, I believe. This succeeds but probably isn't a good idea: (s/conform (s/? (s/coll-of number? '())) [[1 2 3]])


@stathissideris: think you want (s/cat :s string? :opt (s/? (s/cat :k1 keyword? :k2 keyword?)))


@jebberjeb: that looks like it could work, let me try!


works, many thanks 🙂


stathissideris: use s/& around the cat to pass in a pred


@arohner: thanks! that's an option too, but I think I'm happy with @jebberjeb 's solution


ah yes, I missed the ‘at the end’ part, so you can use an optional cat


in general, & is used to add extra constraints to a seq
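e.g. (a contrived sketch):

```clojure
(require '[clojure.spec :as s])

;; s/& runs the regex match first, then applies the extra predicates
;; to the conformed result -- here, requiring an even count.
(s/conform (s/& (s/* keyword?) #(even? (count %))) [:a :b])
;; => [:a :b]
(s/valid? (s/& (s/* keyword?) #(even? (count %))) [:a])
;; => false
```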


What is the best way to spec multiple arity functions where the args in one overload aren't necessarily related to the args in other overloads. Example:

(defn triangular-matrix
  "Returns a triangular matrix created from `coll`. `coll` is a 1D vector where each element will be used to create the
  triangular matrix. `upper?` is set to true to create an upper triangular matrix, false for a lower triangular matrix.
  `diagonal` is a 1D vector of elements on the diagonal. `off-diagonal` is a 1D vector of upper or lower matrix elements."
  ([coll upper?]
   ...)
  ([diagonal off-diagonal upper?]
   ...))


I think that’s: (s/alt :two (s/cat :coll (s/coll-of …) :upper? boolean?) :three (s/cat :diagonal (s/coll-of …) :off-diagonal ….))


Yeah that makes sense to me


Why is the common pattern to use cat instead of tuple for spec'ing args?


spec looks amazing so far, the only thing missing is coercion


this is the only point where I think schema is a bit more useful


but still, spec is much more expressive


@arohner: The only thing slightly annoying about that method is needing to specify keys for alt. The keys in this case are pretty arbitrary.


I have a map that looks like this, but if the key is, say, :x I have a special spec for that value only: (s/map-of (s/or :k keyword? :s string?) ::s/any)


how would I express that?


custom predicate?


validation works with a custom predicate:

(s/def ::simple-attr-map
  (s/map-of (s/or :k keyword? :s string?) ::s/any))

(s/def ::attr-map
  (fn [{:keys [transform] :as m}]
    (and (s/valid? ::simple-attr-map (dissoc m :transform))
         (or (not transform)
             (s/valid? ::transform transform)))))