#clojure-spec
2016-06-14
mfikes03:06:26

It’s too bad (s/describe (s/spec #(< % 3))) has to return something that doesn’t quite look like an anonymous function literal

mfikes03:06:18

I’m not sure what to make of the fact that

(s/def ::a (s/and #(> % 2) #(< % 5)))
(s/def ::b #(and (> % 2) (< % 5)))
look the same under s/describe. Perhaps that is OK, because they behave the same way.

jimmy05:06:19

hi guys, how do we use a variable inside (s/def (s/keys)), like this?

(def a [::first-name])
(s/def ::user (s/keys :req a))
It's a value in a macro in a macro. And I'm no macro master ...
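[Editor's note: s/keys is a macro, so it reads its arguments at macroexpansion time rather than evaluating them. One workaround, a sketch rather than necessarily the idiomatic answer, is to build the s/def form as data and eval it:]

```clojure
(require '[clojure.spec :as s]) ; clojure.spec.alpha in later releases

(s/def ::first-name string?)
(def a [::first-name])

;; Build the (s/keys :req [...]) form as data, then eval it so the
;; macro sees a literal vector of keys instead of the symbol `a`.
(eval `(s/def ::user (s/keys :req ~a)))

(s/valid? ::user {::first-name "Ada"}) ;; => true
```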

jimmy05:06:42

I have an error while trying to generate from clojure.spec; I think this runs into the such-that limitation in the clojure.spec implementation.

(defn str-limit [lower upper] (s/and string? #(<= lower (count %) upper)))
(s/def :project/description (us/str-limit 116 1000)) ;; the gen fails if the lower bound is much above 100
(s/def ::project-gen (s/keys :req [:project/description]))
(gen/generate (s/gen ::project-gen))

;; -- error
ExceptionInfo Couldn't satisfy such-that predicate after 100 tries.  clojure.core/ex-info (core.clj:4703)

stathissideris07:06:59

@nxqd: This is because string? generates totally random strings and then checks the second predicate to see if they conform, and gives up after 100 tries. Consider wrapping the spec with s/with-gen and providing your own generator.
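[Editor's note: a sketch of that suggestion, generating strings whose length is already in range (via gen/fmap over a char vector) instead of filtering random strings, using the alpha-era clojure.spec namespaces:]

```clojure
(require '[clojure.spec :as s]        ; clojure.spec.alpha in later releases
         '[clojure.spec.gen :as gen]) ; clojure.spec.gen.alpha later

(defn str-limit [lower upper]
  (s/with-gen
    (s/and string? #(<= lower (count %) upper))
    ;; Generator fn: build strings already in [lower, upper], so
    ;; such-that never has to reject candidates.
    #(gen/fmap (fn [chars] (apply str chars))
               (gen/vector (gen/char-alphanumeric) lower upper))))

(s/def :project/description (str-limit 116 1000))
(gen/generate (s/gen :project/description)) ;; a 116-1000 char string
```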

jimmy09:06:47

@stathissideris: thanks !

skapoor11:06:13

hi guys, am playing with clojure spec and running into unexpected behavior: (s/valid? #{nil "a" "b"} nil) ;; returns false when it should be true

minimal11:06:55

nil is not truthy so the set as a predicate is returning a non-truthy value

minimal11:06:57

(s/valid? #(contains? #{nil "a"} %) nil)
true

skapoor11:06:47

@minimal yeah, I tried that and it works.. but it should return true when a set is used in specs too right?

minimal11:06:34

If you are relying on the value returned from the set then it needs to be truthy to be valid

skapoor11:06:04

okay, (map #{"a"} ["a" nil]) and (map #{"a" nil} ["a" nil]) both return ("a" nil)

minimal11:06:46

The not-found value of a set is also nil, so it is troublesome if you don’t explicitly check using contains?.

skapoor11:06:56

i'm using sets in specs to define an enum of values with the possibility of nil... so it looked like a convenient way. I guess I'll have to either use s/or nil? ... or #(contains?)

minimal11:06:07

or you can use a default value that isn’t nil. (#{} 1) => nil (get #{} 1 :not-found) => :not-found

minimal11:06:02

Yeah it’s tricky

skapoor11:06:50

ok thanks. i'm wondering if I should file an issue for this on jira..

minimal11:06:14

I don’t think it’s an issue

skapoor11:06:33

only because the contains? behavior is right, but when the set is called as a function it's not.

minimal11:06:14

Using set as a predicate is a convenience but you are relying on an implicit conversion to boolean based on the value in the set

gfredericks11:06:58

either choice is surprising

gfredericks11:06:22

depending on whether you expect the set to be treated specially or to be treated like a predicate

skapoor11:06:31

the implicit conversion to boolean is being done by the internal clojure implementation..

skapoor11:06:49

@minimal: but I get it, it won't be considered a defect. so I'll just use a different way. thanks!

alexmiller15:06:14

@mfikes from your question way back on describe - s/form is useful for distinguishing these

alexmiller15:06:23

@skapoor: you could also wrap s/nilable around the set (s/nilable #{"a"})
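[Editor's note: a quick sketch of that suggestion; s/nilable handles nil before the set is ever called as a predicate:]

```clojure
(require '[clojure.spec :as s]) ; clojure.spec.alpha in later releases

(s/def ::grade (s/nilable #{"a" "b"}))

(s/valid? ::grade "a")  ;; => true
(s/valid? ::grade nil)  ;; => true, nil matched before the set is invoked
(s/valid? ::grade "c")  ;; => false
```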

arohner17:06:52

@alexmiller: what’s the rationale behind instrument only checking :args?

alexmiller17:06:19

shift in perspective - instrument is about checking invocations are correct, not verifying the function itself is correct. (this is similar to how only the :args are checked in macro fdefs). The check and test functions are for verifying functionality of the code.
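[Editor's note: a sketch of the distinction, using the ranged-rand example from the spec guide. Note the test namespace and instrumentation function names shifted across the alphas, so the exact calls may differ in your version:]

```clojure
(require '[clojure.spec :as s]
         '[clojure.spec.test :as stest])

(defn ranged-rand
  "Returns a random long in [start, end)."
  [start end]
  (+ start (long (rand (- end start)))))

(s/fdef ranged-rand
  :args (s/and (s/cat :start int? :end int?)
               #(< (:start %) (:end %)))
  :ret int?
  :fn #(<= (:start (:args %)) (:ret %) (dec (:end (:args %)))))

(stest/instrument `ranged-rand)

(ranged-rand 5 3) ;; throws: the caller violated the :args spec
;; :ret and :fn are NOT checked by instrument; use check for those.
```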

bfabry18:06:49

is there still a way to set up your app so every spec is checked on every invocation?

arohner18:06:35

@alexmiller: this seems to severely hamper spec’ing non-pure functions (i.e. things that are difficult/impossible to write generators for)

arohner18:06:33

and IMO, validation and generative testing are still too tightly coupled

alexmiller18:06:43

@bfabry: no, but that was not really the intention of instrument, which is about verifying that callers of a function are calling it correctly. If you want to test that the behavior of your functions is in accordance with your specs, you should use the spec.test functions to test your functions.

bfabry18:06:17

@alexmiller: sure, and no doubt were we to use spec we would use the spec.test functions. but like @arohner mentioned not all functions are pure, and if I'm going to write all those specs then I might as well get some extra value from them for free by having them always be checked in lower environments where I do not care about performance

alexmiller18:06:34

@arohner: there are still more things coming that will help with verifying aspects of non-pure functions

arohner18:06:41

@alexmiller: one nice feature of schema is that you can choose to use validation at e.g. user-input boundaries, in production. Instrument seems more designed for dev-time atm

alexmiller18:06:59

@arohner spec does not remove the need to write tests for your functions. those tests can use invoke specs to validate as appropriate

alexmiller18:06:14

@arohner: instrument is only designed for dev time

alexmiller18:06:25

you should not instrument in production

alexmiller18:06:14

you can choose to explicitly conform with spec at boundaries if you like
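[Editor's note: a sketch of explicit validation at a boundary; the handler shape here is hypothetical, only s/conform, ::s/invalid, and s/explain-data are the spec API:]

```clojure
(require '[clojure.spec :as s]) ; clojure.spec.alpha in later releases

(s/def ::age (s/and int? #(<= 0 % 150)))

(defn handle-request [params]
  (let [age (s/conform ::age (:age params))]
    (if (= age ::s/invalid)
      ;; Reject bad input at the boundary with the spec's diagnostics.
      {:status 400 :body (s/explain-data ::age (:age params))}
      {:status 200 :body {:age age}})))
```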

alexmiller18:06:10

there will be a conform-ex that validates or throws if not, not quite in yet

bfabry19:06:04

if I explicitly conform, then it will happen in production, when I only want it in staging. it also adds a whole bunch of boilerplate to every single function. I don't really understand the reasoning here, being able to reuse the :ret and :fn specs for extra checking when performance isn't a consideration just seems like an obvious win. and I mean, I definitely can still do that, writing my own macro that wraps all functions or whatever, but it sounds like I won't be the only one

alexmiller19:06:26

there is an assertion facility still coming as well

alexmiller19:06:27

the point is that checking ret/fn every time should be redundant to what you have (presumably) already confirmed in testing - that your function works.

bfabry19:06:30

I'm maybe a bit skeptical that adding generative testing (while definitely awesome) is going to straight away mean I stop writing functions that produce unexpected outputs when they encounter production data. and I'm a big fan of fail fast with a good error message when that does happen

alexmiller19:06:44

you (will be) able to assert that return (if instrumented at dev time) or choose to explicitly validate it at production time if you want

alexmiller19:06:16

it's unclear to me if you're talking about dev or prod

bfabry19:06:32

actually talking about master/staging

alexmiller19:06:10

fair enough - so you can turn on instrumentation in staging

alexmiller19:06:36

that will check args on functions

bfabry19:06:48

on my laptop/travis I run unit and generative tests, in master/staging the application runs "production-like" but with assertions turned on for :args :ret :fn, production the app runs with no assertions <-- this is my ideal scenario

alexmiller19:06:05

there will be an assertion facility that you can use to (explictly) check ret/fn for instrumented functions

danstone19:06:55

As the channel is a little bit louder this evening, I thought I'd pose a question I asked yesterday again: What is the intended usage of :fn in fdef - In the docs it says something like 'relationships between args and the return'. Is the idea here to restrict it to type-y properties (e.g arity 1 of map returning a transducer rather than a seq)? The reason I ask is it's possible to define many more general properties of functions as part of the spec. As spec gets richer I imagine it may be possible to auto-generate the code for many properties (idempotency is easy if you have spec'd the args)
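[Editor's note: :fn is not restricted to type-level properties; it receives a map of the conformed :args and the :ret, so any predicate over that map works. A sketch of a more general property (idempotency), with a hypothetical example function:]

```clojure
(require '[clojure.spec :as s]
         '[clojure.string :as str])

(defn collapse-ws
  "Collapses runs of whitespace into single spaces."
  [s]
  (str/replace s #"\s+" " "))

(s/fdef collapse-ws
  :args (s/cat :s string?)
  :ret string?
  ;; A general property, not just a type: applying the function to
  ;; its own return value changes nothing (idempotency).
  :fn #(= (:ret %) (collapse-ws (:ret %))))
```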

alexmiller19:06:43

I think what I'm talking about is close to that, but varies in that ret/fn are not automatically checked but require an explicit assert (which is not active in production)

bfabry19:06:05

right. and so I'll probably end up writing a macro that wraps every function to add that explicit assert, and I've got a feeling that a whole lot of people will do that. we use plumatic/schema atm which checks arg/return values when validation is turned on, and I'd say the same number of bugs are caught by the return value checking as the arg value checking. aaaanyway, writing the macro is nbd, and maybe it'll turn out I don't actually need it or I'm the only person who does 🙂

alexmiller19:06:04

one question I have is whether you're getting the same level of generative testing from schema that you can get from spec (that is, whether more of those bugs could/should have been caught earlier)

alexmiller19:06:52

also keep in mind that the return value often is an input to another function, which can check its args

bfabry19:06:40

no, we're definitely not. but like I said I'm just a bit skeptical that generative testing will suddenly mean these issues disappear, and it costs me like an extra $5 per month to run extra validation in the staging environment so why not? the return value will likely be the input value to another function, but maybe that function doesn't have a spec yet, because we didn't feel it was worth the time to write yet, or maybe it's too broadly defined etc. If I believed I could perfectly specify every function up front so that bugs were impossible I'd be writing haskell 😆

alexmiller19:06:16

maybe you should just write your code without the bugs?

wilkerlucio20:06:26

hey people, has anyone here found/created a generator for cljs to create strings from regexps? I was hoping to use the string-from-regexp from test.chuck but I just realised its implementation is for CLJ only

alexmiller20:06:20

I would ask @gfredericks

wilkerlucio21:06:02

thanks Alex, I opened an issue on test.chuck, he will see it I think 🙂

wilkerlucio21:06:56

one more thing: given I have my fdef spec all set, I remember seeing a simple command to test that function but I can't find it. What's the simplest way to run generative tests on a function?

alexmiller21:06:06

clojure.spec.test/check-var

wilkerlucio21:06:14

@alexmiller: thanks, would you consider adding that to the clojure spec guide?

alexmiller21:06:35

yeah, that's on the todo list - I was actually working on some alpha6 related updates right now

ikitommi21:06:31

If I have understood correctly, the registry doesn’t have any tools for handling duplicate definitions?

ikitommi21:06:17

so, if there are multiple definitions for person/id, the last one stands?

ikitommi22:06:39

Is this good? Should the specs be immutable by default? If there are name clashes, then depending on the import order of namespaces, the specs might mean different things.

ikitommi22:06:15

or some hooks for the registry to resolve those clashes.

wilkerlucio22:06:58

@ikitommi: this is the reason they are encouraging namespaced keywords I think, do you have a situation where the same namespace is loaded on multiple files?

bfabry22:06:03

functionally the same as multiple (def …)s, isn't it?
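[Editor's note: a sketch of the behavior under discussion; re-registering the same qualified keyword silently replaces the earlier spec, just as re-defing a var would:]

```clojure
(require '[clojure.spec :as s]) ; clojure.spec.alpha in later releases

(s/def :person/id int?)
(s/valid? :person/id 42)     ;; => true

;; A later definition under the same qualified keyword wins:
(s/def :person/id string?)
(s/valid? :person/id 42)     ;; => false now
(s/valid? :person/id "abc")  ;; => true
```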

bsima22:06:59

does anyone have examples of using fdef and check-var? I'm having trouble getting it to work in my test suite
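[Editor's note: a minimal fdef + check-var sketch against the alpha-era API referenced in this thread; check-var was later renamed to check and its calling convention changed:]

```clojure
(require '[clojure.spec :as s]
         '[clojure.spec.test :as stest])

(defn add2 [a b] (+ a b))

(s/fdef add2
  :args (s/cat :a int? :b int?)
  :ret int?)

;; Run generative tests of add2 against its fdef.
(stest/check-var #'add2)
```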

ikitommi22:06:28

@wilkerlucio: true that - the specs must have a namespace, but it doesn’t have to be a clojure namespace. One might have multiple :order/ids in a large system. Should not, but could.

wilkerlucio22:06:31

@bsima: check this snippet, may help you:

wilkerlucio22:06:47

@ikitommi: that's a bit of a thinking shift, moving from those to fully qualified keywords; there are going to be some nice helpers for dealing with longer namespaces in 1.9

wilkerlucio22:06:50

yes, the problem exists like you said, but it's the same for def, as mentioned by @bfabry. I guess it's just about people starting to move towards fully qualified namespaces to avoid name clashes; I believe once that's the norm it will be very positive for everyone