#clojure-spec
2017-07-11
danielcompton01:07:02

I'm trying to use spec.test/check to check all my functions, but I'm getting a couldn't satisfy such-that predicate exception. Is there any way to figure out which spec wasn't able to be satisfied?

seancorfield02:07:29

I don't know of any way to debug that -- other than to s/exercise your specs as you write them...

seancorfield02:07:02

You could call s/registry and go through all the keys in that (which are specs) I guess...

seancorfield02:07:27

...I've just gotten kinda paranoid about exercising each spec in the REPL as I write it.

bfabry02:07:18

I think there's a request open to include the spec in that message. iterating the registry sounds like a good option in the meantime
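A minimal sketch of the registry-walking idea (the helper name is made up): walk every key in the spec registry and try to exercise it, collecting the keys whose generators throw. This will also flag specs that have no generator at all, which is usually what you want when hunting down the failing `such-that`.

```clojure
(require '[clojure.spec.alpha :as s])

(defn unsatisfiable-specs
  "Walks every key in the spec registry and tries to exercise it,
  returning the keys whose generators throw (e.g. the couldn't-satisfy
  such-that exception)."
  []
  (for [k (keys (s/registry))
        :when (try
                (dorun (s/exercise k 10))
                false
                (catch Exception _ true))]
    k))
```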

danielcompton07:07:30

@bfabry do you know which issue it is? I took a look in JIRA but couldn't see anything

hkjels08:07:03

How do I express “many strings and/or many vectors”? And I don’t care which one it is, so I don’t need any branching

rickmoynihan09:07:16

@danielcompton: I’ve had that same issue… Supposedly the latest version of test.check exposes more information on that; in fact I updated my test.check to the latest version, which provides a bit more info, but it still doesn’t tell you the spec that failed (as test.check knows nothing of spec).

rickmoynihan09:07:30

finding them is a bit of a PITA, lucky for you bisecting specs with exercise is logarithmic 🙂

rickmoynihan09:07:56

but better to exercise them as you write them bottom up like seancorfield suggests

gfredericks11:07:13

@danielcompton @rickmoynihan @bfabry what's needed is to find the spot[s] where clojure.spec calls gen/such-that and get it to pass that debugging info along; I have no idea how difficult that would be, might be super easy

gfredericks11:07:48

the test.check change is that you can now pass an option (`:ex-fn`) that lets you customize the exception
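For reference, a sketch of that option as I understand it (test.check 0.10+): `such-that` accepts an options map, and `:ex-fn` receives a map describing the failure and returns the exception to throw. The generator below is illustrative.

```clojure
(require '[clojure.test.check.generators :as gen])

;; A generator of even naturals; if 50 retries at one size all fail the
;; predicate, :ex-fn builds the exception instead of the generic
;; "Couldn't satisfy such-that predicate" one.
(def evens
  (gen/such-that even? gen/nat
                 {:max-tries 50
                  :ex-fn (fn [{:keys [max-tries] :as arg}]
                           (ex-info (str "no even nat in " max-tries " tries")
                                    arg))}))
```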

rickmoynihan15:07:08

Some of my spec generators seem to grow to pretty big values under the default 1000 generations used by clojure.spec.test.alpha/check. I don’t know how big they’re getting in numbers of items, but they’re OOMing. Is there an easy way to make specs flatten out at a certain size, but keep iterating over random generations?

rickmoynihan15:07:01

obvs I could add an extra constraint e.g. #(> 1000 (count %)) to maps etc… but it feels like it’d be better done in the map/sequence generators
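The ad-hoc bound would look like the first form below; note that spec's collection specs also take a `:gen-max` option that caps generated size without constraining validation, which may be closer to "done in the generators". Spec names here are illustrative.

```clojure
(require '[clojure.spec.alpha :as s])

;; Extra-constraint version: validation also rejects maps of 1000+ entries.
(s/def ::bounded-map (s/and (s/map-of keyword? int?)
                            #(> 1000 (count %))))

;; Generator-only version: any size validates, but generation caps at 10.
(s/def ::small-gen-map (s/map-of keyword? int? :gen-max 10))
```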

gfredericks15:07:01

@rickmoynihan under default configurations your tests should max out their size at 200 trials and cycle back to small at that point; there should be a way to restrict the size further than that

rickmoynihan15:07:54

hmmm… not sure why I’m seeing so much heap then

gfredericks15:07:20

the underlying problem here relates to collection sizing, which we have an open test.check issue for

gfredericks15:07:34

but the workaround is to explicitly limit sizing via the option I just pointed to

rickmoynihan15:07:25

yeah you can pass those through via (check ,,, {:clojure.spec.test.check {:max-size 10}})
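Spelled out with a toy spec'd function (the function and its spec are placeholders, but the option path is the one spec.test reads):

```clojure
(require '[clojure.spec.alpha :as s]
         '[clojure.spec.test.alpha :as stest])

(defn widen [xs] (vec xs))
(s/fdef widen
  :args (s/cat :xs (s/coll-of int?))
  :ret vector?)

;; Cap generated sizes at 10 while keeping the default 1000 trials.
(stest/check `widen
             {:clojure.spec.test.check {:max-size 10 :num-tests 1000}})
```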

gfredericks15:07:29

All I was saying with 200 vs 1000 is that you should probably see the same issues with only 200 trials

gfredericks15:07:33

I'd try 50 for starters

rickmoynihan15:07:01

100 for num-tests seems to work but more than that GC seems to dominate the run time

rickmoynihan15:07:17

will try reducing max-size

gfredericks15:07:12

num-tests can be as high as you want as long as max-size is low enough

gfredericks15:07:37

max-size defaults to 200, and the sizes used throughout the run are (cycle (range max-size))

gfredericks15:07:11

so num-tests can "fix" sizing issues only when num-tests < max-size
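Concretely, the size sequence described above:

```clojure
;; With max-size 5, trial sizes cycle 0..4 forever, so extra trials
;; never push sizes higher.
(take 12 (cycle (range 5)))
;; => (0 1 2 3 4 0 1 2 3 4 0 1)
```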

rickmoynihan15:07:02

cool. That max-size trick at 50 seems to work nicely.

rickmoynihan15:07:00

@gfredericks: so does :max-size 50 mean maps have a maximum of 50 pairs, strings have a maximum of 50 chars, vectors have a maximum of 50 items etc?

rickmoynihan15:07:12

or is it a weight/multiplier?

gfredericks15:07:49

that tends to be true, but there's no true definition of size; it's an abstract thing that could theoretically be interpreted differently by any given generator

rickmoynihan15:07:20

ok that’s what I thought

gfredericks15:07:34

e.g., you can easily defy that by generating vectors with a specified length of 75
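For example (the element generator is arbitrary):

```clojure
(require '[clojure.test.check.generators :as gen])

;; A fixed-length vector generator produces 75 elements even when the
;; size parameter passed to generate is tiny.
(count (gen/generate (gen/vector gen/boolean 75) 5))
;; => 75
```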

rickmoynihan15:07:59

but is it true of the default generators?

gfredericks15:07:43

presumably the fix for all this will be that lower-down nested structures will be substantially smaller

gfredericks15:07:55

but that wouldn't contradict what you just said

rickmoynihan15:07:20

lowering it to 50 seems to work nicely… it also means I can run more generations in the same time which seems like the right trade off for my data.

lwhorton16:07:12

does anyone know a way to express “a map of keys of spec A or B, where A key points to spec A0 and B key points to spec B0”?

(s/def :foo (s/map-of #{:A :B} ...maybe (s/or :a :A0 :B0)
something along those lines?

seancorfield16:07:53

@lwhorton Sounds like a multi-spec to me.
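An untested sketch of what that multi-spec might look like for the question as asked; `::A0`/`::B0` are stand-ins for the real value specs.

```clojure
(require '[clojure.spec.alpha :as s])

(s/def ::A0 string?)   ; placeholder value specs
(s/def ::B0 vector?)
(s/def ::A ::A0)       ; so :req-un [::A] checks the :A key against ::A0
(s/def ::B ::B0)

;; Dispatch on which key the map contains.
(defmulti ab-spec #(cond (contains? % :A) :A
                         (contains? % :B) :B))
(defmethod ab-spec :A [_] (s/keys :req-un [::A]))
(defmethod ab-spec :B [_] (s/keys :req-un [::B]))

(s/def ::a-or-b (s/multi-spec ab-spec (fn [genv _tag] genv)))
```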

lwhorton16:07:48

heh, back to that ol’ zinger again

seancorfield16:07:17

TBH, I'm not sure what you're actually trying to do, based on what you said. Can you give a concrete example?

lwhorton19:07:29

multi-spec turned out to be the right thing again. thanks!

bbrinck18:07:15

@lwhorton You might also want to use tuples inside a coll-of. Something like (untested):

(s/coll-of (s/or :a (s/tuple ::A ::A0) :b (s/tuple ::B ::B0)) :into {} :kind map?)