This page is not created by, affiliated with, or supported by Slack Technologies, Inc.
2016-07-21
@alexmiller here's an interesting question about spec naming conventions in namespaces that exist primarily to provide specs: https://github.com/gfredericks/schpec/pull/1/files#r71633898
i.e., is there a difference between the real specs and the specs describing args to the functions to make specs?
I'm imagining maybe namespacing the function arg/ret specs one level deeper, using the name of the function as the last component of the namespace
which of course immediately brings up the old "aliasing a non-code namespace" problem
If `s/keys` allowed anonymous specs for `:req-un` and `:opt-un`, that would be the obvious choice, but absent that, using the fn name as the last component of the spec namespace is p reasonable
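That naming idea might look like the following sketch. The names (`my.lib`, `make-widget`, `size`) are hypothetical, nothing here is prescribed by spec itself, and current Clojure uses the `clojure.spec.alpha` namespace rather than 2016's `clojure.spec`:

```clojure
(require '[clojure.spec.alpha :as s])

;; Specs for a hypothetical fn my.lib/make-widget live one namespace
;; level deeper, named after the function itself.
(s/def :my.lib.make-widget/size pos-int?)
(s/def :my.lib.make-widget/opts
  (s/keys :req-un [:my.lib.make-widget/size]))

;; :req-un means the map itself carries the short, unqualified key:
(s/valid? :my.lib.make-widget/opts {:size 3})
```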
oh hi donaldball
howdy, stranger
donaldball: would you say the main point of these specs is combining range-checks with type-agnosticism?
I think of them as more describing domains in the math sense, but we’re probably saying the same thing
sooner or later we are going to have a record like map spec that doesn't require namespaced keys
I’ve written a few score specs now and am tentatively concluding that Rich’s encouragement towards namespaced keys is tremendously valuable for describing domain model maps, and not very useful, maybe even a bit abrasive, when describing function option maps or kwargs.
That’s an interesting idea, I haven’t tried it in that way yet
Yeah, it’s more in the spirit
If you had a bunch of fns that shared many of the same options, maybe a spec registry would be called for
kwargs do allow duplicate keys anyway, so your formulation may be more correct
(even though it’s probably a usage bug when it happens)
schpec.bhauman
bhauman: so the whole point of the code in your comment is to avoid the namespacing requirement of `keys*`?
I bet that spec conforms a lot more awkwardly than keys* does :/
but I'm just trying to use what's available to avoid registering specs in situations that don't merit it
we could make a macro that names them in the background with gensyms or something
by using keys* or by avoiding it?
less readable, conforms weird
I think that's it
well more readable in one sense I guess
keys* requires you to go somewhere else for the rest of the spec
you can change the conforming with a macro?
I'm still hazy on all the details of clojure.spec
man I gotta learn this stuff better
so you could take the sequence of pairs that your style of regex creates and conform that further into a map?
this has been another episode of Turing Completeness with Bhauman
I'm interested in how useful conformers are for dealing with transforming things in and out of json and jdbc
I'd like to be able to precising describe the idiosyncratic shape of some funky json at the same time as describing the pristine clojure-friendly equivalent and automatically get functions to translate between them
ditto for goofy jdbc types
that's all I wanted to hear
schpec.bijections
s/precising/precisely/
`clojure.spec` FTW: as we’re specifying more of our domain model and rewriting our validation logic on top of `s/conform`, we’re discovering some interesting edge cases that we hadn’t considered before, as well as uncovering inconsistencies between how we treat certain pieces of data in different parts of our application. It’s very enlightening… and it’s also making us feel a lot more confident in our updated validation code!
Not all of our validation is suitable for `clojure.spec` tho’: we have several places where the validation is contextual, based on current data from the environment, but all of the basic "shape" validation is much nicer now.
unrelated: I’m all for the openness of maps etc, but I’ve run in to some real bugs where typos or misplaced data gets ignored. Is there anything currently or planned for identifying “extra” data? I’ve found it incredibly useful to log unknown data in the past.
As an example from a Go program I worked on recently, we were parsing a Toml config file and our tester lost a few hours to a flag that was in the wrong section. So I added a call to this “Undecoded” method: https://github.com/BurntSushi/toml/blob/99064174e013895bbd9b025c31100bd1d9b590ca/decode_meta.go#L113 — logged all the undecoded keys and he never had this problem again
@bbloom: I think the only solution right now is `s/and` with a set operation on the keys of the map?
@seancorfield: gotcha, thanks. was hoping not to treat it as invalid only to discover that it exists
Yeah, if you're dealing with just `:req` keys you're OK but once you get into optional key territory it's tricky 😞
something like `s/conform` but, say, an `s/unconformed` that returns paths to all the data that was ignored
another case like this in our context: maps as database records, an invalid map-entry could cause a query execution exception, invalid column etc (a map missing a key because of the typo is valid too, which doesn't help)
@seancorfield: do you have an example of this keys+and code?
@mpenet: I think it'd be like (s/and (s/keys :req-un [::foo]) #(every? #{:foo} (keys %)))
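That suggestion runs as-is; here is a runnable sketch (written against the current `clojure.spec.alpha` namespace, where the 2016 alphas used `clojure.spec`):

```clojure
(require '[clojure.spec.alpha :as s])

(s/def ::foo int?)

;; s/keys is open by design; the extra predicate closes the map by
;; rejecting any key outside the known set.
(s/def ::closed-map
  (s/and (s/keys :req-un [::foo])
         #(every? #{:foo} (keys %))))

(s/valid? ::closed-map {:foo 1})          ;; => true
(s/valid? ::closed-map {:foo 1 :bar 2})   ;; => false, unknown key :bar
```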
Good 15 minute screencast by @stuarthalloway: http://blog.cognitect.com/blog/2016/7/13/screencast-spec-leverage
@seancorfield: wouldn't this fail https://github.com/clojure/java.jdbc/blob/f9fe3dde5b4bc5f1a5b5a79ffb5374fad3377b0b/src/test/clojure/clojure/java/test_jdbc.clj#L31 ?
" Instruments the vars named by sym-or-syms, a symbol or collection of symbols, or all instrumentable vars if sym-or-syms is not specified."
I guess you need to call ns-publics and then pass the coll to instrument, or just call instrument without arg
(s/explain-str (s/cat :statements (s/+ string?)
:type keyword?)
[["q"] :logged])
--> "In: [0] val: [\"a\"] fails at: [:statements] predicate: string?\n"
this is part of my attempt to debug https://github.com/mpenet/alia/blob/feature/specs/modules/alia-spec/src/qbits/alia/spec.clj#L296-L299 (the rest of the specs seem fine so far)
replacing `(s/+ string?)` by `(s/coll-of string? :min-count 1)` or `(s/spec (s/+ string?))` works, but I am not sure why "+" failed here
"When regex ops are combined, they describe a single sequence. If you need to spec a nested sequential collection, you must use an explicit call to spec to start a new nested regex context." okay
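The quoted rule explains the failure above; a minimal demonstration (current `clojure.spec.alpha` namespace):

```clojure
(require '[clojure.spec.alpha :as s])

;; Combined regex ops describe ONE flat sequence, so the strings are
;; expected spliced directly into the outer sequence:
(s/def ::flat (s/cat :statements (s/+ string?) :type keyword?))
(s/valid? ::flat ["q" :logged])       ;; => true
(s/valid? ::flat [["q"] :logged])     ;; => false, nested vector

;; Wrapping the regex in s/spec starts a nested regex context, so the
;; strings must sit in their own nested collection:
(s/def ::nested (s/cat :statements (s/spec (s/+ string?)) :type keyword?))
(s/valid? ::nested [["q"] :logged])   ;; => true
(s/valid? ::nested ["q" :logged])     ;; => false
```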
@bbloom: regarding discovering typos/etc in open maps, I've been pondering the idea of a pair of schemas -- the normal one, for which violations are a bug/error-condition/whatever, and a more restrictive one, for which violations are merely suspicious, and might warrant logging, etc.
E.g. "this string is probably a uuid", "this map probably has no unknown keys", ...
@bbloom: your conform/unconform idea suggests that you could just roundtrip the data and do a diff, but that would only work with map keys and not general mistakes I think
@gfredericks: something that could be nice for spec would be a schema->spec converting function/macro. When you have large'ish Schemas (e.g. multi-level maps) it gets tedious/verbose fast. I might actually give it a try if I get some free time this weekend
By "schema" you mean something like plumatic/schema's api where the schema is the same shape as the data?
Yeah I definitely imagined that would show up
Oh you mean their whole API?
Or as much as possible at least
A 1-time dev utility or a runtime thing?
The former can be less robust
In my case I am mostly concerned about the ton of map schemas we have in our projects; it would be nice to be able to port these without spending ages on it and laying a minefield of bugs in the process
Just punt on certain things and point the user to where it's broken
the example of nested maps is a good one I think; a very concise schema in plumatic Schema can end up as tons of lines of mostly s/def's for each k/v
It definitely would be
Hi ... I've just been looking at spec (alpha10) and am confused by some results I'm seeing ...
Not by itself
You have to call c.s.test/instrument
Once registered, function specs are included in doc, checked by instrument, tested by the runner clojure.spec.test/run-tests, and (if a macro) used to explain errors during macroexpansion.
"Checked by instrument" is a reference to the function I mentioned above
The wording is confusing though since that's not obvious
@gfredericks: I have a strict-keys macro that uses a dynamic variable to set its level of strictness
@bhauman: interesting
ok ... so I also need to include clojure.spec.test in my production code to have specs checked at runtime?
@bhauman: I'm in the middle of pushing more customization into plumatic/schema to support this kind of thing
@gfredericks: thanks
I think a better solution is to have both the dynamic variable and a per-spec level override to force a certain level for certain specs
@l0st3d: I think the intention is for you to do more explicit checks if you want production checking
@bhauman: cuz e.g. defproject is always open but certain submaps are more restrictive?
@gfredericks: fair enough ... just thought that the docs suggested it would do those checks for me if I included the fspecs ... just trying to work out the api atm i think 😉 .. thanks for your help
This could frankly be done all the time for something like (s/keys ) where a warning will be generated
but being able to set a dynamic var to adjust this level of checking seems appropriate
i’m struggling with a recursive map spec. what i’m trying to do is something like this:
(s/def ::node-content ::series)
(s/def ::node-inputs ::graph-seq)
(s/def ::graph-node (s/keys :req [::node-content ::node-inputs ::label ::stream]))
(s/def ::graph-seq (s/coll-of ::graph-node))
i don’t quite get how i can have a keys spec with a required key that refers back to the spec itself..
(s/def ::label keyword?)
(s/def ::children (s/coll-of ::node))
(s/def ::node (s/keys :req [::label] :opt [::children]))
(s/conform ::node {::label :a ::children [{::label :b} {::label :c}]})
=> #:user{:label :a, :children [#:user{:label :b} #:user{:label :c}]}
there’s a simple example - I can’t quite work out what you’re trying to do
first time i tried something like that i thought i got a complaint about the spec not existing.. i think it was because i didn’t have ::graph-seq and ::node-inputs in the right order.
@mpenet: My understanding is that, despite the docstring, you can use `instrument` to turn on instrumentation for an entire namespace using that call.
seems it might have changed here: https://github.com/clojure/clojure/commit/a4477453db5b195dd6d1041f1da31c75af21c939 at least that's what the docstring would suggest
Yeah, something definitely isn’t working right now — I changed an fdef spec and the tests still pass… investigating.
It definitely used to work.
@alexmiller: would it be a bug or is it an intentional change?
that’s intentional - use st/enumerate-namespace to produce a list of syms in an ns
Hmm, I missed that change in the release notes @alexmiller
Although now I can’t get my `fdef` specs to work at all
for `check` or `instrument`?
I have just tracked down a problem with check
When I call `fdef`, the spec does not subsequently show up in `doc` ...
how are you calling it? expects a fully-qualified symbol
actually, it resolves, so not necessarily fully-qualified
I had `(s/fdef drop-table-ddl …)` after referring in :all, and that used to work but doesn’t now. I changed it to `(s/fdef clojure.java.jdbc/drop-table-ddl …)` and it works now.
yeah, if you use a bare symbol for def or fdef, it will treat that as <current-ns>/sym
so it’s not going to pick up refer's
That changed at some point. This code used to work.
(not a big deal but the silent "failure" is disturbing)
fdef stuff changed around 6/7 - there were some major rewrites of it in there
you can use `st/instrumentable-syms` to verify that things are spec'ed maybe?
with these changes - if you want to run your clojure.test tests with specs instrumented and checked for fdef’s, what’s the proposed setup?
you’ll need to turn on instrumentation
so you can do that per-test, in a fixture, etc
if one would like to turn on instrumentation for “everything” in a fixture - does one have to manually enumerate the NS to then do instrumentable-syms and finally instrument? i.e. there’s nothing that just turns on instrumentation for all NS in project?
just `(st/instrument)` is supposed to do that
that’s like the old instrument-all
and macro fdefs are always instrumented in macroexpansion
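One way to set that up as a clojure.test fixture, sketched against the current `clojure.spec.test.alpha` namespace (in the 2016 alphas the namespace was `clojure.spec.test`):

```clojure
(require '[clojure.spec.test.alpha :as st]
         '[clojure.test :refer [use-fixtures]])

;; Instrument every instrumentable var for the whole test run, and make
;; sure instrumentation is removed again even if the tests throw.
(defn instrument-fixture [f]
  (st/instrument)
  (try
    (f)
    (finally
      (st/unstrument))))

(use-fixtures :once instrument-fixture)
```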
Here’s what I ended up with in java.jdbc:
(try
  (require 'clojure.java.jdbc.spec)
  (require 'clojure.spec.test)
  (let [syms ((resolve 'clojure.spec.test/enumerate-namespace) 'clojure.java.jdbc)]
    ((resolve 'clojure.spec.test/instrument) syms))
  (println "Instrumenting clojure.java.jdbc with clojure.spec")
  (catch Exception _))
And I had to qualify all the symbols in clojure.java.jdbc.spec `fdef` calls in order to get those working again.
Life on the bleeding edge … 🙂
Thanks @mpenet for the heads up on that — I hadn’t noticed it was broken!
in my app i use GPS data. Normally my data structure looks like: `{:lat 33.3 :lng -129.3}`. With spec things are looking like: `{:pc.api/latitude 33.3 :pc.api/longitude -129.3}`. I'm finding it less fun to type that everywhere. I guess I could alias my namespace to `[pc.api :as a]` and change latitude and longitude to lat/lng, then get `{:a/lat 33.3 :a/lng -129.3}`. Are others doing/finding something similar? I wonder if having keywords as short as lat/lng leads to being unclear? Thoughts?
You can use unqualified keywords in maps if you want @fenton
:) @seancorfield i looked a bit at clj.jdbc to spec parts of alia, that's how i spotted this
I’m cutting 0.6.2-alpha2 with those updates.
@seancorfield: do you mean un-qualified?
@seancorfield: I guess I was suggesting the ns-qualified keywords seem a bit of a pain to type everywhere... maybe I should look into using un-qualified keywords... not sure of the implications off the top of my head. I'm thinking, how do you use unqualified keywords in other files? The way I'm using specs is to put them into a third *.cljc file that is shared by my front and backends. So don't i have to use qualified keywords in that case?
Yes, sorry, typo.
Fixed 🙂
I keep my specs in an external library...does that make un-qualified an issue or can i still have them defined elsewhere?
Well, you can do #::a{:lat 33.3 :lng -129.3}
@seancorfield: okay that looks a bit better....
A map does not need qualified keys in order to be used with spec.
`(s/keys :req-un [::lat ::lng])` will conform `{:lat 33.3 :lng -129.3}`
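Put together, that looks like this (a sketch using the current `clojure.spec.alpha` namespace):

```clojure
(require '[clojure.spec.alpha :as s])

;; Registered spec names are qualified, but :req-un matches the
;; corresponding UNqualified keys in the data:
(s/def ::lat double?)
(s/def ::lng double?)
(s/def ::location (s/keys :req-un [::lat ::lng]))

(s/valid? ::location {:lat 33.3 :lng -129.3})    ;; => true
(s/valid? ::location {::lat 33.3 ::lng -129.3})  ;; => false, keys qualified
```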
@seancorfield: oh really? how does spec use it then?
https://clojure.org/guides/spec#_entity_maps gives examples.
Scroll down to where it shows `:req-un` being used.
@seancorfield: I'm unfamiliar with the `#::a{:lat 3.3 :lng 3.3}` syntax. i.e. the pulling of the namespace out in front of the map. is there some documentation about that somewhere?
and more officially http://clojure.org/reference/reader#_maps
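In short, the reader distributes the namespace over the map's unqualified keys (Clojure 1.9+ syntax):

```clojure
;; #:ns{...} prefixes every unqualified key with ns; #::{...} uses the
;; current namespace, and #::a{...} uses the namespace aliased as a.
(= #:pc.api{:lat 33.3 :lng -129.3}
   {:pc.api/lat 33.3 :pc.api/lng -129.3})
;; => true
```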
@jjcomer I can explain what you’re seeing now btw. `spec.test/check` returns a lazy sequence of test results per sym. In your test, you are not realizing those results, then immediately uninstrumenting them, then later realizing (actually running the check on the uninstrumented functions). If you wrap a `doall` around your two calls to `check`, that addresses why you’re not seeing the instrumentation.
while `check` does document this laziness, I admit it was a surprise to me.
and then the other thing was not including the sym you’re replacing in the instrument list
np, thx for the repro
If you define a spec with a custom generator (using `s/with-gen`), does `clojure.spec` filter the generated values using the spec itself, or does it trust that the generator will only ever produce conforming values?
(I ask because I have a spec with a very complex validation predicate and so I have to write a custom generator and I’m not certain whether the generator I’ve written is only going to produce conforming values by itself…)
It does not trust and always re-checks the custom gen
It's not filtering though - it will error if the gen produces a bad value
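A quick way to see that re-check in action (a sketch against the current `clojure.spec.alpha` namespace; in current spec the re-check is a bounded such-that wrapper, so a persistently bad generator surfaces as an exception rather than looping forever). It requires org.clojure/test.check on the classpath for generation:

```clojure
(require '[clojure.spec.alpha :as s]
         '[clojure.spec.gen.alpha :as gen])

;; The spec demands even numbers but the custom generator always
;; produces 3, so generation can never succeed:
(s/def ::even-num (s/with-gen even? #(gen/return 3)))

(try
  (doall (gen/sample (s/gen ::even-num)))
  (catch Exception e :generator-produced-invalid-value))
;; => :generator-produced-invalid-value
```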
@alexmiller: in http://clojure.org/reference/reader#_maps, an example of the `#::` behavior would be helpful. I think it’s accurate as it is, but not particularly clear.
Thanks @alexmiller that’s good to know. I couldn’t produce a bad value in 10,000,000 generations so I think I’ll trust it as working for now 🙂
@glv Will consider