#clojure-spec
2016-10-27
cddr02:10:29

I sense that there is some profound wisdom in the "informational vs implementation" section of the spec overview, but I fear it goes over my head. Is it a hint to users not to do something with spec that might seem tempting? Or is it a note to implementors to stay away from problems that cannot be solved in the general case? Could anyone provide an example of an implementation decision that should not go into a spec?

cddr02:10:39

Also it seems like most people talking about it here are using it to spec inputs/outputs of functions and macros. Is anyone using (or planning on using it) to define system boundaries? The note in the same doc about starting a dialog about semantic change seems particularly relevant to this type of usage.

seancorfield03:10:33

@cddr We're using it to spec system boundaries more than function inputs/outputs.

cddr03:10:00

How do you share the specs between the two systems?

seancorfield03:10:11

We're spec'ing our REST API at the user boundary, the domain model boundary, and the persistence boundary.
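A hedged sketch of what spec'ing the same entity at two of those boundaries might look like (all names here are illustrative, not from Sean's actual codebase):

```clojure
(require '[clojure.spec.alpha :as s]) ; clojure.spec in the 1.9 alphas of this era

;; Domain boundary: real, parsed values.
(s/def :domain/id pos-int?)
(s/def :domain/user (s/keys :req [:domain/id]))

;; REST/user boundary: everything arrives as strings.
(s/def :api/id (s/and string? #(re-matches #"\d+" %)))
(s/def :api/user (s/keys :req [:api/id]))

(s/valid? :api/user    {:api/id "17"})  ;=> true
(s/valid? :domain/user {:domain/id 17}) ;=> true
```

Keeping both layers in one shared spec library (as discussed below in the thread) lets each system depend on the same definitions.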

seancorfield03:10:33

Right now, those are still in a single process but that is changing.

seancorfield03:10:54

We keep the specs mostly separate from the code. We haven't completely decided how to package that up in libraries for sharing across systems, but we expect to have the spec libraries as dependencies to both systems (everything's Clojure).

cddr03:10:33

Those categories are interesting too. Is there any overlap between domain model and persistence for example?

seancorfield03:10:35

We're mostly focusing on data -- and generators -- rather than functions.

seancorfield03:10:48

@cddr We're still looking at whether we can (mostly) derive one from the other... we've had some success with that but there are... special cases... shall we say.

seancorfield03:10:06

Some parts of our API boundary match our domain model better than others 🙂

cddr03:10:26

Yeah, it seems like it would be easy to generate a spec from some sort of schema like Avro, but a bit harder to go the other way.

cddr03:10:02

Thanks for your thoughts. Are you going to be at the conj? I'd love to chat some more about this.

seancorfield03:10:18

We started with a legacy API and are evolving a new one, piecemeal, so we don't have the luxury of anything like Swagger at the moment.

seancorfield03:10:01

Yeah, I'll be there. I submitted a talk about this topic -- and it got accepted -- but partly for personal reasons I've had to withdraw it.

cddr03:10:21

Ah well, look forward to seeing you then. 🙂

seancorfield03:10:42

I love Austin 🙂

ikitommi12:10:08

@seancorfield @cddr we have been writing the spec->swagger/json-schema conversion, which should help on the docs side. Also the JSON/String conforming of the specs. Trying to make those transparent.
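A toy illustration of the direction such a conversion can take, assuming specs defined from bare predicate symbols (real spec-tools handles far more than this):

```clojure
(require '[clojure.spec.alpha :as s]) ; clojure.spec in the 1.9 alphas of this era

;; Map the s/form of a simple predicate spec to a JSON Schema fragment.
(defn pred->json-schema [form]
  (case form
    clojure.core/string?  {:type "string"}
    clojure.core/integer? {:type "integer"}
    clojure.core/boolean? {:type "boolean"}
    {:type "object"})) ; fallback for forms we don't recognize

(s/def ::id integer?)

(pred->json-schema (s/form ::id)) ;=> {:type "integer"}
```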

mpenet12:10:27

@ikitommi anything public ?

mpenet12:10:26

also @alexmiller has been hinting about changes in spec forms (https://clojurians.slack.com/archives/clojure-spec/p1477501089007213), I guess this might (or might not) change the situation for these kinds of things

mpenet12:10:28

personally I am holding my horses until this stabilizes (I am very much interested in having something solid for swagger, json-schema & all)

ikitommi13:10:51

@mpenet nothing really usable as public yet. There is stuff under metosin/spec-tools (a second take on the ->json-schema conversion & the original spike on dynamic conforming), but I've only planned to put effort into these over the next two weeks. I'm working on the spec->json-spec & spec->string-spec transformations, i.e. generating differently-conforming copies of the specs for different formats/use cases. If that works, I'll throw the dynamic conformations into the trash bin. Coercion rules in both come from Schema/Ring-swagger. Were you doing the JSON Schema -> spec side, or the same?

ikitommi13:10:30

and thanks for pointing out Alex’s message. Would really like to hear how things will evolve.

mpenet13:10:17

Something similar (or not), not sure. We have a thin facade that generates both specs (with conformers) and JSON Schemas; this shields us from changes in the underlying libs for now. Moderately happy about the current situation; hopefully the changes Alex mentioned will make this stuff easier.

mpenet13:10:56

going from spec to JSON Schema directly is something we tried, but it was brittle: tons of possible breakage unless you set some strict rules about what you can and cannot use in spec

ikitommi13:10:49

sounds cool. So, you are describing something in your own way and generating specs out of those? Or do you start from plain specs too?

mpenet13:10:14

yes, at the edge we have our own "spec" language

mpenet13:10:22

I think also onyx does something similar

ikitommi13:10:40

do you have something of that in public?

mpenet13:10:46

but hopefully this is temporary

mpenet13:10:11

no, not yet. We might release this, but no promises.

ikitommi13:10:54

the old (plumatic) Schema -> JSON Schema conversion was partly brittle too, and as JSON Schema (and especially the Swagger version of it) is less powerful, only parts of the Schemas could be represented in it.

ikitommi13:10:19

what did you find brittle in the direct spec->json-schema conversion?

mpenet13:10:21

indeed. same problem

mpenet13:10:55

the fact that a spec is a predicate makes it too open and hard to "understand"; unless you are 100% aware of the limitations, you can cause the generated JSON Schema to make some validators/converters go "wtf is that"
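An illustration of that openness (hypothetical spec, not from mpenet's code): any predicate is a legal spec, including ones with no JSON Schema counterpart.

```clojure
(require '[clojure.spec.alpha :as s]) ; clojure.spec in the 1.9 alphas of this era

;; Validates fine as a spec, but a spec->json-schema converter has
;; nothing sensible to emit for it.
(s/def ::odd-sized #(odd? (count %)))

(s/valid? ::odd-sized [1 2 3]) ;=> true
(s/valid? ::odd-sized [1 2])   ;=> false
```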

mpenet13:10:17

dunno what's best between limiting ourselves to a subset or the facade solution. The facade has the advantage of shielding us from potential dependency changes (Schema -> spec), and allows the devs not to be afraid of breaking stuff in a dependency they never inspected

mpenet13:10:47

seems like that could be a good solution for compojure-api & related libs imho

ikitommi13:10:00

yes, at least the current spec maps are quite hard to use; I was thinking of adding schema-like map support, something like {::name string?, ::id integer?}
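A minimal sketch of what validating against such a schema-like map could look like (illustrative helper, not the spec-tools API):

```clojure
;; Treat {key predicate, ...} as a description of required keys:
;; every key must be present and its value must satisfy the predicate.
(defn valid-map? [schema m]
  (and (map? m)
       (every? (fn [[k pred]]
                 (and (contains? m k)
                      (pred (get m k))))
               schema)))

(valid-map? {::name string?, ::id integer?}
            {::name "ada", ::id 1})  ;=> true
(valid-map? {::name string?, ::id integer?}
            {::name "ada"})          ;=> false
```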

ikitommi13:10:12

in web APIs, we currently have lots of anonymous maps to describe the different parameter sets for endpoints.

mpenet13:10:12

I think the validators should be more static (for compojure-api & co). Using string? or integer? is not so good in that context; better to use plain keywords (:string, :integer, etc.) that dispatch to a multimethod which knows what to make of them and acts as a sort of registry limited to that DSL. That would also allow users to create schemas and register/use them in that context, since they're known to be understandable by the coercion/validation chain later.
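A sketch of that keyword-registry idea (all names here are hypothetical): a closed set of type keywords dispatching to one multimethod that can carry both the predicate and a JSON Schema fragment.

```clojure
;; One multimethod acts as the registry for the DSL's type keywords.
(defmulti field-info identity)

(defmethod field-info :string  [_] {:pred string?,  :json-schema {:type "string"}})
(defmethod field-info :integer [_] {:pred integer?, :json-schema {:type "integer"}})

(defn valid-field? [type-kw x]
  ((:pred (field-info type-kw)) x))

(valid-field? :integer 42) ;=> true
(valid-field? :string 42)  ;=> false
```

Because the keyword set is closed, a swagger generator can enumerate it instead of trying to reverse-engineer arbitrary predicates.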

mpenet13:10:52

anyway, I am curious to see what will come next

zane15:10:40

Are conformers intended to be limited in scope? As in, should I use them exclusively for minor transformations, or would it still be idiomatic to use them for larger-scale manipulation?

zane15:10:46

Hope that question makes sense. Sorry for being vague.

alexmiller15:10:26

they should be used cautiously, particularly when using them with registered specs as you are making decisions for all consumers of your specs (for now and the future, which is long :)

alexmiller15:10:54

in general, I think generic data transformation is better done explicitly with functions in normal Clojure ways
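In that spirit, a minimal sketch of transforming with a plain function first and only then validating with spec (the ::age spec here is hypothetical):

```clojure
(require '[clojure.spec.alpha :as s]) ; clojure.spec in the 1.9 alphas of this era

(s/def ::age (s/and int? #(<= 0 % 150)))

;; Plain-Clojure transformation, kept separate from validation.
(defn parse-age [s]
  (try (Long/parseLong s)
       (catch NumberFormatException _ nil)))

(let [age (parse-age "42")]
  (when (s/valid? ::age age)
    age))
;=> 42
```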

seancorfield16:10:36

The problem we ran into is that we had a lot of specs for our API that needed to accept a "string that could be converted to something that conforms to another spec" — and so your choice is either to make the spec produce the converted value (using conformer) or run that conversion twice: once to check you have a compliant string, and then again as Clojure code once you have "validated" your string input.
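A minimal sketch of the conformer approach Sean describes, assuming a hypothetical ::age spec: the string-accepting spec both validates and produces the converted value in one pass.

```clojure
(require '[clojure.spec.alpha :as s]) ; clojure.spec in the 1.9 alphas of this era

(s/def ::age (s/and int? #(<= 0 % 150)))

(defn ->int [x]
  (if (string? x)
    (try (Long/parseLong x)
         (catch NumberFormatException _ ::s/invalid))
    ::s/invalid))

;; Conforms "42" straight through to 42; no second parse needed.
(s/def ::age-string (s/and (s/conformer ->int) ::age))

(s/conform ::age-string "42")  ;=> 42
(s/conform ::age-string "abc") ;=> :clojure.spec.alpha/invalid
```

The trade-off is exactly the one under discussion here: every consumer of a registered ::age-string now always receives the converted value.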

seancorfield16:10:18

Your other choice is to run the conversion as a "pre-spec" layer and deal with conversion failures outside spec… which is a lot of boilerplate.

seancorfield16:10:24

Given Alex’s dark warnings about using conformer "extensively" (which we were), we have sort of called a halt to our spec usage in order to see what falls out as the recommended approach for API specs (i.e., specs that must accept strings that essentially conform to other specs after "conversion").

seancorfield16:10:43

If that recommended approach is "run the conversion twice", well, fair enough, but it seems a bit of a waste of effort.

alexmiller16:10:23

I think perhaps you take my words more apocalyptically than they were intended :)

zane16:10:46

Thanks for the exposition, @seancorfield. That's precisely the situation I'm in.

seancorfield16:10:46

@alexmiller Well, you have repeatedly cautioned against (overuse of) conformer whenever this sort of pattern comes up — and I’m not seeing a recommendation for how to handle it otherwise.

seancorfield17:10:41

For us, having the API spec also handle the conversion (via conformer) is the easiest way to do things. But easy != simple and I understand that it complects validation with transformation...

alexmiller17:10:31

My point is and has always been: you are making a choice for all future consumers of your registered specs and you should think about the ramifications of this before you use conformers on everything