#clojure-spec
2019-10-09
cjsauer00:10:42

Thanks @favila that’s very helpful

> combining spec with datomic, I’ve had more luck specing what you would want to see out of d/pull, because that’s what’s flowing through the functions in an application

I’m reaching this conclusion as well. It actually works very well for spec’ing at that “lower level” you mentioned, because :db/ensure can easily make use of pull results.
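A minimal sketch of what spec’ing a pull result might look like (the :user/* attribute names and ::user-pull spec are invented for illustration; the point is that the spec describes the shape of a d/pull result, not the database itself):

```clojure
(require '[clojure.spec.alpha :as s])

;; Hypothetical attribute specs mirroring a Datomic schema.
(s/def :user/email string?)
(s/def :user/friends (s/coll-of (s/keys :req [:user/email])))

;; Spec the shape of the pull result flowing through the app.
(s/def ::user-pull
  (s/keys :req [:user/email]
          :opt [:user/friends]))

(s/valid? ::user-pull
          {:user/email   "a@example.com"
           :user/friends [{:user/email "b@example.com"}]})
;; => true
```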

cjsauer18:10:10

Thinking some more on this. I gave up on trying to use spec “near” the database. The schema is sufficient for specifying data at rest, coupled with some basic structural constraints (e.g. required keys, or in my case, ensuring no cycles are formed). Structural meaning only the relationships are constrained, not their actual values. I think this is what you mean by opaque @favila. With that in mind, spec seems to really flex its muscle upon data in motion; the transitions between valid data-at-rest so to speak… Thinking out loud 🙂

favila18:10:54

Perhaps. I was thinking of it more in terms of speccing the grammar vs analysing the result

favila18:10:52

for a tx, [{:my-map myval}] the spec is really not anything about :my-map, but a vector of things, some of which may be maps, the keys of which are arguments to some transformation
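One way to sketch that grammar in spec (the branch names and the restriction to :db/add / :db/retract ops are assumptions for the example; note the spec never mentions :my-map itself):

```clojure
(require '[clojure.spec.alpha :as s])

;; tx-data as a grammar: a vector whose elements are either map
;; forms or list forms like [:db/add e a v]. Individual keys such
;; as :my-map are opaque; only the overall shape is specced.
(s/def ::tx-data
  (s/coll-of
    (s/or :map-form  (s/map-of keyword? any? :min-count 1)
          :list-form (s/cat :op   #{:db/add :db/retract}
                            :args (s/* any?)))
    :kind vector?))

(s/conform ::tx-data [{:my-map "myval"}
                      [:db/add 17 :my-attr "v"]])
;; => [[:map-form {:my-map "myval"}]
;;     [:list-form {:op :db/add, :args [17 :my-attr "v"]}]]
```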

favila18:10:47

so the key is “opaque” in that sense, it’s not specced as itself; map values are independent (grammatically) from the keys

favila18:10:32

think about what this would conform to: it would produce some kind of normalized ast with {:attribute-key :my-attr :attribute-scalar-value myval} for example
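A toy sketch of that idea using s/conformer (the node shape {:attribute-key … :attribute-scalar-value …} comes from the message above; the spec names and the conformer pipeline are invented for illustration):

```clojure
(require '[clojure.spec.alpha :as s])

;; Conform one map entry into a normalized AST node.
(s/def ::attr-entry
  (s/and (s/tuple keyword? any?)
         (s/conformer (fn [[k v]]
                        {:attribute-key          k
                         :attribute-scalar-value v}))))

;; Conform a whole tx map into a vector of such nodes.
(s/def ::tx-map-ast
  (s/and map?
         (s/conformer seq)
         (s/coll-of ::attr-entry :into [])))

(s/conform ::tx-map-ast {:my-attr "myval"})
;; => [{:attribute-key :my-attr, :attribute-scalar-value "myval"}]
```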

favila18:10:34

it’s hard to spec “both” at once, i.e. both the generic grammar and also some specific constraints on terms in it (constraints which would be enforced by an analyzer stage in a typical compiler)

cjsauer18:10:24

I see. So to apply your compiler analogy to an “application”, you’re referring to the fundamental difference between, say, the description of some mutation (as data), and the new database that results (assuming db-as-a-value semantics). The keys in both could also be said to be grammatically separate (?)

favila18:10:34

your key inside the transaction mini-language means something different than your key in a map with your data in it

favila18:10:22

in the first, it’s a parameter to some invisible ast’s map key; in the latter, it’s the data itself

favila18:10:42

maybe map-as-syntax vs map-as-data

favila18:10:04

you can’t s/keys the keys inside a map-as-syntax, only s/map-of them

💡 4
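The contrast in that last message can be shown side by side (the :user/email and ::user-map names are hypothetical):

```clojure
(require '[clojure.spec.alpha :as s])

;; map-as-data: each namespaced key has one registered meaning,
;; so s/keys can check the value against it.
(s/def :user/email string?)
(s/def ::user-map (s/keys :req [:user/email]))

;; map-as-syntax: keys are opaque terms in a grammar, so only
;; their shape can be constrained, not their registered meaning.
(s/def ::tx-map (s/map-of keyword? any? :min-count 1))

(s/valid? ::user-map {:user/email 42}) ;; => false, value is checked
(s/valid? ::tx-map   {:user/email 42}) ;; => true, key is opaque
```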
cjsauer18:10:35

That’s where my mapping of these concepts to spec was failing. s/keys implies that a key means the same thing everywhere it’s used; it assumes you’ll be checking against map-as-data, which makes it inappropriate for map-as-syntax. s/map-of was the 💡 for me: the data there is purely syntax.

cjsauer18:10:30

For some reason I seem to have this desire to write “the base specs” of a system, but specs really don’t make sense without a context…

favila18:10:11

yeah, for me that hurts most when the context is the type of record it’s in

favila18:10:11

I keep running into cases where a key has a wider domain, but can be narrowed/refined contextually by the type of its record.

favila18:10:34

this is a common feature of type systems inspired by class-based inheritance or XSDs

cjsauer19:10:20

Interesting, I haven’t run into that yet. How have you been handling it? Seems you would need some ability to compose spec registries into hierarchies.

cjsauer19:10:34

Or perhaps express that inheritance in the key itself, e.g. :slack.user/email
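A minimal sketch of carrying the record context in the key name, as suggested above (both specs and the domain-suffix constraint are invented for the example):

```clojure
(require '[clojure.spec.alpha :as s]
         '[clojure.string :as str])

;; The generic key has a wide domain...
(s/def :user/email
  (s/and string? #(str/includes? % "@")))

;; ...and the context-qualified key narrows it by reusing the
;; wider spec and adding a (made-up) domain constraint.
(s/def :slack.user/email
  (s/and :user/email
         #(str/ends-with? % "@slack.example")))

(s/valid? :user/email "a@slack.example")     ;; => true
(s/valid? :slack.user/email "a@example.com") ;; => false
```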

favila19:10:56

I think with spec2 schema I may not need it, I may be able to refine the spec inline

favila19:10:08

but I’m not sure, I haven’t tried spec2 at all