Jack Park 17:11:23

Curious: Datalog schema catalogs - is there such a thing? Is there an appetite to create a joint repo to collect and edit them?

Jack Park 01:12:02

This schema work relates to a long-term OSS project in which agents of all kinds, each producing the equivalent of an update event stream, subscribe to Kafka channels and inject their events; all manner of agents can then treat those streams (Kafka topics) as if they were git repos - they can fork and branch and do PRs on them. A large area of that work aims at federation processes. My own work is on a project, OpenSherlock, which is a kind of knowledge graph - actually a hybrid TopicMap/ConceptualGraph - coupled to an NLP platform for machine reading, question answering, and so forth. For that to work well in a large universe of event producers, it makes sense to advertise a schema catalog (regardless of the language - RDF, Frames, Datalog, ...) and see how others choose to participate.
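To make the "events advertising their schema" idea concrete, here is a minimal sketch of the kind of event envelope an agent might publish to a topic. Everything here is illustrative - the `Event` class, the `schema_ref` field, and the `catalog://` URI scheme are invented for the sketch, not part of any real API:

```python
# Hypothetical event envelope: an agent publishes an update to a Kafka topic
# and advertises which catalog schema its payload conforms to, so downstream
# consumers can interpret (or fork/branch) the stream. All names are invented.
from dataclasses import dataclass, field
import json
import time

@dataclass
class Event:
    topic: str        # the Kafka topic acting as the shared "repo"
    schema_ref: str   # pointer into a shared schema catalog (assumed scheme)
    payload: dict     # the actual update, shaped by the referenced schema
    ts: float = field(default_factory=time.time)

    def serialize(self) -> str:
        # JSON on the wire; a real system might use Avro/Protobuf instead.
        return json.dumps({"topic": self.topic,
                           "schema": self.schema_ref,
                           "payload": self.payload,
                           "ts": self.ts})

evt = Event(topic="biomed.updates",
            schema_ref="catalog://opensherlock/person-v1",
            payload={"name": "Ada Lovelace", "role": "Person"})
record = evt.serialize()
```

A consumer that recognizes the `schema_ref` can validate or rewrite the payload; one that does not can still store and forward the event untouched.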

Jack Park 01:12:00

It doesn't seem a far-fetched concept to take blobs (e.g. JSON) and rewrite them, using a collection of rules, into Datalog schema declarations.
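One way the blob-to-schema rewrite could look, as a rough sketch: walk a JSON blob and emit one Datomic-style attribute declaration per key, with the value type picked by a small rule table. The rule set and the `:person/...` namespace are assumptions for the example, not an existing tool:

```python
# Minimal sketch: rewrite a JSON blob into Datomic-style schema declarations
# using a small collection of type-inference rules. The rules here are
# deliberately naive (first list element decides cardinality-many type, etc.).
import json

TYPE_RULES = {str: "db.type/string", bool: "db.type/boolean",
              int: "db.type/long", float: "db.type/double"}

def infer_schema(blob: dict, ns: str) -> list[dict]:
    decls = []
    for key, value in blob.items():
        if isinstance(value, dict):
            # Nested map -> a ref attribute, plus declarations for its keys.
            decls.append({":db/ident": f":{ns}/{key}",
                          ":db/valueType": ":db.type/ref",
                          ":db/cardinality": ":db.cardinality/one"})
            decls.extend(infer_schema(value, key))
        elif isinstance(value, list):
            elem = value[0] if value else ""
            decls.append({":db/ident": f":{ns}/{key}",
                          ":db/valueType": f":{TYPE_RULES.get(type(elem), 'db.type/string')}",
                          ":db/cardinality": ":db.cardinality/many"})
        else:
            decls.append({":db/ident": f":{ns}/{key}",
                          ":db/valueType": f":{TYPE_RULES.get(type(value), 'db.type/string')}",
                          ":db/cardinality": ":db.cardinality/one"})
    return decls

blob = json.loads('{"name": "Ada", "age": 36, "tags": ["person"]}')
schema = infer_schema(blob, "person")
```

A real rule collection would of course need to handle dates, enums, identity attributes, and conflicting samples of the same key, but the shape of the rewrite is the same.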

Jack Park 01:12:49

I use a kind of "upper-schema" to define the core Datom structure of a frame, which is a node in the knowledge graph, but there is always the desire to wrap that with schemas that define the attributes of specific domains - at the highest level, People, Places, and Things, but that barely scratches the surface. Apply the system to biomedical research, geophysical event research, politics, etc., and you end up with quite the forest of subject descriptors.
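The upper-schema-plus-domain-wrapper layering might be sketched like this. All attribute names (`:frame/id`, `:person/birthDate`, ...) are invented for illustration; they are not OpenSherlock's actual schema:

```python
# Hedged sketch of the layering: a core "upper-schema" of frame attributes
# shared by every node, wrapped by domain schemas (Person, Place, ...) that
# add their own attributes. Attribute names here are hypothetical.
UPPER_SCHEMA = [":frame/id", ":frame/label", ":frame/type"]  # core Datom structure

DOMAIN_SCHEMAS = {
    "Person": UPPER_SCHEMA + [":person/birthDate", ":person/affiliation"],
    "Place":  UPPER_SCHEMA + [":place/latitude", ":place/longitude"],
}

def attributes_for(domain: str) -> list[str]:
    # Domains not yet in the catalog fall back to the bare upper-schema.
    return DOMAIN_SCHEMAS.get(domain, UPPER_SCHEMA)
```

Each new domain (biomedicine, geophysics, politics, ...) adds another entry to `DOMAIN_SCHEMAS`, which is exactly the "forest of subject descriptors" a shared catalog would have to organize.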