#rdf
2021-02-17
rickmoynihan23:02:27

@quoll: FYI I think there’s a small bug in the Naga README. The example code raises an “Unknown storage configuration” error. I managed to fix it locally by adding the line: (naga.store-registry/register-storage! :asami naga.storage.asami.core/create-store)
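
A minimal sketch of that workaround with the requires it needs to resolve — the register-storage! line is quoted from the message above; the rest is illustrative:

(require 'naga.store-registry
         'naga.storage.asami.core)

;; explicitly register the Asami storage backend with Naga
(naga.store-registry/register-storage! :asami naga.storage.asami.core/create-store)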

quoll23:02:48

Thank you! This recently changed, so it must have gone stale.

rickmoynihan23:02:26

Least I can do

quoll23:02:40

Can I please check… did you include the line: (require '[asami.core :as asami]) ?

quoll23:02:00

Oh, I see what I’ve done. I forgot to include something else

rickmoynihan23:02:43

side effects eh? 🙂

quoll23:02:51

No, when I ran the example code I’d already required a different namespace, and I forgot to include it in the example

quoll23:02:27

Try the example script now

rickmoynihan23:02:44

that 👍 works, thanks

rickmoynihan23:02:02

(I meant the side effects that auto-register Asami on namespace load)

rickmoynihan23:02:36

Incidentally, is it possible to essentially do what is in the README, but without using the connection management and mutable database stuff? i.e. to manage Asami and Naga as pure values myself, or at least put them in atoms I control?

quoll23:02:17

I just added a comment too. That extra line loads the Asami connector.
• It registers the factory function
• It extends Asami connections to the ConnectionStore protocol
• It implements the Naga Storage protocol
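
For reference, a hedged guess at what that extra line looks like — assuming the connector lives in naga.storage.asami.core, the namespace mentioned earlier in the thread; requiring it for its side effects performs the registration described here:

(require '[asami.core :as asami]      ;; the Asami API used by the example
         '[naga.storage.asami.core])  ;; side-effecting load of the Asami connector:
                                      ;; registers the factory and protocol extensions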

👍 3
quoll23:02:43

I’ve kinda pushed the value management into the Connection. The Connection actually refers to all the old values of the database, as well as the latest. So if you call (asami/db connection) before running Naga on it, you’ll get the latest value of the database at that point. Afterward, if you use asami/as-of you can still get that same value. It’s actually transparent inside the Connection object: there’s a vector of every database value.
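
A small sketch of that behaviour using the Asami API (connect, db, transact, q); the entity data is invented for illustration:

(require '[asami.core :as asami])

(def conn (asami/connect "asami:mem://example"))

;; grab the current value of the database before transacting
(def db-before (asami/db conn))

;; a transaction produces a new database value; older values stay usable
@(asami/transact conn {:tx-data [{:name "Alice"}]})
(def db-after (asami/db conn))

;; db-before still reflects the pre-transaction state, while db-after sees "Alice";
;; per the message above, asami/as-of can recover the older value from conn as well
(asami/q '[:find ?n :where [?e :name ?n]] db-before)
(asami/q '[:find ?n :where [?e :name ?n]] db-after)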

quoll00:02:07

This seemed to be the most sensible way to manipulate the database. After all, Datomic follows the same paradigm, where transactions are executed against connections, and new values of the database are created that can be retrieved from the connection.

rickmoynihan00:02:23

I was thinking more of transient use cases, i.e. where you just want to compute a value… e.g. load some triples into Asami, expand the graph with Naga rules, and then spit the data or a query result out… without having to engage in resource management etc.

rickmoynihan00:02:23

e.g. possibly also in the context of an HTTP request… i.e. querying data out of a SPARQL triple store with CONSTRUCTs, but using Asami, perhaps with Naga, in place of a Jena/RDF4J model to build a response.

quoll00:02:00

In that case, I would just use a graph URI with asami:mem:// for the scheme. It’s basically doing exactly what you just said. (Admittedly, the asami:local:// scheme is still a work in progress, so you HAVE to use asami:mem:// for now anyway.) There’s no “resource management”, except that the connection holds an atom for the vector of DBs, and when you do a transaction like Naga does, it just calls update on the maps that make up the latest DB, and does a conj to the vector in the connection’s atom.

quoll00:02:59

My colleagues are doing this all the time. Create a memory graph, throw data into it, and use queries to pull out exactly what they want. Then they throw it all away. 😱
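
A throwaway-graph sketch of that workflow, using the asami:mem:// scheme discussed above (the data and query are made up):

(require '[asami.core :as asami])

(let [conn (asami/connect "asami:mem://scratch")]
  ;; throw some data in
  @(asami/transact conn {:tx-data [{:name "Alice" :knows "Bob"}
                                   {:name "Bob"   :knows "Carol"}]})
  ;; pull out exactly what we want
  (asami/q '[:find ?who ?whom
             :where [?e :name ?who] [?e :knows ?whom]]
           (asami/db conn)))
;; then simply move on — as described above, there’s no resource management to do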

quoll01:02:22

I was a bit shocked to see them do it, but it’s fast, and they find it useful to do!

rickmoynihan09:02:32

moving to #asami

rickmoynihan23:02:03

@quoll: One other thing: it looks like the Pabu parser silently fails on the -- comments in the SKOS datalog example you pasted me. Swapping them out for the C-style ones seems to at least convert the program string into data (I haven’t got to trying to run it yet), but should I expect it to work in Naga?

quoll23:02:21

You should, but I haven’t done much with Pabu for a long time. I thought I handled those comments, sorry.

quoll23:02:37

(Should these questions be in #asami instead?)

👍 6