#sci
2021-03-02
mkvlr15:03:00

@borkdude there’s no way to read and get information about comments in edamame, is there?

borkdude15:03:29

@mkvlr can you explain why you would want this?

mkvlr15:03:36

@borkdude literate programming, in which the comments get parsed as markdown to serve as explanatory text between the code

mkvlr15:03:55

rewrite-clj does return comment nodes but I like edamame’s api better

borkdude15:03:26

@mkvlr I guess we could do something here: https://github.com/borkdude/edamame/blob/9823a7af0edab2cf8596e345b6ea50979f1e9149/src/edamame/impl/parser.cljc#L538 but the question is: what would you return when parsing (defn foo [] ;; nice comment \n 1)

borkdude15:03:05

Right now this returns (defn foo [] 1)
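A minimal sketch of that current behaviour, assuming edamame.core is on the classpath and aliased as e:

```clojure
(require '[edamame.core :as e])

;; edamame skips comments entirely; the parsed sexpr comes back without them.
(e/parse-string "(defn foo [] ;; nice comment\n 1)")
;; => (defn foo [] 1)
```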

borkdude15:03:18

if you stick to a strict format, e.g. ;; only at the start of a line, you can probably parse this manually

mkvlr15:03:56

this would have to be an opt-in thing anyway, right? Maybe the caller could decide what to turn this into? But I’m not sure

borkdude15:03:29

but what would you want to return? this can mess up the sexpr pretty badly probably

mkvlr15:03:34

did you use rewrite-clj for deps.edn rewriting?

mkvlr15:03:12

ah ok, I had assumed this must be possible within the borkdude ecosystem 😼

borkdude15:03:29

the borkdude ecosystem is not a closed world ;)

mkvlr15:03:39

but it’s pretty complete

borkdude15:03:48

clj-kondo also uses rewrite-clj

borkdude15:03:03

@lee knows what to use nowadays (I lost track of all his rewrite-clj repos)

mkvlr15:03:59

thinking of wanting to get (comment ";nice comment") or (line-comment ";nice comment") back maybe

borkdude15:03:35

@mkvlr I can see it being useful to return comments when you are on the top-level but within balanced { .. } for example, I can see this going wrong, like {:a 1 ;; foo \n :b 2} => incorrect map
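The plain Clojure reader shows the problem: it silently drops the comment, and a comment node returned in its place would leave the map literal with an odd number of forms. A small illustration:

```clojure
;; The reader drops the comment, so the map stays balanced.
(read-string "{:a 1 ;; foo\n :b 2}")
;; => {:a 1, :b 2}

;; If a comment node were returned in place, the literal would contain
;; five forms -- :a 1 <comment> :b 2 -- and no longer be a valid map.
```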

lread15:03:24

I started in lread/rewrite-cljc-playground and am migrating that work into clj-commons/rewrite-clj v1 branch which will be merged to clj-commons/rewrite-clj main.

mkvlr15:03:49

@borkdude ah right, guess top level only would be fine for my use case but unsure if it’s worth it then…

borkdude15:03:43

@mkvlr if top-level, I think this is pretty easy to parse manually, with (str/starts-with? (str/trim ...) ";")
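A rough sketch of that manual approach; comment-line? and the line tagging are hypothetical helpers for illustration, not part of edamame:

```clojure
(require '[clojure.string :as str])

;; Hypothetical helper: a top-level line is a comment when, after
;; trimming, it starts with ";" (the strict format discussed above).
(defn comment-line? [line]
  (str/starts-with? (str/trim line) ";"))

;; Tag each line of a source string as prose (comment) or code.
(->> (str/split-lines ";; intro prose\n(defn foo [] 1)")
     (map (fn [line] [(if (comment-line? line) :comment :code) line])))
;; => ([:comment ";; intro prose"] [:code "(defn foo [] 1)"])
```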

mkvlr15:03:20

@borkdude true, I guess rewrite-clj is also fine

Huahai22:03:41

@borkdude I am using sci in the datalevin command line tool. It works for the most part; however, some datalog queries that work on the JVM do not work in sci

Huahai22:03:05

No implementation of method: :-find-vars of protocol: #'datalevin.parser/IFindVars found for class: datalevin.parser.Constant

Huahai22:03:06

Is there anything special needed for loading the namespaces?

borkdude22:03:36

Hey! I'm not sure what this means without having some kind of repro

borkdude22:03:28

You can point me to your repo and I can have a look if you can make some kind of failing test

borkdude22:03:37

and instruct me how to run that

Huahai22:03:53

this is what I’m currently doing; I expect it won’t work as I am not using sci namespaces, but I also tried using SciNamespace and SciVar, and it still doesn’t work

Huahai22:03:51

i saw that your datascript feature is using SciNamespace and copy-var, so I tried to do that, but still no luck

borkdude22:03:14

> I can have a look if you can make some kind of failing test

borkdude22:03:14

Are you trying to expose a protocol to the sci scripts?

Huahai22:03:33

no, i am just trying to run functions in sci

borkdude22:03:36

What is the example script that manifests this error?

borkdude22:03:51

(or REPL input for that matter)

Huahai22:03:07

the kind of datalog queries that involve (pull …)

Huahai22:03:31

normal queries work fine

borkdude22:03:36

I need a complete example / test so I can have a look. If you can provide me with detailed instructions on how to clone and run it, I'll take a look tomorrow

borkdude22:03:57

I have already tried your native binary locally, that worked for me

borkdude22:03:03

Very cool stuff btw

Huahai22:03:09

sure thanks, i will create a test case

borkdude22:03:23

is pull a macro, perhaps?

Huahai22:03:47

yeah, it works for the most part; only some queries don’t work

Huahai22:03:58

yes, a macro is involved i think

borkdude22:03:04

ah, macros need special attention

Huahai22:03:13

that’s what i suspected

borkdude22:03:30

can you show me the macro expansion of the failing pull query?

Huahai22:03:49

#?(:clj
   (defmacro deftrecord
     "Augment all datalevin.parser/ records with default implementation of ITraversable"
     [tagname fields & rest]
     (let [f    (gensym "f")
           pred (gensym "pred")
           acc  (gensym "acc")]
       `(defrecord ~tagname ~fields
          ITraversable
          (~'-postwalk [this# ~f]
            (let [new# (new ~tagname ~@(map #(list 'datalevin.parser/postwalk % f) fields))]
              (if-let [meta# (meta this#)]
                (with-meta new# meta#)
                new#)))
          (~'-collect [_# ~pred ~acc]
            ;; [x y z] -> (collect pred z (collect pred y (collect pred x acc)))
            ~(reduce #(list 'datalevin.parser/collect pred %2 %1) acc fields))
          (~'-collect-vars [_# ~acc]
            ;; [x y z] -> (collect-vars-acc (collect-vars-acc (collect-vars-acc acc x) y) z)
            ~(reduce #(list 'datalevin.parser/collect-vars-acc %1 %2) acc fields))
          Traversable
          (~'-traversable? [_#] true)
          ~@rest))))

Huahai22:03:47

that’s the macro responsible for fleshing out all the defrecords in parser.clj

borkdude22:03:24

I see yes. So:

(deftrecord Aggregate [fn args]
  IFindVars (-find-vars [_] (-find-vars (last args))))
expands into a defrecord call that needs the IFindVars protocol to be around. This is where it gets a little hacky, but you can make this work. In sci, due to GraalVM limitations, protocols are implemented as multimethods. E.g. this is how Datafiable is added to the sci config of babashka: https://github.com/babashka/babashka/blob/bbf144fbce66c6986253119eb81392a440cb17c6/src/babashka/impl/protocols.clj#L33
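A rough illustration of the protocol-as-multimethod idea in plain Clojure (this is not sci's actual implementation; the names are borrowed from the error above for illustration only):

```clojure
;; A protocol method can be mimicked by a multimethod that
;; dispatches on the type of its first argument.
(defmulti -find-vars (fn [x & _] (type x)))

(defrecord Constant [value])

;; "Extending" the pseudo-protocol to a record type.
(defmethod -find-vars Constant [_] [])

(-find-vars (->Constant 42))
;; => []
```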

Huahai22:03:42

so i need to do the same for IFindVars?

borkdude22:03:32

if you want that macro to work, yes

borkdude22:03:35

$ bb -e "(require '[clojure.core.protocols :as p] '[clojure.datafy :as d]) (defrecord Foo [] p/Datafiable (datafy [_] :foo)) (d/datafy (->Foo))"
:foo
 

borkdude22:03:28

It magically works, but it works around the byte code restriction by using multimethods. This is a little bit undocumented right now and the implementation might change in the future

borkdude22:03:10

So why do users need to use (deftrecord ...) in their scripts/REPL?

Huahai22:03:48

not users, but the parser does that

borkdude22:03:59

at REPL-time?

Huahai22:03:14

right, the query has to be parsed

borkdude22:03:27

and then it defines these record types during parsing?

Huahai22:03:52

that’s an interesting question, maybe they should not

Huahai22:03:11

because those record types should be compiled

Huahai22:03:21

but somehow they didn’t

borkdude22:03:31

yeah, it seems a little odd. a parser should just take some data in and produce some other data probably

Huahai22:03:00

it’s the same as the datascript one; they just extracted that out, I believe

Huahai22:03:09

ok, let me figure it out, it looks like a compilation problem

borkdude22:03:24

ok, that would simplify stuff a lot

Huahai22:03:22

i will try the macro fix, thanks for pointing me to that

👍 3
borkdude22:03:01

I'm off to bed now, will read again tomorrow. Exciting stuff, looking forward to using datalevin from babashka scripts :)

Huahai22:03:13

thanks, have a good sleep!