#clerk
2023-03-17
Antoine Zimmermann12:03:16

hello, team. First of all, thanks for this wonderful tool, it's great! I was wondering if it was possible to use clojure-mode to display a code editor instead of just rendering code?

Antoine Zimmermann13:03:11

works out of the box. thanks a lot!

🙌 4
José Javier Blanco Rivero19:03:19

Let's say I set up these code viewers and I edit some code in the browser. How would you eval it? Is it possible?

Sam Ritchie21:03:01

Yeah you have the full SCI environment available
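
For anyone wiring that up: SCI's eval API is small. A minimal sketch of evaluating strings of code against a shared SCI context (connecting it to the editor's contents is the Clerk-viewer-specific part and isn't shown in the thread):

(require '[sci.core :as sci])

;; one shared SCI context so definitions survive between evaluations
(def ctx (sci/init {}))

(sci/eval-string* ctx "(defn square [x] (* x x))")
(sci/eval-string* ctx "(square 12)")
;; => 144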

genmeblog13:03:22

I'm thinking about a kind of codox replacement done in Clerk. With a little macro it's doable and looks nice to me. The only question is: is it possible to add entries to the TOC? 🧵

👍 2
genmeblog13:03:26

(require '[nextjournal.clerk :as clerk])

(def source-files "") ;; base URL prefix for the [source] links

;; rebuild a call form like (f arg1 arg2 ...) for display
(defn args->call [f args] (conj (seq args) f))

(defn make-public-fns-table
  [ns]
  (let [publics (->> (ns-publics ns)
                     (map (comp meta second))
                     (sort-by :name))]
    (clerk/html
     [:div (for [{:keys [name file macro arglists doc line const]} publics
                 :when arglists]
             [:div {:class "pb-8"} ;; add border
              (clerk/html [:span [:b {:class "underline decoration-2 decoration-sky-500"} name]
                           [:sup [:a {:href (str source-files file "#L" line)} "  [source]"]]
                           (when macro [:sup " MACRO"])
                           (when const [:sup " CONST"])])
              [:p]
              [:div (for [args arglists]
                      (clerk/code (args->call name args)))]
              [:p]
              [:div (clerk/md (str (or doc "\n")))]])])))

(make-public-fns-table 'fastmath.core)

genmeblog14:03:57

And the result:

Sam Ritchie14:03:39

Amazing, would love to have support for the [[links]] here too

👍 2
genmeblog14:03:44

Yeah, working on that, it's possible without too much effort I think.

genmeblog14:03:28

Constants also work (and codox MathJax formulas embedded in docs)

genmeblog14:03:50

and the final code snippet

genmeblog14:03:04

(require '[clojure.string :as str]
         '[nextjournal.clerk :as clerk])

(def source-files "") ;; base URL prefix for the [source] links

(defn args->call [f args] (conj (seq args) f))
;; replace the escaped \( \) math delimiters used in the docstrings with $
;; so Clerk's markdown renders the TeX
(defn fix-tex [s]  (str/replace s #"\\\\\(|\\\\\)" "\\$"))
;; turn [[name]] wiki links into markdown links to the LOS- anchors below
(defn fix-anchor [s] (str/replace s #"\[\[(.+?)\]\]" "[$1](#LOS-$1)"))

(defn make-public-fns-table
  [ns]
  (let [publics (->> (ns-publics ns)
                     (sort-by first)
                     (map second))]
    (clerk/html
     [:div (for [v publics
                 :let [{:keys [name file macro arglists doc line const]} (meta v)]]
             [:div {:class "pb-8" :id (str "LOS-" (clojure.core/name name))} ;; add border
              
              (clerk/html [:span [:b {:class "underline decoration-2 decoration-gray-400"} name]
                           (when macro [:sup " MACRO"])
                           (when const [:sup " CONST"])
                           [:sup [:a {:href (str source-files file "#L" line)} "  [source]"]]])
              [:p]
              (when const [:div (clerk/code (var-get v)) [:p]])
              (when arglists [:div
                              [:div (for [args arglists]
                                      (clerk/code (args->call name args)))]
                              [:p]])
              [:div (clerk/md (->> (or doc "\n") str fix-tex fix-anchor))]])])))
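
For reference, a quick (hypothetical) example of what fix-anchor does with the [[links]] mentioned above -- each [[name]] becomes a markdown link to the LOS- anchor generated for that entry:

(fix-anchor "See also [[sin]] and [[cos]].")
;; => "See also [sin](#LOS-sin) and [cos](#LOS-cos)."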

stathissideris16:03:18

hello, when using clerk/vl how do we actually pass it data? is it required to write data out to a CSV file which is then somehow served by the Clerk server to Clerk in the browser?

genmeblog16:03:52

If you have the data in a Clojure var, it's inlined in the HTML, AFAIR.
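
For example, a minimal sketch (made-up data and field names) of handing clerk/vl an inline Vega-Lite spec -- the values travel with the spec, no CSV file involved:

(require '[nextjournal.clerk :as clerk])

(clerk/vl
 {:data {:values [{:x 1 :y 2} {:x 2 :y 5} {:x 3 :y 3}]}
  :mark "line"
  :encoding {:x {:field "x" :type "quantitative"}
             :y {:field "y" :type "quantitative"}}})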

stathissideris16:03:20

alright, thanks!

otfrom16:03:57

@U050AACJB some bad helper code I've made for doing line and line/ribbon charts https://gist.github.com/otfrom/2785ab6767d2e887d6f44e357b968f9f (don't know if this is what you need or not)

stathissideris16:03:59

@U0525KG62 thanks! I don’t even know what ribbon charts are, but I’ll study your code closely as I’m just starting out with clerk!

otfrom16:03:35

axes and legends clipped to protect clients. This was tricky due to how vega handles legends

stathissideris16:03:59

oh these look nice

otfrom16:03:04

I do lots of summarising of simulations w/ medians, IQRs and 90% ranges

otfrom16:03:20

tc/tmd make it a lot of fun
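
A rough sketch of that kind of summarising with tablecloth (tc), using made-up column names and a plain-Clojure quantile helper rather than whatever otfrom's gist actually does:

(require '[tablecloth.api :as tc])

;; hypothetical simulation output: one row per run per time step
(def sims
  (tc/dataset {:t   [0 0 0 0 1 1 1 1]
               :val [1.0 2.0 3.0 4.0 2.0 4.0 6.0 8.0]}))

;; nearest-rank quantile, to keep the sketch self-contained
(defn quantile [q xs]
  (let [v (vec (sort xs))]
    (v (min (dec (count v)) (int (* q (count v)))))))

(def summary
  (-> sims
      (tc/group-by [:t])
      (tc/aggregate {:median #(quantile 0.5  (:val %))
                     :q25    #(quantile 0.25 (:val %))
                     :q75    #(quantile 0.75 (:val %))})))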

otfrom16:03:36

and now I can actually do the charts I need in vega-lite
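
And a guess at the shape of the chart itself -- a layered Vega-Lite spec with an area band for the IQR and a line for the median, fed from the summary dataset sketched above (tc/rows :as-maps assumed for the conversion):

(require '[nextjournal.clerk :as clerk])

(clerk/vl
 {:data {:values (tc/rows summary :as-maps)}
  :layer [;; IQR ribbon: an area spanning q25..q75
          {:mark "area"
           :encoding {:x  {:field "t"   :type "quantitative"}
                      :y  {:field "q25" :type "quantitative"}
                      :y2 {:field "q75"}
                      :opacity {:value 0.3}}}
          ;; median line on top
          {:mark "line"
           :encoding {:x {:field "t"      :type "quantitative"}
                      :y {:field "median" :type "quantitative"}}}]})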

otfrom16:03:50

(thx to @U04JZRX0GV7 for figuring out how to do it)

otfrom16:03:08

all to be published eventually but still alpha and changing too much atm

👀 2
stathissideris16:03:21

I have stuff that takes too long to run (10+ minutes) so I think I’ll isolate these in namespaces that clerk doesn’t watch and use the workbook namespaces for the lighter results exploration/viz part of the job

otfrom16:03:46

yeah, you can always build the notebook off the output of the longer running things

otfrom16:03:59

tho there might be a way to make the longer running things go faster 😄

stathissideris16:03:34

not if it’s 1000 requests to a 3rd party service 😅

otfrom16:03:48

ah, no, not that really 😄

teodorlu16:03:17

> I have stuff that takes too long to run (10+ minutes) so I think I'll isolate these in namespaces that clerk doesn't watch and use the workbook namespaces for the lighter results exploration/viz part of the job (edited)

I was in a similar situation some time ago -- loading lots of usage data from https://www.sanity.io/ to analyze what kind of usage we had over the last few years. I tried using Clerk for everything at first, but wasn't happy with the performance. Randomly having to wait a minute or two for the first Clerk page load wasn't too nice! I ended up with a solution similar to yours -- splitting the "load all the data" code from "analyze it", and was really happy with that. I pulled out the "load all the data" code as a function I could execute with clj -X my.stuff/load-my-data. Then the Clerk notebooks read that data from disk, and were snappy and easy to work with. It was also really nice to have a clean separation between raw data and "the data in the shape that I'd like".
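
A minimal sketch of that split, with a hypothetical namespace layout and a made-up fetch step standing in for the real Sanity API calls:

(ns my.stuff
  (:require [clojure.edn :as edn]
            [clojure.java.io :as io]))

;; hypothetical fetch -- stands in for the real API calls
(defn- fetch-usage-data []
  [{:year 2022 :requests 1234}])

;; entry point for `clj -X my.stuff/load-my-data`; -X passes a map of args
(defn load-my-data [_opts]
  (io/make-parents "data-dump/usage.edn")
  (spit "data-dump/usage.edn" (pr-str (fetch-usage-data))))

;; notebook side: read the pre-computed data instead of re-fetching it
(defn read-usage-data []
  (edn/read-string (slurp "data-dump/usage.edn")))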

stathissideris17:03:54

That’s nice, and saving to disk is something I also ended up doing. Was the load-my-data part a long doseq?

teodorlu17:03:23

yup! Something like this:

;; one dump file per year, with a little progress output
(doseq [y (range 2019 2023)]
  (print "Dumping transactions for" y "...")
  (dump-transactions-year! y)
  (println " done!"))

stathissideris17:03:35

I’m thinking about how long running jobs like that could be monitored on a personal REPL level (rather than monitoring in a production deployment which is a solved problem).

teodorlu17:03:13

oh, gotcha. For me, downloading all the data took about a minute, so perhaps less than you have to deal with. Splitting the data into chunks so that (A) each chunk was quite fast (seconds) to download, and (B) was stored in its own file helped a lot for me. I had lots of data like this:

data-dump/DOMAINTERM-2019.edn
data-dump/DOMAINTERM-2019.json
[...]
My download functions reported some progress (println) as they were going. I could run them from a REPL. In that case, I could cider-interrupt to stop downloading. I got some measure of progress from looking at the folder content.
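
A rough sketch of that chunking pattern with hypothetical helpers (download-chunk! and the paths are stand-ins, not the actual code) -- skipping chunks that already exist on disk gives the restart/progress behaviour described above:

(require '[clojure.java.io :as io])

;; hypothetical: fetch one term/year chunk from the third-party service
(defn- download-chunk! [term year]
  {:term term :year year :rows []})

(defn dump-chunks!
  "Write one EDN file per chunk; skip chunks already on disk,
   so an interrupted run can simply be restarted."
  [terms years]
  (doseq [term terms, year years
          :let [f (io/file "data-dump" (str term "-" year ".edn"))]]
    (if (.exists f)
      (println "skipping" (str f) "(already downloaded)")
      (do (print "downloading" (str f) "...")
          (flush)
          (io/make-parents f)
          (spit f (pr-str (download-chunk! term year)))
          (println " done")))))

;; e.g. (dump-chunks! ["DOMAINTERM"] (range 2019 2023))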

teodorlu17:03:43

There might be better solutions -- I didn't spend more time than I needed to. I was more curious about what I could learn from the data than the downloading itself.

stathissideris17:03:37

Yeah doesn’t sound like you needed anything more sophisticated than that! I have a more granular list of items

👍 2